Steve

Two notes Audio Engineering Announces GENOME

Yesterday at NAMM 2024, Two notes announced GENOME, a full-featured guitar and bass plugin. I'm particularly excited about its CODEX Amp block, which integrates NAM: users can play their favorite Neural Amp Models inside a signal chain alongside Two notes' other tone-shaping tools (TSM, DynIR, and a healthy list of other must-have guitar and bass effects) and craft their full tone inside the GENOME plugin.

Check it out here: GENOME | Two notes

The CODEX Amp block inside GENOME, playing a Neural Amp Model with an overdrive in front and a Two notes DynIR cabinet block afterwards.

Folks will also be excited to see that Two notes has included a lot of handy features inside CODEX that make it easier to customize the tone of a NAM model: selectable input voicings (warm, neutral, bright); a bass-middle-treble tone stack with different voicing options and the option to place it before or after the model; high- and low-pass filters; a graphic EQ with guitar, bass, or customizable frequency bands; and an enhancer section to make the tone "pop" just a little more. Two notes has also worked with NAM veteran Nathan Mesiti of Arlington Audio to get you started with a few models.
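To make the "tone stack before or after the model" idea concrete, here's a minimal sketch of a CODEX-style chain. This is hypothetical illustration only, not Two notes' actual code: the amp model is stood in for by a simple tanh saturator, and the tone stack by a crude one-pole low/high band split.

```python
import numpy as np

def amp_model(x):
    # Stand-in for a Neural Amp Model: simple tanh saturation (assumption).
    return np.tanh(3.0 * x)

def tone_stack(x, bass=0.8, treble=1.2):
    # Crude shelving stand-in: split the signal into a smoothed (low) part and
    # its residual (high) part with a one-pole filter, then re-weight each band.
    low = np.empty_like(x)
    prev = 0.0
    for i, s in enumerate(x):
        prev = 0.9 * prev + 0.1 * s
        low[i] = prev
    return bass * low + treble * (x - low)

def build_chain(pre_model=True):
    # Placing the tone stack before or after the model is just a matter of
    # function order in the chain.
    stages = [tone_stack, amp_model] if pre_model else [amp_model, tone_stack]
    def process(x):
        for stage in stages:
            x = stage(x)
        return x
    return process

# 100 ms of a 440 Hz sine at 48 kHz.
signal = np.sin(2 * np.pi * 440 * np.arange(4800) / 48000.0)
out_pre = build_chain(pre_model=True)(signal)
out_post = build_chain(pre_model=False)(signal)
```

Because the model is nonlinear, `out_pre` and `out_post` differ: EQ-ing before the "amp" changes how hard it saturates, while EQ-ing after only reshapes the saturated output. That's why having both placements available is genuinely useful.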

Playing around with these features, I find them pleasantly musical. I can see them being really useful for putting a little extra polish on a model: maybe you got the model from someone else and want to make it your own, maybe you want it to sit more nicely in your mix, or maybe you just want to take your real amp and do a few "moves" with it that aren't possible in the real world.

And CODEX doesn't just play NAM: it also supports models from the familiar open-source projects GuitarML and AIDA-X. If you've got a few favorites from each, you'll be able to work with them seamlessly in one unified interface.
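One way to picture "one unified interface over several model formats" is a loader dispatch keyed on file type. This is a hypothetical sketch, not Two notes' actual API; the extensions and loader names here are assumptions for illustration.

```python
from pathlib import Path

# Map file extensions to (hypothetical) loaders. NAM models ship as .nam files;
# GuitarML and AIDA-X models are commonly distributed as .json (assumption:
# a real plugin would inspect the file contents, not just the extension).
LOADERS = {
    ".nam": lambda p: ("nam", p),
    ".json": lambda p: ("guitarml-or-aidax", p),
}

def load_model(path):
    """Return a (format, path) pair for any supported model file."""
    ext = Path(path).suffix.lower()
    try:
        return LOADERS[ext](path)
    except KeyError:
        raise ValueError(f"Unsupported model format: {ext}")
```

Once every format is normalized behind one `load_model`-style entry point, the rest of the signal chain never needs to care which project the model came from.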




