
NAM Trainer v0.7.3 is released

I've just released a new version of the NAM trainer. As always, you can update your local install from the terminal by typing:

pip install --upgrade neural-amp-modeler

What's new?

  1. The v3_0_0.wav reamping file. I've reworked the standardized reamping file. The new version still allows the trainer to check that the reamp went well, but gets rid of an extra check that was often responsible for false alarms and resulted in a lot of user questions. I've also turned off the "hard fails" associated with those checks, but left the warning messages. Most users should find that the training process succeeds far more often. If you've been reflexively turning on the "disable checks" option every time you train, you may find that's no longer needed.

  2. New "pedal_amp" metadata gear type. If you've been modeling things like overdrives and amps together, there's now a standardized metadata option for that.

  3. An extensible model architecture registry. This one's going to be more exciting to folks coding with NAM than to everyday users. With this release, it's now possible to create brand-new NAM architectures and add them to the trainer without modifying this package: code up your architecture, register it with the NAM trainer, and use it immediately--no reinstallation required. My hope is that this will allow others building on top of NAM to quickly iterate on new ideas without worrying that their code will diverge from the repos that I'm maintaining. This is also how I expect parametric models may still be supported in the future while they're removed from the core package itself in an effort to simplify around the most common use cases. (A sketch of the registry pattern follows below.)
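
To illustrate the new gear type, here's roughly how it shows up in a model's metadata. The field names below follow my understanding of the JSON metadata in exported .nam files; the gear make and model values are made up for the example.

    # Illustrative excerpt of an exported model's metadata.
    # The gear make/model are hypothetical; "pedal_amp" is the new type.
    metadata = {
        "gear_make": "ExampleCo",     # hypothetical
        "gear_model": "Drive Stack",  # hypothetical
        "gear_type": "pedal_amp",     # new in this release
    }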
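
And here's a minimal sketch of the registry pattern that item 3 describes. The names here (register_architecture, _ARCHITECTURES, TinyConv) are illustrative assumptions, not the package's actual API; see the neural-amp-modeler repo for the real registration hooks.

    # Sketch of an architecture registry (illustrative names only, not
    # the package's actual API).
    import torch
    import torch.nn as nn

    _ARCHITECTURES: dict = {}

    def register_architecture(name: str):
        # Decorator that makes an architecture class available by name.
        def wrap(cls):
            _ARCHITECTURES[name] = cls
            return cls
        return wrap

    @register_architecture("tiny-conv")
    class TinyConv(nn.Module):
        # A toy conv net standing in for a real architecture.
        def __init__(self, channels: int = 8, kernel_size: int = 3):
            super().__init__()
            self.conv_in = nn.Conv1d(1, channels, kernel_size, padding="same")
            self.conv_out = nn.Conv1d(channels, 1, 1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, samples) mono audio in, same shape out.
            h = torch.tanh(self.conv_in(x.unsqueeze(1)))
            return self.conv_out(h).squeeze(1)

    # A trainer can then instantiate an architecture by name from a
    # config, with no change to the training code itself:
    net = _ARCHITECTURES["tiny-conv"]()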

Enjoy!
