
NAM Trainer v0.7.0 is released

I've just released a minor version bump of the NAM trainer. As always, you can update your local install from the terminal by typing:

pip install --upgrade neural-amp-modeler

What's new?

  1. The "Nano" preset architecture. In response to requests for something more CPU-conscious than the "feather" configuration, this release makes official a pair of "nano" configurations for the WaveNet and LSTM architectures. Based on tests on devices more CPU-bound than my laptop, the nano WaveNet is about half as CPU-intensive as the feather configuration, and the nano LSTM is about 2x lighter still. (For a sketch of picking this preset from Python, see below the list.)

  2. Models know their expected sample rate. One noteworthy limitation of the pre-built NAM offerings has been the need to stick to a 48 kHz sample rate. This release records that requirement explicitly in the model artifacts. By making the sample rate an explicitly-tracked piece of information in the models, the door opens to running models at different native sample rates in the future. (A sketch of reading this new field follows the list.) This is a separate matter from resampling within the plugin so that e.g. a DAW session running at 44.1 kHz can correctly use a 48 kHz NAM model. (That's under development; sit tight!)

  3. [breaking] Removal of the "advanced" Colab notebook. To streamline the many ways that models can be trained, I've removed the more full-featured "colab.ipynb" from the repo. The more popular "easy_colab.ipynb" is still there, and the vast majority of users who are used to training models in their browser will be unaffected by this removal. Power users may either move to the Python script bin/train/main.py, which will continue to offer the most complete set of functionality (see the example invocation below the list), or keep their own copy of the notebook and maintain it themselves.
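
For those who train from Python rather than the browser, here's a minimal sketch of what picking the new nano preset might look like. It assumes the simplified trainer exposes an Architecture enum and a train() function in nam.train.core; the exact names and signature may differ in your installed version, so treat this as illustrative rather than definitive.

from nam.train.core import Architecture, train  # names assumed; check your version

# Hypothetical paths: the reamp input (DI) file, the recorded amp output,
# and a directory where training artifacts will be written.
model = train(
    "input.wav",
    "output.wav",
    "train_outputs",
    architecture=Architecture.NANO,  # select the CPU-light nano preset
)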
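
On the second point: a .nam model file is a JSON document, so you can check what a given model expects yourself. A minimal sketch, assuming the rate is stored under a top-level "sample_rate" key (models exported before this release won't have it):

import json

with open("model.nam", "r") as fp:
    model = json.load(fp)

# Older exports won't carry the field, so fall back gracefully.
sample_rate = model.get("sample_rate")
if sample_rate is None:
    print("No sample rate recorded; assume 48 kHz.")
else:
    print(f"This model expects audio at {sample_rate} Hz.")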
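
Finally, for power users moving off the advanced notebook, an invocation of the full training script looks roughly like this. The argument layout (three JSON configs plus an output directory) is my reading of recent versions of the repo, and the file names here are placeholders:

python bin/train/main.py data_config.json model_config.json learning_config.json outputs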

Enjoy!


