Steve

The first publicly available parametric Neural Amp Model

Today, I'm releasing ParametricOD, a plugin that uses NAM's parametric modeling capabilities to give you a model of my overdrive pedal that is accurate across the full range of the pedal's knobs and switches.



GUI for the plugin.


The plugin is available in VST3 and AU formats for macOS and VST3 for Windows with similar compatibility to the open-source snapshot plugin ("the NAM plugin"--or perhaps just "NAM"--to many users).


This plugin is intended as a "concept" plugin in the sense that I want to use it to address some potential misconceptions as well as to demonstrate some existing capabilities of NAM that may not be known to many people:


NAM isn't just a "snapshot" modeler.

Since NAM was first introduced in 2019, a lot of data-driven modeling products have come onto the scene in the guitar space. Many (Kemper's profiling, Neural DSP's Neural Capture, TONEX's Tone Modeling, Headrush's Smart amp/pedal cloning, and Tonocracy's ToneSnap) have focused on emulating the tone of the gear at a single "snapshot", leading to the impression that neural methods aren't capable of modeling the effect of moving the knobs and switches on real gear.


However, this isn't accurate. Folks familiar with NAM might remember that I demonstrated the use of NAM to make a 7-knob parametric model here:




Also, NAM isn't the only project to announce this capability. For example, Proteus by GuitarML supports "knob capturing". However, my hope is that this plugin, with the help of NAM's visibility, helps make folks more aware of what's possible.
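The difference between the two kinds of model can be sketched as two function signatures: a snapshot model maps audio to audio with the gear's settings baked in at training time, while a parametric model also takes the control values as an input. The toy functions below (simple `tanh` waveshapers standing in for trained networks; not NAM's actual API) illustrate the interface distinction:

```python
import numpy as np

def snapshot_model(audio: np.ndarray) -> np.ndarray:
    """Snapshot model: audio in, audio out; the knob positions are
    implicit, frozen at whatever they were during the capture."""
    # Stand-in for a trained network captured at one fixed drive setting.
    return np.tanh(3.0 * audio)

def parametric_model(audio: np.ndarray, knobs: np.ndarray) -> np.ndarray:
    """Parametric model: the knob values are an explicit input, so one
    model covers the whole control range of the gear."""
    drive = knobs[0]  # e.g. normalized to [0, 1]
    # Stand-in for a knob-conditioned network.
    return np.tanh((1.0 + 9.0 * drive) * audio)

x = np.linspace(-1.0, 1.0, 5)
low = parametric_model(x, np.array([0.0]))   # drive at minimum
high = parametric_model(x, np.array([1.0]))  # drive at maximum
```

A snapshot workflow needs a separate capture (and model file) for every setting of interest; the parametric model replaces that whole family with a single function.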


Parametric modeling doesn't require impossible amounts of data

A related misconception is that it is practically impossible to collect enough data to make a model like this. For a single-knob model (e.g. of the "drive" knob), one could imagine sweeping the knob from 0 to 10 in increments of 1, requiring a total of 11 reamps. With the standard reamping file I've provided for NAM, this could be done in under an hour. However, to do this for 2 knobs, one might imagine having to do all combinations of the knobs, making for 11x11=121 reamps. For this model, which has two knobs and two switches*, this logic would suggest that I ran almost 500 reamps, recording over 24 hours of audio.


One way around this is to reduce the number of points--instead of increments of 1, I could do increments of 2 (0, 2, 4, 6, 8, 10) and reduce the number of reamps by a factor of about 4 overall. But this is a losing game, since adding one more knob multiplies the work by a factor (11 or 6 in this example). With only 2 values per knob (min and max), the 7-knob model above would have still taken over 100 reamps (and might have pretty dubious accuracy interpolating between those extremes!). This challenge has a name: the curse of dimensionality.
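The arithmetic above is just exponential growth in the number of controls, which a couple of lines make explicit:

```python
def grid_reamps(points_per_knob: int, n_knobs: int, n_switch_combos: int = 1) -> int:
    """Reamps needed to cover every combination on a full grid:
    one reamp per point in the Cartesian product of all settings."""
    return points_per_knob ** n_knobs * n_switch_combos

# Two knobs swept 0-10 in steps of 1, plus two 2-way switches, as in the post:
full = grid_reamps(11, 2, 4)         # 484 reamps -- the "almost 500"
coarse = grid_reamps(6, 2, 4)        # 144 reamps with steps of 2
extremes_7_knob = grid_reamps(2, 7)  # 128 reamps -- "over 100" even at min/max only
```

Each extra knob multiplies the count by another factor of `points_per_knob`, which is exactly why gridding doesn't scale.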


Since that's a really big problem, there's been a lot of work to fix it, falling largely under the scientific field of optimal experimental design. It's beyond the scope of this blog post to get into the details, but the punchline is that using some advanced methods from this field allowed me to trim the time I spent (including the time spent moving the knobs between reamps) to just over an hour. Work smarter, not harder!
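The post doesn't spell out the design method used, but to give a flavor of what "smarter" sampling looks like, here is one classic technique from experimental design: a Latin hypercube spreads a fixed budget of reamps over the whole knob space so that every one-dimensional slice is covered, instead of exhaustively gridding every combination. This is only an illustrative sketch, not the actual procedure behind ParametricOD:

```python
import numpy as np

def latin_hypercube(n_samples: int, n_knobs: int, rng: np.random.Generator) -> np.ndarray:
    """Latin hypercube design on [0, 1)^n_knobs: each knob's range is cut
    into n_samples strata, and each stratum is used exactly once per knob."""
    # One uniform point inside each stratum, for each knob.
    strata = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_knobs))) / n_samples
    # Shuffle each knob's column independently so strata aren't correlated
    # across knobs (which would put all points on the diagonal).
    for k in range(n_knobs):
        rng.shuffle(strata[:, k])
    return strata  # shape (n_samples, n_knobs)

rng = np.random.default_rng(0)
settings = latin_hypercube(20, 2, rng)  # 20 knob settings covering a 2-knob space
```

With a design like this, the reamp budget is chosen up front (20 here) rather than being dictated by a combinatorial grid, which is the essential escape from the curse of dimensionality.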


NAM isn't intrinsically CPU-heavy

Users should notice that this plugin's CPU load is far lower than what they experience with many snapshot models. This was achieved by using a lighter neural network architecture that still reaches "NAM-level" accuracy.** This feeds into the last theme from this project...


NAM is customizable

It's very hard to point at something and claim that "NAM can't do that"--it's built in a way that purposefully sets it up to solve all sorts of problems beyond snapshot modeling. The recent features for dataset and model registries are meant to supercharge this--if you want to customize the models, then here's the way in! I took advantage of this to customize the model architecture specifically for pedal modeling (this also required some custom C++ code for running the model in the plugin). Still, the resulting Neural Amp*** Model is based on the same open-source framework as the standardized tools in wide use, and it was thanks to the open-source repositories that I was able to make this customization and get "NAM-level" results quickly.
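To make the kind of customization involved a bit more concrete: one common way to turn a snapshot-style network into a parametric one is to feed the normalized control values in as extra, constant input channels alongside the audio, so the first convolutional layer simply takes `1 + n_knobs` channels instead of 1. The helper below sketches that idea only; it is not the actual ParametricOD architecture:

```python
import numpy as np

def with_knob_channels(audio: np.ndarray, knobs: np.ndarray) -> np.ndarray:
    """Stack normalized knob values as constant channels next to the audio.

    audio: shape (n_samples,); knobs: shape (n_knobs,), values in [0, 1].
    Returns shape (1 + n_knobs, n_samples), suitable as input to a conv
    net whose first layer has 1 + n_knobs input channels.
    """
    n = audio.shape[0]
    # Each knob value is held constant across the whole time axis.
    knob_channels = np.repeat(knobs[:, None], n, axis=1)
    return np.vstack([audio[None, :], knob_channels])

x = np.random.default_rng(0).standard_normal(48000)  # one second at 48 kHz
net_input = with_knob_channels(x, np.array([0.5, 0.8]))  # e.g. drive and tone
```

Because the conditioning is just extra input channels, the rest of the network can stay unchanged, which is why this kind of customization slots naturally into an open framework.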


Conclusion

I've heard an oft-repeated line that "captures can't model the knobs" or, perhaps more encouragingly, that this would be "the next frontier." It's been difficult for me to navigate how to go about sharing this capability with the world, but after having had it for over a year, I'm happy to finally demonstrate it in a free plugin. From the start, my aim with sharing NAM was to provide a resource that can be used to advance the state of the art in guitar effects and what is available for musicians to use to create their art. With this plugin, I hope that others will be inspired to follow in this direction and continue pushing the boundaries forward. For those with a serious interest in building their own parametric models with NAM, I can be contacted at neuralampmodeler@gmail.com.


Enjoy!


*I didn't model the output knob for this pedal because its potentiometer is damaged and behaves erratically. [Back]

**On my MacBook, ParametricOD has a CPU load of 0.04 cores, while a Standard WaveNet snapshot model takes 0.20c (5x more). The ESR of the model was measured to be below 0.004 on an audio signal and a knob setting not shown to the model during training. [Back]
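(For readers unfamiliar with the metric: ESR, the error-to-signal ratio, is the energy of the prediction error divided by the energy of the target signal, so 0 is a perfect match and lower is better.)

```python
import numpy as np

def esr(target: np.ndarray, prediction: np.ndarray) -> float:
    """Error-to-signal ratio: sum of squared error over sum of squared target."""
    return float(np.sum((target - prediction) ** 2) / np.sum(target ** 2))
```

For instance, a prediction that is uniformly 10% too quiet scores an ESR of 0.01, so an ESR below 0.004 corresponds to an error considerably smaller than that.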

***Err, pedal [Back]
