This example runs an LSTM neural network model trained with Steve Atkinson's Neural Amp Modeler. The C++ inference code is in LSTMModelInference.h. The model itself is in ort-builder/model.onnx; it is converted to .ort format and serialized to a bin2c resource in ort-builder/model/model.ort.h. The project links against customized ONNX Runtime static libraries that are pruned to contain only the operators and types required for this particular model, with support for CPU inference only. ORT is linked statically to make the audio plug-in more portable. These libraries are built with a separate repo, ort-builder, which you can use to customize the libraries for your own model and to add support for e.g. GPU inference.
It should compile for macOS, iOS and Windows.
For Windows, you'll first need to unzip onnxruntime.lib in /ort-builder/libs/win-x86_64/MinSizeRel. If you need to build the Debug target, you'll also need to compile a debug build of onnxruntime.lib (not included due to its size).
License: MIT
About: ML Audio plug-in example using iPlug2 & ONNX Runtime