Sorry for creating this new topic; I know someone has surely already posted a similar question, but I couldn't find anything in the forum with my searches…
Can someone help me understand how to instantiate an AU or a VST in my processor? I'd like a starting point for understanding, basically, how to instantiate the hosted plugin's processor and its editor.
For a crude test with a VST2 plugin, you can try something like the following code, but it obviously isn't going to work for a real-life project (you definitely want to be using the Juce facilities that scan and manage the plugins for you):
// put this maybe in your processor's constructor
AudioPluginFormatManager plugman;
plugman.addDefaultFormats(); // registers the plugin formats Juce was built with (VST, AU, ...)

PluginDescription desc;
desc.fileOrIdentifier = "C:\\Program Files\\VST_Plugins_x64\\ValhallaRoom_x64.dll"; // hard-coded path, just for testing
desc.pluginFormatName = "VST";
desc.uid = 0;

String error;
m_plug = plugman.createPluginInstance(desc, 44100.0, 512, error); // initial sample rate and block size
Where std::unique_ptr<AudioPluginInstance> m_plug; is a member of your processor.
In your createEditor method (if this is an AudioProcessor class), you may be able to just do the following to get the plugin's editor:
return m_plug->createEditorIfNeeded();
If, in addition, you need your own GUI elements, it's going to be more complicated to manage. (I would guess you will want that; otherwise, what would be the point of hosting another plugin in a plugin…?)
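If you do want your own GUI around the hosted plugin, one possible approach is a custom editor that shows the hosted plugin's editor below your own controls. This is just a rough sketch of mine, not tested code (names like HostingEditor and myButton are made up; m_plug is the member mentioned above):

class HostingEditor : public AudioProcessorEditor
{
public:
    HostingEditor (AudioProcessor& owner, AudioPluginInstance& hostedPlugin)
        : AudioProcessorEditor (owner)
    {
        addAndMakeVisible (myButton); // one of your own GUI elements

        hostedEditor.reset (hostedPlugin.createEditorIfNeeded());

        if (hostedEditor != nullptr)
            addAndMakeVisible (hostedEditor.get());

        setSize (600, 450);
    }

    void resized() override
    {
        auto area = getLocalBounds();
        myButton.setBounds (area.removeFromTop (30)); // your controls at the top

        if (hostedEditor != nullptr)
            hostedEditor->setBounds (area);           // hosted plugin editor underneath
    }

private:
    TextButton myButton { "My control" };
    std::unique_ptr<AudioProcessorEditor> hostedEditor;
};

Your createEditor would then return new HostingEditor (*this, *m_plug) instead of returning the hosted editor directly.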
All of that is missing various error handling and other details. (That code is also based on the latest Juce develop branch code…) It likely won't work for AU plugins via the plugin file path, because AU plugins need to be handled via their system IDs. (Which is why you should be using the Juce plugin scanning/managing classes.)
It's also a pretty complex, full application; one isn't going to easily figure out from that code how to simply instantiate and use a single plugin.
Which reminds me, Juce should probably have a very simple and straightforward example of how to host a plugin, since this topic does come up quite regularly…
In true OOP fashion, you'll need to assemble the various components needed to scan, load, list, and instantiate plugin instances. The AudioPluginHost demonstrates the process and a basic way of going about it.
In short, you need to lay out instances of these classes and get them to work together:
You will have to populate the KnownPluginList by scanning for plugins in their usual locations. You can use an instance of PluginListComponent to do that heavy lifting for you. Just a note: the locations, or file paths, aren't standardised whatsoever, so you're in for some guesswork. This is alleviated by each plugin format wrapper provided with JUCE (search for getDefaultLocationsToSearch() to get a sense of what that means).
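If you'd rather run the scan yourself without the PluginListComponent UI, a rough sketch (not from the AudioPluginHost; it assumes the default formats and their default search locations) could look like this:

AudioPluginFormatManager formatManager;
formatManager.addDefaultFormats();

KnownPluginList knownPlugins;

// file the scanner uses to blacklist plugins that crash while being scanned
auto crashFile = File::getSpecialLocation (File::tempDirectory)
                     .getChildFile ("plugin_scan_crash_list.txt");

for (int i = 0; i < formatManager.getNumFormats(); ++i)
{
    auto* format = formatManager.getFormat (i);

    PluginDirectoryScanner scanner (knownPlugins, *format,
                                    format->getDefaultLocationsToSearch(),
                                    true,        // search recursively
                                    crashFile);

    String currentPlugin;
    while (scanner.scanNextFile (true, currentPlugin)) {} // true = skip plugins already in the list
}

// knownPlugins now holds a PluginDescription for every plugin that was found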
Then, you will have to tell the AudioPluginFormatManager which PluginDescription you want to (attempt to) instantiate. This is an extremely fickle process with third-party plugins - you really need to be sure you stick to doing this on the main thread.
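For example (a minimal sketch continuing from the scan above - the 44100.0 / 512 values are placeholders, and getTypes()[0] simply grabs the first plugin that was found), on the message thread:

String errorMessage;
auto description = knownPlugins.getTypes()[0]; // pick a real PluginDescription in practice

std::unique_ptr<AudioPluginInstance> instance =
    formatManager.createPluginInstance (description, 44100.0, 512, errorMessage);

if (instance == nullptr)
    DBG ("Could not create plugin instance: " + errorMessage);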
Obviously there's more to it: if you want to process the audio and pass that audio to a device, you'll need to make use of the AudioDeviceManager and some way of calling processBlock on the plugin, like you would with an AudioProcessorPlayer.
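A bare-bones sketch of that wiring (error handling omitted; instance is the AudioPluginInstance created above):

AudioDeviceManager deviceManager;
deviceManager.initialiseWithDefaultDevices (2, 2); // 2 inputs, 2 outputs

AudioProcessorPlayer player;
player.setProcessor (instance.get());              // the player calls prepareToPlay/processBlock for you
deviceManager.addAudioCallback (&player);

// ...and when shutting down, undo it in reverse order:
deviceManager.removeAudioCallback (&player);
player.setProcessor (nullptr);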
You can use the AudioProcessorGraph to connect instances of your plugin together, as you would in a chain.
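For instance, a hypothetical chain of two hosted instances (instanceA and instanceB are made-up names for plugins you created as above) might look roughly like this:

AudioProcessorGraph graph;
graph.setPlayConfigDetails (2, 2, 44100.0, 512); // stereo in/out, placeholder sample rate/block size
graph.prepareToPlay (44100.0, 512);

using IOProcessor = AudioProcessorGraph::AudioGraphIOProcessor;
auto input  = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::audioInputNode));
auto output = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::audioOutputNode));
auto nodeA  = graph.addNode (std::move (instanceA));
auto nodeB  = graph.addNode (std::move (instanceB));

for (int ch = 0; ch < 2; ++ch) // input -> A -> B -> output, channel by channel
{
    graph.addConnection ({ { input->nodeID, ch }, { nodeA->nodeID,  ch } });
    graph.addConnection ({ { nodeA->nodeID, ch }, { nodeB->nodeID,  ch } });
    graph.addConnection ({ { nodeB->nodeID, ch }, { output->nodeID, ch } });
}

// the graph itself is an AudioProcessor, so an AudioProcessorPlayer can drive it like any single plugin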
If you want to simulate or fully configure a means of syncing with the DAW’s time and/or tempo and/or time-signature (among other properties), you can play around with AudioPlayHead. The AudioPluginInstance created by the AudioPluginFormatManager will need to be told about the playhead (see AudioProcessor::setPlayHead).
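A toy example (this uses the CurrentPositionInfo-based playhead API of JUCE 5/6; newer versions replaced it with getPosition(), so adjust accordingly):

struct FakePlayHead : public AudioPlayHead
{
    bool getCurrentPosition (CurrentPositionInfo& info) override
    {
        info.resetToDefault();
        info.bpm = 120.0;            // made-up transport values
        info.timeSigNumerator = 4;
        info.timeSigDenominator = 4;
        info.isPlaying = true;
        return true;                 // false would mean "no position info available"
    }
};

FakePlayHead playHead;               // must outlive the plugin's use of it
instance->setPlayHead (&playHead);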
Let me know if I’ve missed anything or complicated anything. I’ll happily edit this post.
For some plugins you will have to do the BusesLayout negotiation, i.e.
calling isBusesLayoutSupported() with your proposals,
calling setBusesLayout() for the chosen one,
calling prepareToPlay() for the plugin to adapt to the layout and
making sure you have enough channels in your processBlock().
That's important for multi-channel/surround, but also for side-chains (I have at least one Waves compressor that doesn't run with the default stereo in/stereo out layout). A rough sketch of that negotiation follows below.
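Here is that sketch for a plain stereo in / stereo out setup (a real host would try several candidate layouts, including side-chain buses):

AudioProcessor::BusesLayout layout;
layout.inputBuses.add  (AudioChannelSet::stereo());
layout.outputBuses.add (AudioChannelSet::stereo());

if (instance->checkBusesLayoutSupported (layout)) // public wrapper around isBusesLayoutSupported()
{
    instance->setBusesLayout (layout);
    instance->prepareToPlay (44100.0, 512);

    // processBlock() must now be given a buffer with at least as many
    // channels as the chosen layout uses
}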
After a lot of experiments I think I understand the scanning system and how to instantiate a plugin in the desktop version! Thank you all again!
Now my problem is on iOS (my app is in fact for mobile, not for desktop; I started with desktop experiments to understand the logic).
Applying the same procedure that I use on desktop, only Apple plugins are found (as I posted here: Unable to find AU3 on ipad?). Thinking that the problem is that AU3s are handled asynchronously (please tell me if I'm wrong), I spent a lot of time trying to understand how to work with ThreadPool and ThreadPoolJob to scan AU3s (for some experiments I cloned, for example, the AUScanner struct from the AudioPluginHost), but the results confuse me… I don't know when the scanner finishes its work so that I can log the AU3s it found, and sometimes some plugins are not found at all (unfortunately I don't understand the logic behind this).
Moreover, when some plugins are found and I try to instantiate them: if I call the createPluginInstance method and print the error string, the log says that the plugin description doesn't correspond to any plugin. Calling the createPluginInstanceAsync method sometimes fails in the same way (the log says the description doesn't correspond to any plugin), and when it doesn't fail, my app seems to block waiting for something (for example, I have two TextButtons and at that point I can't see them anymore, while the whole screen turns grey)…
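For reference, this is roughly the shape of the async call I'm making (a simplified sketch, not my exact code; desc is one of the descriptions returned by my scan):

formatManager.createPluginInstanceAsync (desc, 44100.0, 512,
    [this] (std::unique_ptr<AudioPluginInstance> instance, const String& error)
    {
        // this callback arrives on the message thread
        if (instance == nullptr)
        {
            DBG ("Instantiation failed: " + error);
            return;
        }

        m_plug = std::move (instance);
    });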
Please help me, this is really beyond my current abilities…