I’m trying to host plugins in my DAW so that users will be able to add them to tracks and edit them in a separate window.
After some research on the forum, in the docs, and in the AudioPluginHost example provided by JUCE, I understand that it has something to do with AudioProcessor and AudioProcessorEditor, but I’m not really sure how to use them.
So a point in the right direction would be really appreciated.
EDIT: I know what AudioProcessor is for, but I don’t know how to use it for plugin hosting.
With those pointers I would look again at the AudioPluginHost as a reference.
The returned AudioPluginInstance is an AudioProcessor, but you don’t see its JUCE code: it is an AudioProcessor seen through the plugin API, like VST3, AU, or AAX.
You use the PluginDescription that your scanning returned to create an instance of the plugin.
Then you need to prepare the AudioProcessor by calling prepareToPlay(), after which you can process audio using the processBlock() method.
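As a rough sketch of those steps (assuming you have already scanned plugins into a KnownPluginList; `formatManager`, `knownPlugins`, `sampleRate`, and `blockSize` are placeholder names, not from this thread):

```cpp
// Create an instance from a PluginDescription obtained by scanning.
juce::AudioPluginFormatManager formatManager;
formatManager.addDefaultFormats();  // registers VST3, AU, etc.

juce::String error;
auto description = knownPlugins.getTypes()[0];  // e.g. the first scanned plugin
auto instance = formatManager.createPluginInstance (description,
                                                    sampleRate, blockSize, error);
if (instance != nullptr)
{
    // Prepare before any audio processing happens.
    instance->prepareToPlay (sampleRate, blockSize);
    // Now instance->processBlock (buffer, midi) can be called from the audio callback.
}
```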
Is the “instance of the plugin” the AudioPluginInstance class? And if so, can I connect it to an AudioProcessorGraph so the plugins are chained one after the other?
How can I get the AudioProcessorEditor of the plugin?
sorry if I’m not really understandable, I’m not a native English speaker…
The plugin is stateful, so everywhere you want to use that plugin, you need a separate instance.
You can use the AudioProcessorGraph, but I would only do that if you want that type of host, like the AudioPluginHost, where the plugins can be connected in arbitrary orders.
If you want a Cubase-style chain of plugins, I would simply use a std::vector<std::unique_ptr<juce::AudioPluginInstance>>.
For playback I guess you use some AudioSources at the moment. What you need to do is get an AudioBuffer<float> with the audio data and a MidiBuffer (you can use an empty one to begin with, until you get to instruments). Then you can sequentially call processBlock() on all instances:
// in getNextAudioBlock()
// create a proxy buffer referring to the relevant region of the source buffer
auto proxy = juce::AudioBuffer<float> (info.buffer->getArrayOfWritePointers(),
                                       info.buffer->getNumChannels(),
                                       info.startSample,
                                       info.numSamples);
juce::MidiBuffer midi;
// todo: set current automation values (optional)
// todo: set position on the juce::AudioPlayHead (optional)
for (auto& plugin : plugins)
    plugin->processBlock (proxy, midi);
If you need an editor, you create a window (a Component, then addToDesktop() and setVisible (true)) and add the AudioProcessorEditor as a child, which you get by calling
plugin->createEditor();
Remember, AudioPluginInstance inherits from AudioProcessor, so createEditor() is available. Only methods that a specific AudioProcessor subclass adds are not accessible.
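A minimal sketch of such an editor window, using juce::DocumentWindow (a common alternative to a bare Component plus addToDesktop(); the class name PluginWindow is a placeholder):

```cpp
// A window that hosts a plugin's editor. DocumentWindow adds itself
// to the desktop by default.
class PluginWindow : public juce::DocumentWindow
{
public:
    explicit PluginWindow (juce::AudioPluginInstance& plugin)
        : DocumentWindow (plugin.getName(),
                          juce::Colours::darkgrey,
                          DocumentWindow::closeButton)
    {
        if (auto* editor = plugin.createEditor())
            setContentOwned (editor, true);  // take ownership, resize to fit editor

        setUsingNativeTitleBar (true);
        setVisible (true);
    }

    void closeButtonPressed() override { setVisible (false); }
};
```

Note that createEditor() can return nullptr for headless plugins; in that case JUCE offers GenericAudioProcessorEditor as a fallback UI.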
Great. By the way, I forgot: we didn’t talk about channel counts and such. In my host I fixed everything to stereo for now, but you should actually probe whether the layout you want is supported (e.g. via isChannelLayoutSupported()) and finally call setBusesLayout().
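A sketch of forcing a stereo in/out layout on a freshly created instance; this uses AudioProcessor::checkBusesLayoutSupported() and setBusesLayout(), which may differ by JUCE version, and assumes `instance` is a std::unique_ptr<juce::AudioPluginInstance>:

```cpp
// Build a stereo-in / stereo-out layout and apply it if the plugin accepts it.
juce::AudioProcessor::BusesLayout stereo;
stereo.inputBuses.add (juce::AudioChannelSet::stereo());
stereo.outputBuses.add (juce::AudioChannelSet::stereo());

if (instance->checkBusesLayoutSupported (stereo))
    instance->setBusesLayout (stereo);
// otherwise keep the plugin's default layout
```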
I’m most likely going to fix it to stereo too.
Another question: is the plugin’s parameter state saved automatically, or do I need to manage it via XML or something?
You need to save it yourself by calling getStateInformation() on the plugin instance. You get a binary blob back in a juce::MemoryBlock, which you can embed into your project file.
If your project is stored as XML or a juce::ValueTree, most people would use base64 encoding.
Base64 has some overhead but can easily be converted back and forth.
And when loading a session, you set that binary blob back using setStateInformation().
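The save/restore round trip can be sketched like this, using MemoryBlock’s built-in base64 helpers (the ValueTree property name "state" is just an example):

```cpp
// Saving: serialise the plugin state and store it base64-encoded in a ValueTree.
juce::MemoryBlock state;
instance->getStateInformation (state);
tree.setProperty ("state", state.toBase64Encoding(), nullptr);

// ...later, when loading the session:
juce::MemoryBlock restored;
if (restored.fromBase64Encoding (tree.getProperty ("state").toString()))
    instance->setStateInformation (restored.getData(), (int) restored.getSize());
```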
eyal23, Daniel.
When you get around to actually enabling all of the buses from a plugin within your DAW, please take a look at this post.
I poke at my issue every once in a while, but then get so frustrated that I go back to coding other things. I am sure it is something simple I am missing.
Yes, most plugins only expose a single stereo bus, but others, like virtual drum kits (SSD4, SSD5, anything Toontrack) and orchestral suites (IK Multimedia Philharmonik 2, Spitfire, …), have 16-24 stereo buses (32-48 ‘channels’) and a few mono auxes.
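As a hedged starting point (not a confirmed fix for the issue above), JUCE’s AudioProcessor offers enableAllBuses() as the simplest attempt at activating a multi-out plugin’s buses, and the per-bus API lets you inspect what was actually enabled:

```cpp
// Try to enable every bus the plugin reports, then list the output buses.
instance->enableAllBuses();

for (int i = 0; i < instance->getBusCount (false); ++i)   // false = output buses
    if (auto* bus = instance->getBus (false, i))
        DBG (bus->getName() << ": "
             << bus->getCurrentLayout().getDescription());
```

Whether the plugin honours this depends on the plugin and format; some only activate extra buses after the layout is negotiated before prepareToPlay().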