This must be something very basic I’m not understanding, so apologies for that; I’m pretty new to C++.
I get that you can pass START_JUCE_APPLICATION a subclass of JUCEApplication, which gets a main loop going somewhere I’m not seeing, as in the AudioAppExample. But in the JuceDemoPlugin example there’s no main(), and there’s no macro that’s explicitly handling it instead. So where’s the actual entry point to the JuceDemoPlugin, I suppose I’m asking, and where is the stuff declared in its PluginProcessor and PluginEditor files actually called?
It’s a plugin, so main() actually lives in the host application that loads and runs the plugin.
Plug-ins are dynamically loaded libraries (DLLs).
That’s why AU, VST, AAX etc. have SDKs/APIs, which are basically standards for where to expect things to happen.
(The host loads your library and calls the functions it expects you to provide/implement, just as you call functions in your own code.)
A common DSP workflow is based on loops, so basically you have a unit that keeps doing the same thing over and over again. (The internal implementation decides what past data it should hold, or even future data when delay compensation or extra buffering is involved.)
Common callbacks for audio are:
- creation (which any object-oriented language has; this is the constructor that first creates the object)
- prepare to play - which is where you set up or reset your processing, either the first time or due to changes (such as sample rate, buffer size, etc.)
- process - the holy grail: a callback that is called over and over again, expecting you to be continuous in the audio signal you’re providing.
JUCE is cross-platform and cross-format, so it has wrappers that take your code written against the JUCE API and connect it to the proprietary API of each specific format.
Most if not all audio plug-ins separate the view from the actual processing.
They must of course be separated and concurrent in order to provide continuous audio.
So the PluginProcessor is your DSP and the PluginEditor is your view.
JUCE’s code is open source, which is very helpful; the best way to get some understanding of it is hopping between classes and seeing how they’re being used.
And the third tile, to complete the picture:
The AudioProcessorEditor is embedded in a window owned by the host, so GUI events (and event loop) will be driven by that window.
I was going to try and add something to the explanation. But I think the best thing you could do is to build the demo plugin and put a debugger breakpoint at the start of each function. Then you can just look and see what’s happening…
Note what the calling function is, and also the thread it’s called on.
Here’s an example; you can see my editor’s constructor is called on the Main Thread from the VST API (i.e. by the DAW, which in this case is the JUCE plugin host):
Ah! OK, yes, I think I understand. The host runs the VST SDK, which JUCE’s AudioProcessor sort of wraps, and then anything that adheres to the AudioProcessor interface gets called too?
Thanks very much everyone