VST3 parameter changes from the DAW notify the plugin on both threads: the GUI gets notified on its own thread, and the processor gets notified on the audio thread. Updates on the audio thread each carry a timestamp identifying when they should take effect.
In the JUCE VST3 implementation, parameter updates from the host are made on the audio thread, for the most part. Sample-accurate automation is not currently supported, so there will be a maximum of one change notification per parameter per block.
Note that JUCE doesn’t make any guarantees about which thread will make parameter change callbacks. Parameter change callbacks must be wait-free, in case the call is made from the audio thread.
Perhaps in the future it would be good for JUCE to support thread-safe callbacks, since VST3 and AU3 already work that way. This would remove the need for all the ad-hoc locks and atomics in JUCE plugins. I guess it’s only the need to support VST2 that is holding us back?
This seems kind of weird and not easy to handle. I had expected that there would be different callback functions for the different threads, so that we know what to do in either case, plus some means to know whether there might be both.
But maybe there is a simple means to ask which thread the callback is running on.
Do you have example code for this?
I agree that the API isn’t ideal, but it’s also not straightforward to fix in a way that is backwards-compatible.
You can use MessageManager::isThisTheMessageThread() to find out which thread is calling the function.
I suppose this should suffice. Maybe some warning / documentation / example needs to be provided.
Or is sample-accurate parameter automation near, which would make this obsolete?
If JUCE is going to support sample-accurate parameter automation, then please make it optional! I’m fine with once-per-block; most stuff is smoothed anyway.
The timestamps on sample-accurate events are not mandatory, more like a hint. It’s quite normal for you to decide that some parameters are not so critical that they need to be sample-accurate, and to ignore the timestamp; instead you can choose to update these on a per-block basis.
We don’t lose any control by having more information on parameter callbacks/events.
I suppose sample-accurate parameter automation will not generate more callbacks, but will (1) do the callback on the audio thread, still providing a list of parameter changes, and (2) provide a timestamp with each of these events. (This makes sense, as the audio samples also come in one callback per block.)
Parameter changes can come from various sources. If a change comes from a knob in your plugin GUI, you can count yourself lucky if it even arrives in the block you are hearing at the moment. It is totally dependent on the current system load on the low-priority message thread.
Changes from a hardware controller might be a bit better; at least it’s a dedicated thread processing the MIDI, USB, OSC or whatever port.
But that’s all the host’s domain to sort out; there’s nothing you can do and nothing JUCE can do.
I don’t know how other formats handle parameters, but VST3 has a complete separation between audio and message thread notifications. This is handled by the host, so the plugin doesn’t really need to synchronize them. The JUCE API undoes this by merging all notifications in the listener callbacks. I’ve assumed this is caused by some of the other formats not having this separation.
I meant DAW parameter changes via the VST API. Only those can be “sample accurate”.
I understand that legacy (non-sample-accurate) parameter changes are usually pushed through the API by the DAW on the GUI thread, independent of sample blocks, while sample-accurate parameter changes need to be pushed on the audio thread (together with a sample block), similar to MIDI data, which also provides a timestamp (a virtual time offset relative to the beginning of the block).
Well, since the topic is “notifying the host about AudioProcessorParameter changes”, I assumed the opposite direction. Your use case would be “getting notified about parameter changes by the host”.
In that direction it is much simpler: you get the changes with the processBlock call, where a timestamp actually is synchronised with the samples.
Samples are not “synchronized”. Only sample blocks are synchronized to a virtual time base (one block of defined length after the other), but not at all to “wall clock” time. Mind that DAWs do “offline rendering” with the virtual time running much faster (or sometimes much slower) than real time.
Hence (MIDI data and) DAW-introduced sample-accurate parameter changes need to be associated with a sample block, each denoted by a time offset.
Well, that’s what I meant: the samples are synchronised to the in-block timestamp, as it works in VST3.
There is not a single clock time. There is processing time, presentation time, wall clock…
I think we mean the same.
Yep ! (edited my previous message slightly)
Hmmm. I don’t know the VST3 specification, but AFAIU samples, MIDI data and sample-accurate DAW parameter changes all need to be synchronized with one another, as all need to be accepted as “virtually immediate” input to the audio engine (e.g. a physical model of some complex device) at well-defined points of the virtual time flow.
The only decent way I can see to do that is to provide a block of samples (a fixed count of those), a block of MIDI data, and a block of DAW parameter changes (the latter two needing a timestamp for each event).
That part is pretty simple, like I said above. All the data necessary for the processing is in the ProcessData structure.
What’s problematic is the interaction from the GUI to the host (IEditController).
The JUCE processor, however, puts both of those into one class (SingleComponentEffect; I haven’t checked the sources, but effectively it is like using that one).
In a classic VST3 effect all message-thread notifications would end up in the controller, but in JUCE all notifications are mixed together, at least that’s how I understand it.
As Daniel says, VST3 sends parameter changes to the process callback in IAudioProcessor through a ProcessData argument. The host informs the EditController separately through setParamNormalized (on the message thread). When a GUI element changes, the EditController informs the host through performEdit (on the message thread), then the host sends these changes to the process callback through ProcessData. Processor and controller are intended to be able to run on different devices, so they are detached, and the host synchronizes them. Jules has talked a lot about this separation, and how it will be essential in the foreseeable future, so it’s rather strange that JUCE actually steps back from VST3 here. That’s why I suspect the other formats; I’ve read that AAX, for example, handles parameters on separate, non-audio, non-message threads.
What do you mean by “message thread”? I used the terms “GUI” and “audio” thread, but those might be inappropriate.
Nonetheless, I assume that sample-accurate parameter changes need to be received on the same thread where audio and MIDI are received and sent. Otherwise the sample blocks can’t be determined/handled.
Moreover, I assume that non-sample-accurate parameter changes might come in on the GUI thread.