As Daniel says, VST3 sends parameter changes to the process callback in IAudioProcessor through a ProcessData argument. The host informs the EditController separately through setParamNormalized (on the message thread). When a GUI element changes, the EditController informs the host through performEdit (on the message thread), and the host then sends these changes to the process callback through ProcessData. Processor and controller are intended to be able to run on different devices, so they are detached, and the host synchronizes them. Jules has talked a lot about this separation and how it will be essential in the foreseeable future, so it’s rather strange that JUCE actually steps back from VST3 here. That’s why I suspect the other formats: I’ve read that AAX, for example, handles parameters on separate threads that are neither the audio nor the message thread.
What do you mean by “message thread”? I used the terms “GUI thread” and “audio thread”, but that might be inappropriate.
Nonetheless, I assume that sample-accurate parameter changes need to be received on the same thread where audio and MIDI are received and sent; otherwise they can’t be assigned to the correct sample blocks.
Moreover, I assume that non-sample-accurate parameter changes might arrive on the GUI thread.
Yes, in this context message thread = GUI thread. In VST3, all parameter changes reach the process callback through the same channel, the ProcessData argument. It contains a queue of parameter indexes, parameter values and sample offsets (i.e. timestamps). The fact that they can be sample-accurate is incidental: changes started by the UI (the EditController) also reach the processor through this queue, with whatever timestamps the host decides. To mimic this behavior in JUCE, there should be different parameter interfaces for processor and editor, instead of a single atomic for both. In JUCE, a GUI change is immediately stored in the attached AudioProcessorParameter and notified to all listeners, and when the value comes again through the process callback, the notification is omitted, because the value has already been set.
I really don’t understand why MIDI messages (e.g. CC) and DAW parameter changes should be, or are, handled differently. For rendering to audio (e.g. with a VSTi), there should be no algorithmic difference whether a parameter change arrives as a MIDI CC or as a (sample-accurate) DAW parameter modulation.
Welcome to the VST3 architecture committee.
(spoiler, VST3 abstracts away MIDI CCs such that they are presented to the Processor as sample-accurate parameter events). i.e. MIDI-CCs and Parameters are literally the same thing.
There was a discussion in the Reaper forum about which thread (the track’s audio thread or the application’s main GUI thread) parameter changes are pushed to a VST on via the VST API.
We could not find a full answer.
A result seemed to be that with VST2 it is always the GUI thread. Even when the parameter changes are initiated by MIDI CCs on the same track (which Reaper allows via the “Midi Link” feature, even with high-resolution MIDI CC), Reaper forwards them to the main thread. It seems obvious that VST3 “sample accurate” parameter changes need to be provided on the track’s audio thread (together with the timed MIDI events) to allow assigning them to the correct sample block.
With VST3 there is a lot of confusion. There might be several cases:
“VST2 style” or “sample accurate” parameter changes
parameter changes generated from track’s Midi CCs or generated otherwise (within Reaper from the Main Thread)
does VST3 even feature receiving MIDI CC events in the normal way? (I tested that JUCE does this transparently for the user code.)
what if the VST does not support “sample accurate” parameter changes (such as the current JUCE version)? Will it notify the host and hence not receive those, or will it need to do some conversion internally?
Can somebody with deep knowledge about the VST API provide some clarification ?
what if the VST does not support “sample accurate” parameter changes?
Parameter changes do work in JUCE VST3 plugins. It’s just that they are ‘dumbed down’ to behave like VST2 parameter changes, i.e. you will lose the thread-safety aspect of VST3 and have to cope with parameterValueChanged being called from any thread. I’ve noticed it being called from both the GUI and audio threads in Cubase, for example.
does VST3 even feature receiving Midi CC events in the normal way ?
natively VST3 supports MIDI CC events by presenting them as sample-accurate parameter changes. Of course if you’re using a wrapper like JUCE, these events will be converted back to MIDI for the JUCE plugin.
Of course I do know that JUCE works fine with CCs in both VST2 and VST3. I wrote a test program that can be compiled as either, and it works as expected, sending and receiving both parameter changes and MIDI CCs. (But not involving any “virtual realtime” synchronicity with audio.)
Yep. But this does not help me understanding the details.
Also, with Reaper you can explicitly create parameter changes in a “non-timed” fashion (like VST2), e.g. by pressing a button in the GUI, and in a timed fashion, e.g. by “envelopes”. Moreover, you can create parameter changes that are explicitly in sync with the track’s audio (via “Parameter Midi Link”, which is supposed to synchronously convert MIDI CC messages to parameter changes). While with VST2 it’s rather clear that the synchronization will be sloppy, I suppose the synchronous changes with VST3 are sent as sample-accurate parameter changes (whatever this exactly means).
I suppose with a decently crafted VST3 plugin, it should make no difference in the output whether MIDI CCs are sent directly to the plugin or converted to parameter changes by “Parameter Midi Link”.
I explained this before; you can read a synopsis here. A VST3 plugin is two separate entities: an AudioProcessor and an EditController. The host sends and receives parameter changes from the AudioProcessor through its process callback, specifically through the inputParameterChanges and outputParameterChanges queues in its ProcessData argument. This is what you call sample-accurate changes, but the point here is that they’re transmitted on the audio thread, for the simple reason that they’re arguments of the process callback. The EditController informs the host of changes coming from the UI using beginEdit / performEdit / endEdit; these are invoked by JUCE’s beginChangeGesture / setValueNotifyingHost / endChangeGesture. The host informs the EditController of changes coming from the host using setParamNormalized. This is the important thing: the host informs the processor through the audio thread and the editor through the UI thread, separately. A change started in the host reaches the processor through the process callback, and also the editor through setParamNormalized. A change started in the UI reaches the host through begin/perform/endEdit, and then the host passes it to the processor through the process callback. Processor and editor don’t communicate directly, only through the host. Everything UI is UI thread, everything audio is audio thread: when something involves both, it’s done separately for each one.
Regarding Parameters Host → plugin (rendering or real time performance), I (hopefully correctly) understand:
Audio thread: the plugin’s AudioProcessor is called by the host with a sample block (maybe multiple times with different data types). Here audio and MIDI data are transferred in a “virtual realtime” according to the audio block counter. Sample timing is obvious; MIDI messages have timestamps relative to the current audio block. With VST3 (but not with VST2) parameter changes are transferred in a similar way to MIDI events. JUCE handles this as expected for audio and MIDI; with VST3 parameters it converts the data to the non-timestamped VST2 format, but still on the audio thread (or even not: see above).
GUI thread: the plugin’s EditController is called by the host whenever appropriate, with a timing not reliably correlated with the virtual audio block timing. With VST2, parameters are transferred here. With automation for VST2, Reaper will try to make the appropriate calls as close as possible to the correlated AudioProcessor call, but as these are different threads and the GUI thread has rather low priority, this is rather sloppy. With VST3, parameters are not transferred on the GUI thread.
Hence for the JUCE user the parameter transfer is identical for VST2 and VST3, but they can’t implement sample-accurate parameter modulation. If JUCE one day supports sample-accurate parameter modulation, the user will need to provide code for both ways if they want the project to be compilable for both API versions.
VST3 does not receive MIDI CC directly. You map CC to a parameter, and the DAW updates that parameter instead (*sample-accurate). You can then convert that parameter update back to MIDI if you need to.
*Unfortunately, JUCE loses the sample-accurate timestamp on parameter updates, so your emulated MIDI CC is no longer accurate either.
I think someone mentioned that JUCE is working on a kludge to get CCs to work?
For the second question, yes VST3 supports program change.
As far as I know, JUCE does maintain the sample position with MIDI CC.
The ‘lossy’ part in JUCE that loses the sample position is the AudioProcessorParameter API, but the wrapper converts the MIDI CC into the MidiBuffer, which does have the accurate sample position.