I explained this before; you can read a synopsis here. A VST3 plugin is two separate entities: an AudioProcessor and an EditController. The host exchanges parameter changes with the AudioProcessor through its process callback, specifically through the inputParameterChanges and outputParameterChanges queues in its ProcessData argument. These are what you call sample-accurate changes, but the point here is that they're transmitted on the audio thread, for the simple reason that they're arguments of the process callback.

The EditController informs the host of changes coming from the UI using beginEdit / performEdit / endEdit (these are what JUCE's beginChangeGesture / setValueNotifyingHost / endChangeGesture invoke under the hood). The host informs the EditController of changes originating on the host side using setParamNormalized.

This is the important part: the host informs the processor on the audio thread and the editor on the UI thread, separately. A change started in the host reaches the processor through the process callback, and also reaches the editor through setParamNormalized. A change started in the UI reaches the host through beginEdit / performEdit / endEdit, and the host then passes it on to the processor through the process callback. Processor and editor don't communicate directly, only through the host. Everything UI is UI thread, everything audio is audio thread: when something involves both, it's done separately for each one.
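To make the two paths concrete, here's a minimal sketch of that routing. This is not the real VST3 API: the stub types below (StubProcessor, StubEditController, StubHost, ParamChange) are hypothetical stand-ins for the SDK's COM-based interfaces (IAudioProcessor, IEditController, IParameterChanges), and the threading is elided; what it models is only the two separate delivery paths described above.

```cpp
#include <vector>

// Hypothetical stand-in for one normalized parameter change.
struct ParamChange { unsigned id; double normalized; };

// The processor only ever sees changes inside its process callback,
// via the queue the host fills for each audio block (audio thread).
struct StubProcessor {
    std::vector<ParamChange> seen;
    void process(const std::vector<ParamChange>& inputParameterChanges) {
        for (const auto& c : inputParameterChanges) seen.push_back(c);
    }
};

// The editor is told about host-originated changes one call at a
// time, on the UI thread.
struct StubEditController {
    std::vector<ParamChange> seen;
    void setParamNormalized(unsigned id, double value) {
        seen.push_back({id, value});
    }
};

// The host fans each change out to both sides separately.
struct StubHost {
    StubProcessor processor;
    StubEditController editor;
    std::vector<ParamChange> pending; // queue for the next process() call

    // A change originating in the host (e.g. automation playback)
    // goes to the processor via the queue AND to the editor directly.
    void hostChange(unsigned id, double value) {
        pending.push_back({id, value});       // -> processor, audio thread
        editor.setParamNormalized(id, value); // -> editor, UI thread
    }

    // A change originating in the plugin UI: the editor calls
    // beginEdit / performEdit / endEdit on the host, and the host
    // routes the value to the processor through the process queue.
    void beginEdit(unsigned) {}
    void performEdit(unsigned id, double value) { pending.push_back({id, value}); }
    void endEdit(unsigned) {}

    // One audio block: the process callback drains the queue.
    void runProcessBlock() {
        processor.process(pending);
        pending.clear();
    }
};
```

Note that in this sketch a UI-originated change is not echoed back to the editor through setParamNormalized: the editor already knows its own value, and the host only needs to forward it to the processor.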
