you’re not thinking far enough here. yes, you might still need to smooth the parameter changes coming from the GUI, but imagine how cool it would be if at least the motions that come from outside the plugin could stay unsmoothed. like you could have a sample-accurate tempo-synced LFO modulating anything in a plugin, or an envelope follower, where quick changes in the modulation signal are often desirable.
stamped events are the way to go then imo. i don’t think automation will be the main way people modulate parameters in the future. imagine modulators constantly doing something with plugin parameters: if that is solved by making the buffer small, the cpu can never take a break. and if that happens on almost every track in the project, you’d need a supercomputer to finish any reasonably complex piece of music.
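to make the stamped-events idea concrete, here’s a rough c++ sketch (every name here is made up for illustration) of a processor rendering a block in sub-chunks, split at the sample offsets of incoming events, so the cpu only does extra work where events actually occur:

```cpp
#include <vector>

struct ParamEvent { int sampleOffset; int paramId; float value; };

// stand-ins for the real dsp and parameter store (hypothetical):
void renderChunk (float* const* audio, int start, int numSamples);
void applyParameter (int paramId, float value);

// events assumed sorted by sampleOffset within the block
void processWithStampedEvents (float* const* audio, int numSamples,
                               const std::vector<ParamEvent>& events)
{
    int pos = 0;
    for (const auto& ev : events)
    {
        renderChunk (audio, pos, ev.sampleOffset - pos); // audio up to the event
        applyParameter (ev.paramId, ev.value);           // change lands exactly here
        pos = ev.sampleOffset;
    }
    renderChunk (audio, pos, numSamples - pos);          // tail after the last event
}
```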
edit: i just kept on reading the rest of the post and saw you actually considered most of this, so you are thinking far enough indeed. nice nice. so. roadmap pls <3
Really? Are you sure Reaper does not provide this?
Would be easy to test by moving the parameter via a CC and sending the same CC directly to the plugin.
This could be checked with JSFX plugins, though of course parameter modulation works differently there than in VST3. But if it isn’t provided for JSFX, it’s unlikely to be supported for VST3 either.
BTW: AFAIU, the first version of VST3 did not even allow for MIDI CC, and Steinberg tried (unsuccessfully) to force DAW and plugin providers to use sample accurate parameter modulation instead.
BTW2: AFAIU, with the VST3 API, parameter modulation ALWAYS is sample accurate. Of course DAWs and plugins (like JUCE) might fake the point in time to be the start (or center) of a block.
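For reference, this is roughly how a VST3 processor receives those changes (types are from the VST3 SDK; only handleChange is a hypothetical helper):

```cpp
using namespace Steinberg;

tresult PLUGIN_API MyProcessor::process (Vst::ProcessData& data)
{
    if (auto* changes = data.inputParameterChanges)
    {
        for (int32 i = 0; i < changes->getParameterCount(); ++i)
        {
            auto* queue = changes->getParameterData (i);
            if (queue == nullptr)
                continue;

            const Vst::ParamID id = queue->getParameterId();

            for (int32 p = 0; p < queue->getPointCount(); ++p)
            {
                int32 sampleOffset = 0;      // exact position within this block
                Vst::ParamValue value = 0.0; // normalised 0..1
                if (queue->getPoint (p, sampleOffset, value) == kResultTrue)
                    handleChange (id, sampleOffset, value);
            }
        }
    }

    // ... render audio; a plugin may honour the offsets or collapse them
    // to the block start, which is the "faking" mentioned above ...
    return kResultOk;
}
```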
Yes, I’ve checked that sample accurate automation works in Reaper for VST3 (and Jesusonic), but at least back when I was still actively following Reaper’s development (around 2020), they had not done anything for VST2 plugins to alleviate the automation update frequency issue. So, for example, if you have your audio hardware set to 1024 samples, VST2 plugin automation would only be updated every 1024 samples.
Anyway, VST2 plugins have become a moot point by now. New ones can’t be published by new developers and Steinberg is even removing VST2 support from Cubase.
The issue with JUCE now is: will we ever get sample accurate automation support for VST3, AU and AAX?
IMHO, sample accurate parameters would need to work exactly like MIDI events, as (1) this obviously matches their purpose, which is the same as MIDI CC; (2) programmers who already know how to do MIDI programming don’t need to relearn anything; (3) there’s no additional CPU overhead; (4) it’s independent of the plugin format. Maybe enhancing the MIDI event type appropriately might be the way to go (a sketch follows below).
Of course “standard” non-sample accurate parameter modulation should be provided in parallel.
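Just to illustrate what such an enhanced MIDI-like event type could look like (purely hypothetical, not any existing API): one time-stamped stream carrying both kinds of events, so code written MIDI-style handles both the same way.

```cpp
#include <cstdint>
#include <variant>
#include <vector>

struct MidiEvent   { std::uint8_t status, data1, data2; };
struct ParamChange { std::uint32_t paramId; float normalisedValue; };

struct StampedEvent
{
    int sampleOffset;                              // position within the block
    std::variant<MidiEvent, ParamChange> payload;  // either kind of event
};

using EventBuffer = std::vector<StampedEvent>;     // kept sorted, like a MIDI buffer
```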
OK, so I don’t need to do this. But AFAIU, “automation” (curves) is in the realm of the control path (GUI) thread, and not the track’s thread. Hence it always will be sloppy.
That is why I’d be interested to know how this behaves with parameter modulation from CCs in the track MIDI.
Is this even decently possible with VST2?
AFAIU, there are DAWs that under the hood split sample blocks to provide more accurate parameter automation for VST2. IMHO not really a good idea, as there will be plugins that do not like this at all.
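Roughly what such a host might do under the hood (a hedged sketch; AEffect comes from the VST2 SDK’s aeffect.h, while ParamPoint and the surrounding scaffolding are made up). It also shows why fussy plugins might dislike it: they suddenly get called with tiny, irregular block sizes.

```cpp
#include <vector>

struct ParamPoint { int sampleOffset; int paramIndex; float value; };

// run a VST2 effect in sub-blocks so each setParameter call lands near its
// intended sample; a real host would preallocate the pointer arrays.
void processSplit (AEffect* fx, float** in, float** out,
                   int numChannels, int numSamples,
                   const std::vector<ParamPoint>& points) // sorted by offset
{
    std::vector<float*> inPtrs (numChannels), outPtrs (numChannels);

    auto runChunk = [&] (int start, int num)
    {
        if (num <= 0)
            return;
        for (int ch = 0; ch < numChannels; ++ch)
        {
            inPtrs[ch]  = in[ch]  + start;  // advance channel pointers to the chunk
            outPtrs[ch] = out[ch] + start;
        }
        fx->processReplacing (fx, inPtrs.data(), outPtrs.data(), num);
    };

    int pos = 0;
    for (const auto& pt : points)
    {
        runChunk (pos, pt.sampleOffset - pos);
        fx->setParameter (fx, pt.paramIndex, pt.value); // applied at the split point
        pos = pt.sampleOffset;
    }
    runChunk (pos, numSamples - pos);
}
```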
No, plugin parameter automation wouldn’t work from the GUI thread; that would be an insane design IMHO. (At least as far as I’ve observed, the automation changes from host envelopes happen in the audio thread. But who knows, maybe there’s some host or plugin format that does it differently.)
MIDI CC messages are not the thing being discussed here. Those have already had sample accurate support, even for VST2. But they are of very little use, at least for effect plugins, because very few allow assigning MIDI CCs to their parameters. (Note that isn’t the same thing as the host allowing MIDI CC control of the parameters. The data flow is different there: the host captures incoming MIDI CC messages and turns them into regular plugin automation changes, which are subject to the same limitations as the track automation envelopes.)
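For what it’s worth, the sample accuracy of the CC path itself is easy to see in JUCE: MIDI arrives time-stamped in processBlock for every format, including VST2. Only scheduleCutoffChange is a hypothetical helper here; the controller number is just an example.

```cpp
void MyProcessor::processBlock (juce::AudioBuffer<float>& buffer,
                                juce::MidiBuffer& midi)
{
    for (const auto metadata : midi)
    {
        const auto msg = metadata.getMessage();

        if (msg.isController() && msg.getControllerNumber() == 74) // e.g. brightness CC
        {
            // metadata.samplePosition is the exact offset within this block,
            // so the change could be scheduled sample accurately.
            scheduleCutoffChange (metadata.samplePosition,
                                  msg.getControllerValue() / 127.0f);
        }
    }

    // ... render audio using the scheduled changes ...
}
```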
That is not quite correct. You can interfere with automation from the UI thread, which will necessarily lag behind (will be sloppy). But when you play back automation, it comes from the audio thread or a thread synchronised with the audio thread.
And MIDI CCs lack another property: an audio parameter knows how it should interpolate between two control values, which is not the case for MIDI CCs afaik (a tiny sketch follows below).
Plus the point @kamedin brought up, that in the default VST3 scenario the processor and the editor are separated, potentially even residing on physically different machines.
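On the interpolation point, a minimal sketch of what two automation points give you that two stepped CC values don’t (linear interpolation assumed here; hosts and plugins may use other curves):

```cpp
// two automation points imply a ramp; two CC values are just steps
float valueAt (int sample, int s0, float v0, int s1, float v1)
{
    if (sample <= s0) return v0;
    if (sample >= s1) return v1;

    const float t = float (sample - s0) / float (s1 - s0);
    return v0 + t * (v1 - v0); // every sample between the points gets a value
}
```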
I was talking about Reaper converting MIDI CCs to parameter modulation. I use this all the time for plugin parameters that aren’t exposed as CC receivers by the plugins themselves.
Here it’s nasty that we lose sample accuracy with VST2.
?? With the VSTi plugins I use, most relevant parameters are available in the UI as well as via DAW parameters and MIDI CCs. (But of course not ALL parameters.)
-Michael
Again, I am talking mostly about effect plugins, not instruments. Instrument plugins of course regularly have all kinds of MIDI CC control support built-in.
Hmm.
Having the DAW interpolate (smooth) the parameter moves would result in an insane number of micro-changes. IMHO the plugin’s audio algorithm is supposed to do this according to its needs.
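A minimal sketch of that plugin-side smoothing (juce::SmoothedValue is real JUCE; the gain parameter wiring, e.g. the std::atomic<float>* from AudioProcessorValueTreeState::getRawParameterValue, is just one common arrangement, shown here as processor members without the class body):

```cpp
juce::SmoothedValue<float> gainSmoothed;
std::atomic<float>* gainParam = nullptr; // e.g. from getRawParameterValue ("gain")

void prepareToPlay (double sampleRate, int /*samplesPerBlock*/)
{
    gainSmoothed.reset (sampleRate, 0.02); // 20 ms ramp, tuned to the algorithm's needs
}

void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    gainSmoothed.setTargetValue (gainParam->load()); // raw, unsmoothed host value

    for (int i = 0; i < buffer.getNumSamples(); ++i)
    {
        const float g = gainSmoothed.getNextValue(); // per-sample ramp to the target
        for (int ch = 0; ch < buffer.getNumChannels(); ++ch)
            buffer.getWritePointer (ch)[i] *= g;
    }
}
```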
-Michael
Why?
IMHO there should not be a relevant difference according to the API.
IMHO, an effect plugin is just an instrument that (usually) ignores MIDI note messages, while an instrument is an effect that (usually) acknowledges them and (usually) ignores audio input. Moreover, both might or might not send MIDI messages, which is why some formats additionally define the plugin class “MIDI Filter” for plugins that (usually) ignore audio in and don’t send audio out. To me it does not make any sense to differentiate these as “classes”.
-Michael
Why is this a problem?
Of course parameters need to be sent back and forth between the algorithm, the GUI, the DAW parameter interface and the DAW MIDI (CC) interface (and, e.g. with Reaper, a secondary “embedded” GUI).
I just did an example program exactly for this.
-Michael
I just meant that in VST3, parameter changes are communicated separately to the UI and the processing callback, and the synchronization is done by the host, not the plugin. We already talked about it. The problem with respect to the current implementation is that, in VST3, the current values for the UI thread and the audio thread can be different, as any change will propagate separately to each one. They’re not a single atomic accessed from both sides; they’re two separate values, each accessed exclusively from its own thread.
I did not see this (explicitly) in my sample project. I do internal synchronization between the GUI, the algorithm parameters (and a second GUI), and external synchronization from these to DAW parameter changes in and out, and MIDI CC in and out, with no problems.
But as this is just a proof of concept thingy, it might be that it’s not done in the most rigorous way possible.
-Michael
I mean it’s a problem in the current implementation of parameter management in JUCE, unless it’s forced by how things work in AU or AAX. In JUCE a parameter value has a single source of truth for the whole plugin, but it doesn’t in VST3, and that’s a good thing. The JUCE wrapper discards any parameter changes coming into the process callback whose values have already been set on the UI thread, and we have a single parameterValueChanged callback for both UI edits and automation. This forces the plugin to resynchronize, with atomics or queues. If the process callback received parameter changes synchronously, as it actually does in VST3, I wouldn’t have to keep a lockless FIFO or alternatively check all parameters with atomic loads.
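For anyone curious, that resynchronization might look something like this (a class-member sketch using JUCE’s real AbstractFifo and the AudioProcessorParameter::Listener callback; Change, applyChange and the capacity are illustrative):

```cpp
struct Change { int paramIndex; float newValue; };

juce::AbstractFifo fifo { 1024 };
std::array<Change, 1024> storage;

// called by JUCE for both UI edits and automation
void parameterValueChanged (int parameterIndex, float newValue) override
{
    const auto scope = fifo.write (1);
    if (scope.blockSize1 > 0) // dropped if full; a real plugin would handle overflow
        storage[(size_t) scope.startIndex1] = { parameterIndex, newValue };
}

// drained at the top of processBlock on the audio thread
void applyPendingChanges()
{
    const auto scope = fifo.read (fifo.getNumReady());

    for (int i = 0; i < scope.blockSize1; ++i)
        applyChange (storage[(size_t) (scope.startIndex1 + i)]); // hypothetical
    for (int i = 0; i < scope.blockSize2; ++i)
        applyChange (storage[(size_t) (scope.startIndex2 + i)]);
}
```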
I think you are perfectly right, and afaik AAX has that separation as well; I wouldn’t be surprised if AU has it too.
It is a weird assumption that processing time and presentation time are identical. In practice they can never be, thanks to the block-based approach. The only counterargument is that for UI events the timing is way coarser than a block anyway. But that argument falls apart if the change is not from the UI thread.
Of course I did use a FIFO in the testing project. I found this obvious, as different threads are used for the GUI and the algorithm.
Maybe dedicated support by JUCE for parameter transfer might be nice. But I can’t come up with a concrete suggestion right now. If somebody wants to take a look at the testing project, I can dig it out.
-Michael
Funny this discussion popped up; I was talking about exactly this with my colleague yesterday (without having seen it).
We had basically the same conclusion: the argument that “we can’t support it because some formats don’t allow it” is completely erroneous; the formats that don’t support per sample automation would just report a single value per block.
I’m curious how much community backing there is for the idea that VST2 (and whatever the defunct precursor to AAX was called) should just get dumped from JUCE. At what point do we realise that holding back the whole framework just to maintain support for outdated technology is a stupid idea? If you really need to support it, use an older version of JUCE; it’s not like the users who need VST2 support are running up-to-date OSes anyway.