Reporting a changed latency in VST plug-ins

Some plug-ins change their latency at runtime, but the current VST wrapper implementation does not support reporting the changed latency to the host during playback.

I have found that this can be easily achieved by changing the current:

    void audioProcessorChanged (AudioProcessor*)
    {
        updateDisplay();
    }

to

    void audioProcessorChanged (AudioProcessor* processor)
    {
        setInitialDelay (processor->getLatencySamples ());
        ioChanged ();
        updateDisplay();
    }

in juce_VST_Wrapper.cpp

This way, hosts that support latency changes during playback are notified of the new value in real time.

Nice!!

I’m wondering… in which cases can such a latency change event occur? In most of the DAWs I use (mainly Ableton Live and Reaper), when the buffer length changes it is because the user has just edited the properties of the audio soundcard panel in the host, and the audio was stopped. So by the time the latency change takes effect, prepareToPlay has already been called, before any processReplacing callback…

Can you tell me a specific case where this is wrong ? I have heard about some specific behaviour in Logic…

Hi Wolfen, you’re thinking of the audio device driver/buffering latency. This topic is relating to additional latency that a plugin may add to the audio.

Ooops, my bad :oops:

Good idea, thanks! I’ll sort this out today.

I noticed the change, and it seems that when audioProcessorChanged is called, you query the latency from the AudioProcessor owned by the wrapper (i.e. the “filter” member) rather than from the “processor” instance that was passed as the argument to the callback in the code I posted before.

I am aware that in all meaningful cases both pointers will refer to the same AudioProcessor instance. I was wondering if there are cases where this is not true and, if so, which of the two implementations is the “correct” one. Is there a particular reason that made you choose the internal member rather than the one that triggered the call?

[quote=“yfede”]I noticed the change, and it seems that when audioProcessorChanged is called, you query the latency from the AudioProcessor owned by the wrapper (i.e. the “filter” member) rather than from the “processor” instance that was passed as the argument to the callback in the code I posted before.

I am aware that in all meaningful cases both pointers will refer to the same AudioProcessor instance. I was wondering if there are cases where this is not true and, if so, which of the two implementations is the “correct” one. Is there a particular reason that made you choose the internal member rather than the one that triggered the call?[/quote]

Well, yes, in reality it will always be the same pointer, but I used the member variable because that’s the same object that a call to setInitialDelay will affect. If the ‘processor’ parameter is a pointer to some other object, then its latency is irrelevant to this instance of the plugin.

Despite having been the one who submitted the change, I am now building a plug-in whose presets have different processing latencies, and I’ve found that some big hosts, for example Sonar or Cubase, don’t play nicely with the fix above. The call to ioChanged() at unexpected times (i.e. during processing) seems to be the culprit: it forces those hosts to suspend() and then resume() after changing the latency, which results in a pause during playback. It is short, but it is clearly audible.

This said, what would you think about changing the code above as follows?

    void audioProcessorChanged (AudioProcessor* processor)
    {
        setInitialDelay (processor->getLatencySamples ());
        updateDisplay();
    }

thus removing the ioChanged() call?
Clearly, I suspect this would leave the old initial latency in place (it is called “initial” for a reason) until processing is suspended and resumed again.

We encountered the same problem (which sometimes also led to crashes), but moving setLatencySamples outside processBlock (using an AsyncUpdater) seemed to fix it.
AFAIK Ableton Live only changes the latency after receiving setInitialDelay and ioChanged calls, not when starting playback, so without the ioChanged call it would only pick up the latency when loading the plugin.

Chris

[quote=“ckk”]We encountered the same problem (which sometimes also led to crashes), but moving setLatencySamples outside processBlock (using an AsyncUpdater) seemed to fix it.
AFAIK Ableton Live only changes the latency after receiving setInitialDelay and ioChanged calls, not when starting playback, so without the ioChanged call it would only pick up the latency when loading the plugin.

Chris[/quote]

Thank you, this is helpful. Could you elaborate further about the implementation that uses the AsyncUpdater?

If I understand correctly, you added it to the VST wrapper, as that seems to be the only place where you could reasonably call ioChanged() from handleAsyncUpdate(), is this correct?

Could you post some code?

That might be even better. :slight_smile:
We only did something like this in one plugin:

    void processBlock ([...])
    {
        [...]
        if (latencychanged)
        {
            // setLatencySamples (newLatency); [OLD, caused crashes]
            internalLatency = newLatency;
            triggerAsyncUpdate();
        }
    }

    void handleAsyncUpdate()
    {
        setLatencySamples (internalLatency);
    }

But it should be fairly easy to make JuceVSTWrapper inherit from AsyncUpdater and put ioChanged inside handleAsyncUpdate.

Chris

Ah yes, I didn’t read your message carefully, it was clear already :slight_smile:

I’ll give a shot to this and let you know if it works even when it’s put in the wrapper

Well, it still causes the audio to pause when the ioChanged() is performed, but at least some nasty crashes and behaviours are fixed now.

One, in particular, was quite annoying: when loading a session in Cubase where the plug-in state implied a latency different from the one of the plug-in in its “just inserted” state, the whole session didn’t output any sound when played even if the meters in the GUI moved as if playing was actually happening.

This problem has been solved using the deferred call to ioChanged() with the AsyncUpdater: it was caused by multiple ioChanged() calls due to repeated latency changes during the instantiation process. By using the AsyncUpdater, all these latency changes get “squashed” into a single ioChanged() at the end of the instantiation, so this is how I do it now.

    void audioProcessorChanged (AudioProcessor* processor)
    {
        setInitialDelay (processor->getLatencySamples());
        triggerAsyncUpdate();
        updateDisplay();
    }

    void handleAsyncUpdate()
    {
        ioChanged();
    }

Jules, what do you think about it? Obviously, the main VST wrapper class has become an AsyncUpdater, like this:

    class JuceVSTWrapper : public AudioEffectX, private Timer, private AsyncUpdater, public AudioProcessorListener, public AudioPlayHead

[quote=“yfede”]One, in particular, was quite annoying: when loading a session in Cubase where the plug-in state implied a latency different from the one of the plug-in in its “just inserted” state, the whole session didn’t output any sound when played even if the meters in the GUI moved as if playing was actually happening.[/quote]

Thanks, this is exactly the issue I have now. The fix seems to work fine in Cubase!

Jules, can you have a look at this? Alternatively, the fix could be applied only for Cubase and Nuendo, so as not to affect other hosts that may behave differently:

    void audioProcessorChanged (AudioProcessor*)
    {
        setInitialDelay (filter->getLatencySamples());

        const PluginHostType host (getHostType());

        if (host.isCubase() || host.isNuendo())
        {
            triggerAsyncUpdate();
        }
        else
        {
            ioChanged();
        }

        updateDisplay();
    }

bump

I hear you! Can't see any issues with making this asynchronous, I'll add that asap, thanks!