How to report Plugin latency?


I'm having a problem: I'm using several IIR filters, and processing all of them introduces an expected delay. Now I am trying to call setLatencySamples() to report this delay, but it does not seem to work.

On playback there is enough wait time between each audio block that it works fine, but if the audio is bounced offline, the filters' processing is lost.

In Logic there is no apparent change when calling setLatencySamples(). In Reaper there is even weirder behavior: when playing audio there is a visible/audible change, where everything else starts playing but the channel with the plugin kicks in after the delay, and when rendering, the output is not filtered.

Please help.

Set it Asynchronously


Wow, your response was fast. Thank you Rail.

How do you set it Asynchronously?

Do you mean the setLatencySamples()?


Which version of JUCE are you using? Depending on the version, the propagation of setLatencySamples might behave differently. For instance, whether or not you have this change could affect the behavior:

Also, you might try stepping into the setLatencySamples() call, in particular updateHostDisplay(), to see whether the change is taking effect in the JUCE plugin wrapper layer.

When are you setting the latency? We report it from prepareToPlay(), and also when/if the user selects a Low Latency mode option, but not during playback. It has to be set before playback starts.

There are some plugins which change their latency dynamically based on the parameters/algorithm… I assumed the OP was trying to set the latency from inside processBlock(), which can be done asynchronously. The easiest way to do that is to derive the AudioProcessor class from AsyncUpdater and keep a member variable for the delay length… then in processBlock() call triggerAsyncUpdate(), and in handleAsyncUpdate() call setLatencySamples().

If your latency is fixed you can call setLatencySamples() from prepareToPlay().
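The deferred-update pattern described above can be sketched like this. This is a minimal mock, not the real JUCE classes: here `triggerAsyncUpdate()` calls the handler directly, whereas JUCE's `AsyncUpdater` would schedule `handleAsyncUpdate()` on the message thread, and `reportedLatency` stands in for what `setLatencySamples()` would receive:

```cpp
#include <atomic>

// Stand-in for the relevant parts of juce::AudioProcessor + juce::AsyncUpdater.
struct LatencyReportingProcessor
{
    std::atomic<int> pendingLatency { 0 };
    int reportedLatency = 0;   // mock for what setLatencySamples() would receive

    // Audio thread: never call setLatencySamples() from here directly.
    void processBlock (int latencyOfCurrentAlgorithm)
    {
        if (latencyOfCurrentAlgorithm != pendingLatency.load())
        {
            pendingLatency.store (latencyOfCurrentAlgorithm);
            triggerAsyncUpdate();   // real code: inherited from juce::AsyncUpdater
        }
    }

    // Message-thread callback in the real AsyncUpdater.
    void handleAsyncUpdate()
    {
        reportedLatency = pendingLatency.load();   // real code: setLatencySamples (pendingLatency)
    }

    // Mock: synchronous for the sketch; JUCE would defer this to the message thread.
    void triggerAsyncUpdate() { handleAsyncUpdate(); }
};
```

The atomic member is what lets the audio thread hand the value over without locking; the host only ever sees the latency change from the message thread.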



JUCE 7.0.5 on a Mac Studio. I guess it includes the change already.

I tried setting the latency from the processor constructor, from prepareToPlay(), and from processBlock() itself. Same results.

Actually, after checking the behavior in detail, I see a similar result on both DAWs on playback.

setLatencySamples() is called at the start of prepareToPlay(). When playing the track in Reaper, the audio starts ahead by the number of samples defined, not playing the initial samples. And when playing in Logic, the initial silence is the only portion not played: Logic jumps to the beginning of the sound, disregarding the number of samples defined.

The latency setting has no effect on what's sent to processBlock()… it's used in the host's delay compensation to delay other tracks (if necessary) to compensate for the delay in the plugin chain.



One sanity check you could do is to check the DAW's view of the latency for the plugin. In Logic you can do this by hovering over the insert and viewing the latency in samples. This is the latency that the DAW expects to be reported by the plugin via setLatencySamples(). If this latency doesn't match what you expect, then there might be something wrong with the way you are calling setLatencySamples(), and you'll have to look into why that value isn't making it all the way through (because then the DAW's delay compensation won't work as you expect).

However, if this value is what you expect, then the issue might actually be with the plugin implementation and its assumptions about how the latency should be working, which would be a different path of debugging.

Thank you Rail and Jack

I am able to see the latency correctly in Logic, but my problem isn't fixed by any value I set, so the origin of the problem is beyond my comprehension. Let me describe it concisely…

The final product is a plugin. Actually you can check it here… Chimerator

Basically it has several IIR filters, and at the same time some frequency visualization (not relevant) and a dry/wet mixer.

The filters are set up in prepareToPlay(), and in processBlock() each one calls .process().

Then, for the dry/wet mix, it is necessary to store a copy of the buffer before filter processing, to use later at the end of processBlock().

The plugin works fine on playback. However, the problem appears when bouncing the processed track offline (without hearing it): the filters do not affect the bounced track, just the mixing phase. In other words, at the mixing phase, the signal that is supposed to be filtered is just the original one.

It sounds as if your filters are checking somewhere whether the transport is running in real time or not. That part, at least, doesn't sound related to latency, but to whether it's running in real time. I'd look through the code for such a check anywhere, and see if it could possibly affect the output of the filters.

Or maybe you’re checking for isPlaying and the host isn’t setting that flag when bouncing offline?

Another idea comes to mind: what if the buffer(s) your filters are being passed are not writeable when not in real time? You may be assuming the input and output buffers are the same, but they're not in that case, and your filters don't write where you think they are writing.

Just a few random thoughts; no idea if any hosts do that or if your code would handle it properly or not.

… look for AudioProcessor::isNonRealtime()
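If any of the DSP code did branch on that flag, it would look something like this. This is purely illustrative: in JUCE the host-facing query is `AudioProcessor::isNonRealtime()`, mocked here as a plain bool, and the bypass bug is invented to show the symptom being described:

```cpp
// Illustrative only: a processor that (wrongly) skips its work when the host
// flags an offline render. JUCE's real query is AudioProcessor::isNonRealtime().
struct SuspectProcessor
{
    bool nonRealtime = false;    // set by the host before an offline bounce

    float processSample (float x) const
    {
        if (nonRealtime)         // the kind of check to grep for
            return x;            // hypothetical bug: filter bypassed offline
        return x * 0.5f;         // normal (filtered) path
    }
};
```

A check like this would produce exactly the reported symptom: correct output on playback, unfiltered output on an offline bounce.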


Thank you again for the fast response.

Well, there is no call to isPlaying() or isRealtime() anywhere in the code.

I disabled the wet/dry mix code to check whether it somehow created the problem, and it continues, so the problem is at the filter level. Somehow, when in offline mode, either they do not process, or they take too long to process and the result never arrives in time.

Possible clues: there are approximately 50 filters (maybe there is a limit).
And the call to process each one in processBlock() goes like this…

EDIT: After reducing the number of filters, the problem persists.

I doubt it has anything to do with latency. Maybe you are making an implicit assumption about the mapping of input/output blocks (e.g. input block = output block, in place), which is true in realtime but may be handled differently by the host in offline mode? Just a shot in the dark, though.

Well, this is the basic original structure within processBlock():

scratchBuffer.copyFrom (buffer)                        // keep the dry signal
filters.process (ContextReplacing (buffer))            // wet, in place
buffer = wet (buffer) + dry (scratchBuffer)            // mix via the write pointers

It was changed to the following to check:

scratchBuffer.copyFrom (buffer)                        // copy first
filters.process (ContextReplacing (scratchBuffer))     // wet into the scratch buffer
buffer = wet (scratchBuffer) + dry (buffer)            // mix via the write pointers

But the same problem continues to happen.
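For what it's worth, the second variant can be sketched without any JUCE types, with a plain gain stage standing in for the filter chain (`wetMix` is an illustrative parameter, not from the original post):

```cpp
#include <vector>
#include <cstddef>

// Sketch: process the wet path in a scratch copy, so the original (dry)
// input is never assumed to alias the output buffer.
std::vector<float> processWithDryWet (const std::vector<float>& input,
                                      float wetMix)   // 0 = fully dry, 1 = fully wet
{
    std::vector<float> scratch = input;               // copy for the wet path

    for (float& s : scratch)                          // stand-in for filters.process()
        s *= 0.5f;                                    // e.g. a -6 dB "filter"

    std::vector<float> out (input.size());
    for (std::size_t i = 0; i < input.size(); ++i)
        out[i] = wetMix * scratch[i] + (1.0f - wetMix) * input[i];

    return out;
}
```

Since this structure never writes through the input until the final mix, it is immune to any in-place aliasing assumption the host might break in offline mode.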

Found the problem. Nothing to do with processBlock or latency.

In case this happens to other people…

prepareToPlay() had only a basic filter initialization; the real one happened later, when the control variables changed and updateParameters() was called.

The solution was simple: call updateParameters() at the end of prepareToPlay().
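A minimal sketch of that fix. The names mock the poster's description: `updateParameters()` is their own helper, shown here applying a hypothetical user-set coefficient that the basic initialization would otherwise overwrite:

```cpp
// Mock of the bug and its fix: prepareToPlay() did only a "basic" filter
// initialization, while the real coefficients were applied by
// updateParameters() (normally driven by parameter changes).
struct Plugin
{
    float coefficient   = 0.0f;   // filter state actually used for processing
    float userParameter = 0.7f;   // hypothetical value set by the DAW before a bounce

    void updateParameters() { coefficient = userParameter; }

    void prepareToPlay()
    {
        coefficient = 1.0f;       // basic filter initialization (the stale value)
        updateParameters();       // the fix: re-apply the real parameters here
    }
};
```

Without the final call, an offline bounce that goes straight from prepareToPlay() into processBlock() would run the filters with the basic (stale) coefficients; with it, the first processed block already uses the real ones.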

Thank you all for the help.