How to measure latency generated by one (or several) plugins

Is there any reasonable way to make my plugin recognise latency caused by other plugins (the ones before my plugin in the chain) and measure the size of that latency?

This is the responsibility of the host. It can query each plugin with AudioProcessor::getLatencySamples(). But:
a) nobody knows if every plugin sets this value correctly, and
b) your plugin has no access to that data, since it would require too much knowledge about the project setup…
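
To illustrate what the host conceptually does (this is not JUCE code; `ChainedPlugin` and `latencyBeforePlugin` are made-up names for the sketch), it queries each plugin and sums the reported latencies along a serial chain:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for a plugin that reports its latency,
// the way AudioProcessor::getLatencySamples() does in JUCE.
struct ChainedPlugin
{
    int latencySamples = 0;
    int getLatencySamples() const { return latencySamples; }
};

// For a serial chain, the delay accumulated in front of a given plugin
// is the sum of the reported latencies of all plugins before it.
int latencyBeforePlugin (const std::vector<ChainedPlugin>& chain, std::size_t index)
{
    int total = 0;
    for (std::size_t i = 0; i < index && i < chain.size(); ++i)
        total += chain[i].getLatencySamples();
    return total;
}
```

The delay sitting in front of your plugin would be `latencyBeforePlugin (chain, yourIndex)`, which is exactly the number your plugin cannot compute on its own, because only the host sees the whole chain.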

Is it really only that way?
Don’t you know of anything easier? :slight_smile:
In my case it could be managed by two plugins. I mean a situation like this:
I have another plugin coded by myself at the beginning of the chain (let’s call it myFirstPlugin), then other third-party plugins, and at the end myLastPlugin. Is there any way to make myFirstPlugin communicate with myLastPlugin and let them measure the latency between them?

You might be able to make some kind of system where your first plugin in the chain generates a “ping” audio signal that your last plugin in the FX chain could recognise and attempt to measure how much it was delayed(*), but I would predict that would not work ideally. What are you actually trying to do?

(*) It’s obviously tricky to determine what time you would measure against…
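
A stand-alone sketch of that ping idea (all names invented; `delayBy` stands in for the third-party chain): inject a single-sample impulse and locate the peak of what comes back. This assumes the chain keeps the impulse reasonably peaked, which an EQ, reverb, or limiter may well not:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <iterator>
#include <vector>

// Toy stand-in for an effect chain that introduces latency:
// here it simply delays the signal by a fixed number of samples.
std::vector<float> delayBy (const std::vector<float>& in, std::size_t delay)
{
    std::vector<float> out (in.size() + delay, 0.0f);
    for (std::size_t i = 0; i < in.size(); ++i)
        out[i + delay] = in[i];
    return out;
}

// Estimate latency by locating the peak of the chain's response
// to a single-sample impulse.
std::size_t estimateLatencyByImpulse (const std::vector<float>& response)
{
    auto peak = std::max_element (response.begin(), response.end(),
                                  [] (float a, float b)
                                  { return std::abs (a) < std::abs (b); });
    return static_cast<std::size_t> (std::distance (response.begin(), peak));
}
```

For a pure delay this recovers the exact latency; for anything that smears the impulse, the peak position is only an estimate, which is the footnote's point.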

I assume the question is related to your Linear Phase EQ thread over here?

If you want to measure the impulse response of an EQ to build up something like Waves Q-Clone for EQ plugins (not sure what exactly you plan…), I think the easiest solution from an end-user’s standpoint would be to make your plugin able to host the EQ plugin you measure, so that your plugin is the “host” and has access to all the information you don’t otherwise get from the host. But as I said, I’m not completely sure I really understood your idea :wink:

Yes you are right, but I tried to avoid building my own host. I believe it’s a lot of work :slight_smile:

Hi, I was looking to get the latency of third-party plugins (to expose it to the user like Cubase or Logic do in the GUI of the channel strip in the mixer).
I tried getLatencySamples(), but it doesn’t work (I instantiated for example an Ozone 9, which has A LOT of latency, and it returns “0”)… Does anyone know how to do it?

As far as I know, getLatencySamples() is not for you to check latency; it is called by the host, which uses it to calculate latency and “expose it to the user like Cubase or Logic do in the GUI of the channel strip in the mixer”.

You can define what `getLatencySamples()` will return by calling `setLatencySamples()`.
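
As a sketch of that contract (a stand-in class, not real JUCE code; `FakeProcessor` and `prepareFIR` are invented): the plugin updates its reported latency with `setLatencySamples()` whenever its internal delay changes, and the host reads it back:

```cpp
// Stand-in mimicking the AudioProcessor latency contract: the plugin
// reports its own delay; the host only ever reads the reported value.
class FakeProcessor
{
public:
    void setLatencySamples (int newLatency) { latency = newLatency; }
    int getLatencySamples() const          { return latency; }

    // Example: a symmetric FIR of N taps delays the signal by (N - 1) / 2
    // samples, so a linear-phase EQ would report that as its latency.
    void prepareFIR (int numTaps) { setLatencySamples ((numTaps - 1) / 2); }

private:
    int latency = 0;
};
```

Note the consequence: before the plugin has been prepared and has called `setLatencySamples()`, the reported value is simply whatever it was initialised to (typically 0), which matches what you saw with Ozone.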

OK, so in my case I need to calculate how much time has passed from one processBlock call to another and then pass it to setLatencySamples() to get this result in getLatencySamples()?

You are hosting the plugin, right? In this case getLatencySamples() is the correct thing to do.

Chances are, the value is not set before calling prepareToPlay(), because the latency might depend on the values you specified to prepareToPlay, e.g. block size.


Arya, I don’t think that idea would work, if I’m understanding you. The processBlock() calls are not perfectly timed in relation to the audio, like some kind of hardware effects box chain where the audio flows continually through wires. The processBlock() functions only have to complete their work in time for the host to (at some point in the near future) feed the resulting samples to the audio output (or the next plugin in the chain).

The latency is used by the host to compute which samples to send to a plugin in the chain, not to compute when to make a processBlock() call. Over time, the average time between calls in relation to the average number of samples per call evens out to equal the overall sample rate at the output, but the processing is done in little bursts by each plugin, and when they are called is not directly related to the latency of any given plugin. Latency just tells the host how many samples delayed an output sample is relative to the input sample that generated it, not how many milliseconds pass between processBlock() calls.

(Sorry if I misunderstood your question and this just sounds like ranting.)

Like @HowardAntares already pointed out, processing time is not the same as processed time.
If you talk about time, it is always number of samples in the audio stream, never the wall clock.

If you want to measure the latency, you need to feed a known signal and observe the processed stream. Simply count how many samples delayed the signal comes back out.
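
A sketch of that counting idea (invented names, assuming you can capture the processed stream offline): cross-correlate the known test signal with the output and take the lag with the best match. This is more robust than a bare impulse when the chain smears transients:

```cpp
#include <cstddef>
#include <vector>

// Estimate the delay of `processed` relative to `reference` by finding
// the lag with the highest cross-correlation. Works for any known test
// signal; noise bursts are more robust than a single impulse.
std::size_t estimateDelayByCrossCorrelation (const std::vector<float>& reference,
                                             const std::vector<float>& processed,
                                             std::size_t maxLag)
{
    std::size_t bestLag = 0;
    float bestScore = -1.0e30f;

    for (std::size_t lag = 0; lag <= maxLag; ++lag)
    {
        float score = 0.0f;
        for (std::size_t i = 0; i < reference.size(); ++i)
            if (i + lag < processed.size())
                score += reference[i] * processed[i + lag];

        if (score > bestScore) { bestScore = score; bestLag = lag; }
    }
    return bestLag;
}
```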

But that is more for a workflow like Plugin Doctor, not for regular operation.

I’m just trying to implement this

OK, clear. But I’m calling getLatencySamples() in a timer callback to update a label, so I’m sure that at some point prepareToPlay of the processor in question has been called… right?

If you are hosting the plugin yourself, it’s your responsibility to call prepareToPlay on the plugin instance. But it’s not entirely clear what you are actually doing. Are you hosting the 3rd-party plugin yourself?

Yes, but I instantiated an AudioProcessorGraph in my AudioProcessor and then load the plugins into that graph. My prepareToPlay up to now is this:

    graph.setPlayConfigDetails (getMainBusNumInputChannels(), getMainBusNumOutputChannels(), sampleRate, samplesPerBlock);
    graph.prepareToPlay (sampleRate, samplesPerBlock);
    if (getPlayHead())
        for (AudioProcessorGraph::Node* node : graph.getNodes())
            node->getProcessor()->setPlayHead (getPlayHead());

When a plugin is added to the graph, it is the graph that calls prepareToPlay on it.

I’m building a plugin that hosts third-party plugins inside it (think of it as a JUCE AudioPluginHost that can be loaded into DAWs as a plugin).

Without a specific known signal, such as a pulse at a known sample position, if the hosted plugin does not report its latency accurately to your hosting plugin, there is no way to know what its latency is. It’s completely unrelated to when your hosting plugin’s processBlock() is called; that’s under the control of the actual host (DAW). The latency is a measure only of the number of samples later in the timeline that an output value is generated for a given input sample, not the length of time between processBlock() calls.

And setting the latency “live” (while the transport is running) might not even work; that’s not something I’ve looked at. As far as I know, when you change the latency reported (by your hosting plugin), the host can’t change anything during actual playback. It might be able to do so between its calls to your processBlock(), but I don’t think that’s a requirement hosts have to follow, and I don’t know whether any hosts allow it. They need to recompute where in the timeline the next buffer will come from, which should be computed before the transport starts; otherwise it would cause breaks in the sample position of the host’s playhead.

So what I think you really need is accurate latency reporting from the hosted plugin(s) to be able to implement the latency compensation yourself internal to your hosting plugin. Without that, you could only do this via sending a known signal through and detecting the output of that known (but presumably altered) signal at the output of the hosted plugin, which would be a completely separate step in a process from trying to do this while processing any actual audio.
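
If you do get accurate reports, the internal compensation itself is just a delay line on the lower-latency path (a minimal stand-alone sketch, not JUCE code; `CompensationDelay` is an invented name):

```cpp
#include <cstddef>
#include <deque>

// Simple sample delay line used to compensate a parallel path: if the
// hosted plugin reports N samples of latency, delaying the other path
// by N samples keeps the two paths time-aligned.
class CompensationDelay
{
public:
    explicit CompensationDelay (std::size_t latencySamples)
        : fifo (latencySamples, 0.0f) {}

    // Push one input sample, pop the sample from latencySamples ago.
    float process (float input)
    {
        fifo.push_back (input);
        float out = fifo.front();
        fifo.pop_front();
        return out;
    }

private:
    std::deque<float> fifo;
};
```

In a real plugin you would use a preallocated circular buffer per channel rather than a std::deque, to avoid allocations on the audio thread, but the principle is the same.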

Even with a known signal, it might not be possible to do more than estimate the latency, unless you could guarantee the ability to recognize which input sample led to which output sample. A phase shift or delay or anything that transformed the input signal’s timing would break your ability to know what was caused by latency and what was just an artifact of the plugin’s processing of that signal.


Yes, for my purpose I only need to get the number of samples of latency of my graph and of each node inside it… but I don’t know how to do it. At the moment I only want to print that number on a label…
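
In JUCE itself you would iterate graph.getNodes() after prepareToPlay has run and call getLatencySamples() on each node->getProcessor(). How the per-node numbers combine into a graph total can be sketched stand-alone (an assumed model, with invented helper names: nodes in series add, parallel branches take the maximum, since the shorter branches must be delay-compensated up to the longest):

```cpp
#include <algorithm>
#include <vector>

// Nodes in series: latencies simply accumulate.
int serialLatency (const std::vector<int>& nodeLatencies)
{
    int total = 0;
    for (int l : nodeLatencies)
        total += l;
    return total;
}

// Parallel branches: the total is the worst branch, because the others
// must be delayed to match it before the signals are summed again.
int parallelLatency (const std::vector<std::vector<int>>& branches)
{
    int worst = 0;
    for (const auto& branch : branches)
        worst = std::max (worst, serialLatency (branch));
    return worst;
}
```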