I’ve mentioned this a few times before, but I’ve now managed to implement some code that greatly improves the latency reporting on macOS.
After testing the reported and actual latencies on many devices and systems, I’ve found two problems with the latencies reported on macOS:
- The actual latency is considerably higher than the latencies reported by the device
- The actual latency changes every time a device property changes (e.g. buffer size or sample rate)
I’ve drafted some code using the input/now/output timestamps that CoreAudio provides, which you can see here: CoreAudio: Changed latency reporting to update during audio callbac… · Tracktion/JUCE@d6f747b · GitHub
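To illustrate the idea, here’s a minimal sketch (not the code from the commit above; the `ioProc` and `LatencyEstimate` names are hypothetical). CoreAudio passes an `AudioDeviceIOProc` three `AudioTimeStamp`s per callback, and the distance from the input timestamp to "now", and from "now" to the output timestamp, gives the effective input and output latencies in samples:

```cpp
#include <CoreAudio/AudioHardware.h>
#include <atomic>

struct LatencyEstimate
{
    std::atomic<double> inputLatencySamples  { 0.0 };
    std::atomic<double> outputLatencySamples { 0.0 };
};

// Illustrative IOProc: derive the real latencies from the timestamps
// CoreAudio hands us, rather than trusting the device properties.
static OSStatus ioProc (AudioObjectID          /*device*/,
                        const AudioTimeStamp*  now,
                        const AudioBufferList* /*inputData*/,
                        const AudioTimeStamp*  inputTime,
                        AudioBufferList*       /*outputData*/,
                        const AudioTimeStamp*  outputTime,
                        void*                  clientData)
{
    auto& estimate = *static_cast<LatencyEstimate*> (clientData);

    if ((now->mFlags        & kAudioTimeStampSampleTimeValid) != 0
     && (inputTime->mFlags  & kAudioTimeStampSampleTimeValid) != 0
     && (outputTime->mFlags & kAudioTimeStampSampleTimeValid) != 0)
    {
        // The input buffer was captured this many samples before "now"...
        estimate.inputLatencySamples.store  (now->mSampleTime - inputTime->mSampleTime);

        // ...and the output buffer will hit the hardware this many samples after "now".
        estimate.outputLatencySamples.store (outputTime->mSampleTime - now->mSampleTime);
    }

    return kAudioHardwareNoError;
}
```

Note that the total, `outputTime->mSampleTime - inputTime->mSampleTime`, doesn’t involve the "now" timestamp at all, so any jitter in "now" moves samples between the input and output figures but cancels out of their sum.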
This seems to get the reported latency to within ~1ms of the measured latency for most devices.
This approach has the slight downside that the individual input and output latencies will jitter a bit, but in my experience only by a few samples, and the total is always the same. Since you usually care about the combined input + output latency, this doesn’t really matter.
What are the JUCE team’s thoughts on this? Is it likely to make it into JUCE, or can they think of a better approach?