Hello mates,
I’ve written an audio application with the JUCE framework that uses a Bluetooth speaker as the output device. I need to determine the audio latency of the Bluetooth speaker. How can I achieve this?
Thanks
Record everything using a separate system: in your test rig, play a tone while logging timestamps, then compare the timestamps against the recorded audio and measure the latency that way.
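To illustrate the playback half of that, here’s a minimal sketch (the class name, click rate and log format are placeholders of mine, not anything from this thread) that writes a click into the output once per second and logs the host time at which that buffer was handed to the driver; the separate recorder then captures when the click actually leaves the speaker. Note the callback name: JUCE 7 uses audioDeviceIOCallbackWithContext(), older versions use audioDeviceIOCallback().

```cpp
#include <juce_audio_devices/juce_audio_devices.h>

// Plays a one-sample click once per second and logs when each click's buffer
// was handed to the device. The acoustic arrival time comes from the separate
// recording; the difference between the two is the latency.
class ClickLogger : public juce::AudioIODeviceCallback
{
public:
    void audioDeviceAboutToStart (juce::AudioIODevice* device) override
    {
        sampleRate = device->getCurrentSampleRate();
        samplesUntilClick = 0;
    }

    void audioDeviceStopped() override {}

    void audioDeviceIOCallbackWithContext (const float* const*, int,
                                           float* const* output, int numOutputChannels,
                                           int numSamples,
                                           const juce::AudioIODeviceCallbackContext&) override
    {
        for (int ch = 0; ch < numOutputChannels; ++ch)
            juce::FloatVectorOperations::clear (output[ch], numSamples);

        samplesUntilClick -= numSamples;

        if (samplesUntilClick <= 0)
        {
            for (int ch = 0; ch < numOutputChannels; ++ch)
                output[ch][0] = 1.0f;  // the click, at the start of this buffer

            DBG ("click queued at " << juce::Time::getMillisecondCounterHiRes() << " ms");
            samplesUntilClick += (int) sampleRate;  // roughly one click per second
        }
    }

private:
    double sampleRate = 48000.0;
    int samplesUntilClick = 0;
};
```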
There is an open thread about this: [How to detect if a bluetooth headset plugged or not]. Implementation options seem to exist, yet there is no answer so far. I would recommend continuing that thread, since it already contains some insight, even if not much.
I can’t do it with a test. I want to get it from the BT device itself, or from JUCE/Windows/Device Manager, etc.
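For what it’s worth, JUCE does expose the latency that the driver reports for the current device. Whether that number includes the Bluetooth link at all depends on the OS and driver (for A2DP it frequently doesn’t), so treat it as a lower bound rather than the answer. A minimal sketch, assuming an already-initialised AudioDeviceManager called deviceManager:

```cpp
// Query the driver-reported output latency of the current device.
// 'deviceManager' is assumed to be an initialised juce::AudioDeviceManager.
if (auto* device = deviceManager.getCurrentAudioDevice())
{
    const int latencySamples = device->getOutputLatencyInSamples();
    const double latencyMs   = 1000.0 * latencySamples / device->getCurrentSampleRate();
    DBG ("driver-reported output latency: " << latencyMs << " ms");
}
```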
Here is the GPT answer; what do you think about this?

a. In your audio callback (AudioIODeviceCallback::audioDeviceIOCallback()), record the time at which the audio data is received for processing, using std::chrono or JUCE’s Time class. This will be your “input timestamp.”
b. When the audio data is actually played through the Bluetooth speaker (output), obtain the current time again and record it as the “output timestamp.”
c. Calculate the time difference (latency) between the input timestamp and the output timestamp. This difference will give you the audio latency.

This (the idea that you can simply grab an “output timestamp” in step b) is where it hallucinates…
Latency with Bluetooth headphones can be quite a ride. Typically it is (slowly) variable, and the headphones may regulate it depending on link quality and other things. In other words, the finer details of Bluetooth audio are a huge dumpster fire and I can’t recommend getting too close to it. Dangerous fumes warning! I’ve worked on the device side for a few years and never met anyone who truly understands what’s going on (most of it is a black box well hidden somewhere in Qualcomm’s vaults, as they make basically all the BT audio chips, and the nasty stuff is hidden in their closed-source, pre-certified firmware).
Most modern models will report their latency to the host, which can be used, for example, to sync video playback. But I have no idea how to get that information and work with it. What precision do you need? From a gut feeling, I would say stay away from any idea that requires better than 10-20 ms accuracy and instead find something more fun to do.
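What you can get at, on Windows at least, is the stream latency the driver reports via WASAPI’s IAudioClient::GetStreamLatency(); whether that covers the over-the-air part of a Bluetooth endpoint is another question entirely, so treat the figure with suspicion. A minimal, error-handling-free sketch:

```cpp
#include <windows.h>
#include <mmdeviceapi.h>
#include <audioclient.h>
#include <iostream>

int main()
{
    CoInitializeEx (nullptr, COINIT_MULTITHREADED);

    // Default render endpoint (e.g. the Bluetooth speaker, if it is the default).
    IMMDeviceEnumerator* enumerator = nullptr;
    CoCreateInstance (__uuidof (MMDeviceEnumerator), nullptr, CLSCTX_ALL,
                      __uuidof (IMMDeviceEnumerator), (void**) &enumerator);

    IMMDevice* device = nullptr;
    enumerator->GetDefaultAudioEndpoint (eRender, eConsole, &device);

    IAudioClient* client = nullptr;
    device->Activate (__uuidof (IAudioClient), CLSCTX_ALL, nullptr, (void**) &client);

    // GetStreamLatency() is only valid after the client has been initialised.
    WAVEFORMATEX* format = nullptr;
    client->GetMixFormat (&format);
    client->Initialize (AUDCLNT_SHAREMODE_SHARED, 0, 0, 0, format, nullptr);

    REFERENCE_TIME latency = 0;              // 100-nanosecond units
    client->GetStreamLatency (&latency);
    std::cout << "reported stream latency: " << latency / 10000.0 << " ms\n";

    CoTaskMemFree (format);
    client->Release();
    device->Release();
    enumerator->Release();
    CoUninitialize();
}
```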
As a Qualcomm developer, I can tell you that the only way to truly know the latency is to measure it.
Bluetooth is like the weather - it’s susceptible to the environment.
Log everything with timestamps, record a separate side-channel of audio, send ticks over that channel, compare the audio receipt times with the logged transmission times. The difference is your latency.
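The analysis side can be as simple as scanning the recorded side-channel for the first sample that crosses a threshold. A minimal sketch using JUCE’s format readers (file name, channel choice and threshold are placeholders):

```cpp
#include <juce_audio_formats/juce_audio_formats.h>
#include <cmath>
#include <memory>

// Returns the position (in ms) of the first sample in the recording that
// crosses 'threshold', i.e. the arrival of the tick; -1 if none was found.
double findFirstTickMs (const juce::File& wavFile, float threshold = 0.5f)
{
    juce::AudioFormatManager formats;
    formats.registerBasicFormats();

    std::unique_ptr<juce::AudioFormatReader> reader (formats.createReaderFor (wavFile));
    if (reader == nullptr)
        return -1.0;

    juce::AudioBuffer<float> buffer ((int) reader->numChannels, (int) reader->lengthInSamples);
    reader->read (&buffer, 0, (int) reader->lengthInSamples, 0, true, true);

    for (int i = 0; i < buffer.getNumSamples(); ++i)
        if (std::abs (buffer.getSample (0, i)) > threshold)
            return 1000.0 * i / reader->sampleRate;

    return -1.0;
}

// latency = (recording start time + findFirstTickMs (file)) - logged transmission time
```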
Thank you for your advice. I think I may have explained the situation a bit poorly. I can’t measure the latency of the headphones/speakers myself, because it isn’t a one-off measurement: it will be whatever headphones/speakers the user has connected, and it isn’t practical to ask users to run such a test either. If I could learn the value without testing, I would add a delay to the sound equal to the latency.
My goal is:
X = Bluetooth latency
Delay Time = X
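For completeness, once X were known, applying it would be the easy part. A minimal sketch with juce::dsp::DelayLine, where bluetoothLatencyMs stands in for the value being sought:

```cpp
#include <juce_dsp/juce_dsp.h>

// Delays a signal by the (hypothetical) measured Bluetooth latency so that
// it lines up with what comes out of the speaker.
struct LatencyCompensator
{
    juce::dsp::DelayLine<float> delay;

    void prepare (double sampleRate, int blockSize, int numChannels, double bluetoothLatencyMs)
    {
        delay.prepare ({ sampleRate, (juce::uint32) blockSize, (juce::uint32) numChannels });
        delay.setMaximumDelayInSamples ((int) sampleRate);  // allow up to 1 s
        delay.setDelay ((float) (bluetoothLatencyMs * 0.001 * sampleRate));
    }

    void process (juce::AudioBuffer<float>& buffer)
    {
        juce::dsp::AudioBlock<float> block (buffer);
        delay.process (juce::dsp::ProcessContextReplacing<float> (block));
    }
};
```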