Performance of multichannel playback


Hi everybody,

for my music app I have multichannel files (usually ~60 channels) that are mixed down to stereo during playback. This works fine on desktop machines (a mid-2015 MacBook sits at about 15% CPU, a Windows PC at about 10%; debug builds roughly double those numbers), but on Android I have a performance problem.
The audio file is read on a background thread (BufferingAudioSource), so the audio thread only does the mixing.
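For anyone following along, the setup described above looks roughly like this. This is only a sketch against the JUCE 5 API (the variable names and the 2-second buffer length are my own choices, and `reader` is assumed to be a valid `AudioFormatReader*`):

```cpp
// Background TimeSliceThread does the disk reads and decoding;
// the audio callback only pulls already-buffered samples.
TimeSliceThread backgroundThread { "audio file reader" };
backgroundThread.startThread();

auto* readerSource = new AudioFormatReaderSource (reader, true);

// Buffer ~2 seconds ahead of the playhead on the background thread.
BufferingAudioSource bufferedSource (readerSource, backgroundThread,
                                     true,                        // take ownership
                                     2 * (int) reader->sampleRate,
                                     (int) reader->numChannels);
```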

My Android phone (Galaxy Note 3 Neo, aka SM-N7505) can play it back from uncompressed WAV, but stutters when playing from Ogg. The same problem occurs if I use 30 stereo stems buffered from four background threads.

Now the question: does anybody know a file format that would work better? I saw @SKoT’s BDFlac, but he said it optimises for decaying material in particular. We could do something similar, e.g. skip a third of the channels that are currently not being played, but that would mean implementing a whole new audio format…

Any suggestions?



Have you looked into optimised Ogg decoders for ARM? From past game development I know there are some libs that run much faster on ARM than the standard libvorbis, for instance “Tremolo”:

I’m not sure how that could be hooked into JUCE, but as far as I remember the performance is much better than the regular library.


Thank you!
It is a bit dated (last sign of activity was 2009), but I will have a look to see if I can get it compiled and whether it serves my purpose.

Still collecting options…



You might be able to use my lossless audio compression algorithm I wrote a few months ago.

I didn’t test it on Android, but it runs fine on iOS, so it should not be too hard to adapt.


Cool, thank you @chrisboy2000.
I remembered your post, but I wrongly assumed you had teamed up with SKoT, because you referenced his paper.

I will have a look and let you know if I end up using it.
Btw, while browsing to see if it does multichannel, I stumbled over a typo:

But that’s just an assert, so it doesn’t affect functionality…



Also @daniel, try doing more processing on the audio thread if you can. The audio thread on Android has a much higher priority than other threads. Alternatively, you can also use the new realtimeAudioPriority in juce_Thread.h:176.
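As a sketch of what that looks like (assuming JUCE 5; `RenderThread` and its contents are made up for illustration):

```cpp
// An extra render thread you spawn yourself can request the same
// realtime priority via the Thread::realtimeAudioPriority constant
// declared in juce_Thread.h.
struct RenderThread  : public Thread
{
    RenderThread() : Thread ("offline mixer") {}
    void run() override  { /* decode / premix here */ }
};

RenderThread renderThread;
renderThread.startThread (Thread::realtimeAudioPriority);
```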


Wow, this is a useless assertion indeed :slight_smile:

FYI, it only supports stereo files, but you can easily divide your stream into 30 stereo files and read them simultaneously.


That is how I worked before. It works OK, but:
a) it eats 50% CPU, and since it is meant as background music for games I have to share; I cannot claim all the resources for myself. And
b) I buffer on background threads (with 25 stems I used four of them), so if they get stuck, everything gets out of sync…

Did you benchmark it against JUCE’s Ogg reader? If it is much faster, it is still worth a try…


Thanks, that’s very interesting. Currently the background threads only do the reading into BufferingAudioSources (either multiple stems, or one gigantic 60-channel file). I’ll have a go with the priorities. But I have to save resources.
I even thought of decoding into 16-bit integer MemoryMappedFiles and mixing before converting to float. But that only makes sense if I can use SIMD operations for it… The audio files are looping, 40–60 seconds long… In float that’s ~660 MB, integer would still be 330 MB… doesn’t sound like an option…


I would recommend using only one background thread for the streaming. Also, whether you have one file with 60 channels or 30 stereo files should not make a difference performance-wise; the codec will most likely be the bottleneck.

I didn’t make any comparison to Ogg, but my codec is about 8–10x faster than FLAC (although it doesn’t compress as well). The performance overhead compared to PCM is actually negligible (I can decompress 100+ voices on older iOS devices without issues).


Hi Fabian,

Could you please explain a bit further how to force this realtimeAudioPriority on the audio thread? I tried to debug the juce_createRealtimeAudioThread function, but it’s never called in my app, even though the OpenSLAudioIODevice is properly loaded. It’s also never called in the JuceDemo.

I’m currently testing on a Samsung A5 device, running Android 6.0.1.




Hi Mariano,

The audio thread will always run at realtime audio priority as long as you use the default sample rate and buffer size. The realtimeAudioPriority is only intended for extra audio render threads that you create via the Thread class.

In fact, the only way on Android to create such a high-priority thread is to open a dummy audio stream, which is exactly what juce_createRealtimeAudioThread does.

Does that help?


I see… I focused on this because the latest version of the app I’m working on produces serious glitches on a fluctuating basis, as if the audio thread were not at the highest priority. My code is exactly the same; the only thing that changed is that the last version used JUCE 4.3 and this one is on JUCE 5.2. Is there any other change you can think of that would have an impact on this?

The strange thing is that when I disconnected and then reconnected the (mini-jack) headphones, I saw the following message from the Android media code:

W/AudioTrack: dead IAudioTrack, PCM, creating a new one from …

Then playback was perfect. Could this help?




OK, several things come to mind:
1 ) Playback performance is always better on headphones than on the internal speaker, because Android adds extra DSP processing when using the internal speaker, and this also depends on the handset you are using. Are you still testing on the same handset as when you tested JUCE 4.3?
2 ) The audio thread will only run on the low-latency thread if you are using the default sample rate and buffer size. Check this debug message to see if the chosen sample rate/buffer size matches the native ones.
3 ) As the JUCE audio thread runs at high priority, maybe you have some sort of priority inversion. For example, you should definitely not do anything that calls into Java (such as posting messages) on the audio thread. To check this, try commenting out this line. Without it, any call into Java will crash the app, and you can then look at the stack trace to see which call tried to invoke a Java method.


Thanks for the quick, extensive reply:

  1. Yes, I’m just using standard mini-jack headphones, but the problem happens with or without them, while the previous app version works perfectly.

  2. I commented out that line and the app ran without crashing, so I don’t think I’m making any call into Java on the audio thread.

  3. I do have a discrepancy in the reported buffer size. Here is the log I get:

JUCE: JUCE v5.2.0
JUCE: Audio device - min buffers: 15392, 7680; 48000 Hz; input chans: 2
JUCE: OpenSL: numInputChannels = 0, numOutputChannels = 2, nativeBufferSize = 240, nativeSampleRate = 48000, actualBufferSize = 1440, audioBuffersToEnqueue = 8, sampleRate = 48000, supportsFloatingPoint = true

My device initialization is the following:

deviceManager.initialise(0, 2, 0, true, String::empty, 0);

Hope this helps.



Oh wow, if I force the buffer size to 240, the sound is perfect! I would never have thought that reducing the buffer size would remove the glitches; I would have kept increasing it instead.

How can we force the buffer to the native size by default?
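(In case it helps others: the way I forced it was via the device manager setup, roughly like this. A sketch against the JUCE 5 API; 240 is the nativeBufferSize from the OpenSL log above, not a general value.)

```cpp
// Ask the device manager for the current setup, then pin the buffer
// size to the native size reported in the OpenSL debug log.
AudioDeviceManager::AudioDeviceSetup setup;
deviceManager.getAudioDeviceSetup (setup);
setup.bufferSize = 240;   // nativeBufferSize from the log
deviceManager.setAudioDeviceSetup (setup, true);
```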


Which phone are you trying this on?


Samsung Galaxy A5


Yes, I suspected as much. This is a problem with all Samsung phones. Unfortunately, Samsung phones “lie” about their support for realtime audio priority (“Pro Audio”) and small buffer sizes (“Low Latency Audio”), so JUCE does not use the realtime audio priority on Samsung phones.

You can override this yourself by changing the following line of code to always return true. Note, however, that this change may decrease performance on low-end Samsung phones which genuinely do not support realtime audio priority.

Rumour has it that they do not want to return true here because the Pro Audio badge is a Google invention and competes with Samsung’s SAPA framework (which JUCE does not support). Not sure if that is true, though.


Yes, that did the trick.

Is there any way to play it safer by setting a larger I/O buffer while still running at realtime priority? I may have many users still on old Samsung phones, and in this app sound quality is much more important than low latency.