Low latency audio / static mode instead of stream mode

Hi Jules,

Is there a reason why you chose stream mode in AudioTrack rather than static mode? It seems static mode would give lower latency, which would probably make it more suitable for JUCE use cases.
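
To make the question concrete, here is roughly what the two modes look like in the plain Java AudioTrack API (just a sketch with illustrative values, not the actual JUCE code):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    class AudioTrackModesExample
    {
        static void example()
        {
            // Stream mode: the track's internal buffer is a FIFO that you keep
            // refilling with write() while it plays.
            int minBytes = AudioTrack.getMinBufferSize (44100,
                                                        AudioFormat.CHANNEL_OUT_STEREO,
                                                        AudioFormat.ENCODING_PCM_16BIT);

            AudioTrack streamTrack = new AudioTrack (AudioManager.STREAM_MUSIC, 44100,
                                                     AudioFormat.CHANNEL_OUT_STEREO,
                                                     AudioFormat.ENCODING_PCM_16BIT,
                                                     minBytes, AudioTrack.MODE_STREAM);

            // Static mode: the whole sound must be written up front, then played;
            // suited to short, fixed samples rather than continuously generated audio.
            short[] wholeSound = new short[44100 * 2];   // one second of stereo silence
            AudioTrack staticTrack = new AudioTrack (AudioManager.STREAM_MUSIC, 44100,
                                                     AudioFormat.CHANNEL_OUT_STEREO,
                                                     AudioFormat.ENCODING_PCM_16BIT,
                                                     wholeSound.length * 2, AudioTrack.MODE_STATIC);
            staticTrack.write (wholeSound, 0, wholeSound.length);
            staticTrack.play();
        }
    }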

Thanks,

Hmm, after looking deeper: static mode is not really suitable for real-time rendered sounds, since the whole buffer has to be written before playback starts. Moreover, it looks like it has some issues:

http://mindtherobot.com/blog/555/android-audio-problems-hidden-limitations-and-opensl-es/
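
For what it's worth, this is the kind of real-time rendering loop I have in mind, which only MODE_STREAM really supports, because each block is generated just before it is handed to the track (a plain-Java sketch with illustrative values, not the JUCE code):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    class StreamingSynthExample
    {
        // Renders and plays a sine tone block-by-block in stream mode.
        static void playTone (double frequencyHz, double seconds)
        {
            final int sampleRate = 44100;
            final int minBytes = AudioTrack.getMinBufferSize (sampleRate,
                                                              AudioFormat.CHANNEL_OUT_STEREO,
                                                              AudioFormat.ENCODING_PCM_16BIT);

            AudioTrack track = new AudioTrack (AudioManager.STREAM_MUSIC, sampleRate,
                                               AudioFormat.CHANNEL_OUT_STEREO,
                                               AudioFormat.ENCODING_PCM_16BIT,
                                               minBytes, AudioTrack.MODE_STREAM);
            track.play();

            final int blockFrames = 512;
            short[] block = new short[blockFrames * 2];   // interleaved stereo, 16-bit
            double phase = 0.0;
            double phaseDelta = 2.0 * Math.PI * frequencyHz / sampleRate;

            for (long framesLeft = (long) (seconds * sampleRate); framesLeft > 0; framesLeft -= blockFrames)
            {
                // Generate the next block of audio just before it is needed.
                for (int i = 0; i < blockFrames; ++i)
                {
                    short s = (short) (Math.sin (phase) * 32767.0);
                    block[i * 2]     = s;
                    block[i * 2 + 1] = s;
                    phase += phaseDelta;
                }

                track.write (block, 0, block.length);     // blocks until the FIFO has room
            }

            track.stop();
            track.release();
        }
    }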

Still, I was wondering whether there isn’t an issue in the following code:

outputDevice = GlobalRef (env->NewObject (AudioTrack, AudioTrack.constructor,
                                          STREAM_MUSIC, sampleRate, CHANNEL_OUT_STEREO, ENCODING_PCM_16BIT,
                                          (jint) (actualBufferSize * numDeviceOutputChannels * sizeof (float)),
                                          MODE_STREAM));

Shouldn’t it be sizeof (uint16) instead of sizeof (float), given that the format is ENCODING_PCM_16BIT?

The same applies to the input device opening code.
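
Just to spell out what I mean (a plain-Java sketch with made-up values, not the actual JUCE code): with ENCODING_PCM_16BIT each sample is 2 bytes, so I would expect the byte count to be computed like this, and checked against what the Java API reports as the minimum:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.AudioTrack;

    class BufferSizeCheck
    {
        static void example()
        {
            // Illustrative values standing in for the JUCE variables of the same names.
            int sampleRate = 44100;
            int actualBufferSize = 1024;             // frames per block
            int numDeviceOutputChannels = 2;
            int numDeviceInputChannels = 2;

            // ENCODING_PCM_16BIT means 2 bytes per sample (a short), not 4 (a float).
            int bytesPerSample = 2;
            int outputBufferBytes = actualBufferSize * numDeviceOutputChannels * bytesPerSample;

            // The API also reports the smallest buffers the device will accept,
            // which are worth clamping against:
            int minOutputBytes = AudioTrack.getMinBufferSize (sampleRate,
                                                              AudioFormat.CHANNEL_OUT_STEREO,
                                                              AudioFormat.ENCODING_PCM_16BIT);

            int minInputBytes = AudioRecord.getMinBufferSize (sampleRate,
                                                              AudioFormat.CHANNEL_IN_STEREO,
                                                              AudioFormat.ENCODING_PCM_16BIT);

            outputBufferBytes = Math.max (outputBufferBytes, minOutputBytes);
            int inputBufferBytes = Math.max (actualBufferSize * numDeviceInputChannels * bytesPerSample,
                                             minInputBytes);
        }
    }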

I was actually planning to do a version that uses OpenAL where possible, which should be much better than any Java-based implementation. The AudioTrack code is lowest-common-denominator stuff that should work on any machine, which is why I did that first.

From what I’ve read, OpenAL/OpenSL is built on top of AudioTrack, so it may not be very interesting:

http://groups.google.com/group/android-ndk/browse_thread/thread/49416a16ad80a2a5
http://music.columbia.edu/pipermail/andraudio/2011-June/000220.html

By the way, what about the sizeof issue?

Thanks,

Someone else told me they’d been doing some tests, and OpenAL gave far better performance… So I guess some devices don’t have a native OpenAL layer, and the cross-platform AudioTrack-based code could be used as a fallback on those?

Sorry, I missed the bit about the sizeof… Yes, I think you’re right there, thanks!

Hmm, interesting.

Do you know which device the guy was using?

Thanks,

Sorry, not sure what the device was.