Juce Play Audio From Resource


I am having difficulty playing a camera shutter sound from my JUCE application’s BinaryData resources, something like the cello.wav example in the JuceDemo AudioDemoSynthPage.

I can open an audio reader like this:

[code]WavAudioFormat wavFormat;
ScopedPointer<AudioFormatReader> audioReader (wavFormat.createReaderFor (
    new MemoryInputStream (BinaryData::cello_wav, BinaryData::cello_wavSize, false), true));[/code]
But, I cannot figure out how to play it.
There are examples using the synthesizer or opening files from the filesystem, but I cannot put it all together from the resource file as in “cello_wav”.
Does anyone have an example?
Thank you.


I partially answered my own question.
I currently have an implementation that behaves as follows on the various platforms:

[list]
  • Mac: Success. Sound plays.
  • iOS: Works, but requires the modification noted in the code comments below. Sound plays.
  • Android OS 4.1: Works for the first audio playback, but crashes the app on the second.
    (Difficult to debug the JNI output, but it appears to be: “JUCE Assertion failure in juce_File.cpp:149”.)
  • Windows 8: Fails to play sound, but otherwise runs with no apparent errors.
  • Linux: Not tested yet.
[/list]

I am using JUCE v2.0.40.

Please look at this code and point out the mistakes that keep it from working on Android and Windows.

Class variables:

    // Audio
    AudioFormatManager *audioFormatManager;
    AudioFormatReader *audioFormatReader;
    ScopedPointer<AudioFormatReaderSource> audioFormatReaderSource;
    AudioTransportSource *audioTransportSource;
    AudioSourcePlayer *audioSourcePlayer;
    AudioDeviceManager *audioDeviceManager;

Call this init function from the Component constructor:

[code]void MyComponent::audioInit()
{
    audioFormatManager = new AudioFormatManager();

    audioFormatReader = nullptr;    // was left uninitialised before
    audioTransportSource = nullptr; // created on demand in playCameraShutterSound()
    audioSourcePlayer = new AudioSourcePlayer();
    audioDeviceManager = nullptr;   // created on demand in playCameraShutterSound()
}[/code]

And, call this release function in the destructor:

[code]void MyComponent::audioRelease()
{
    delete audioDeviceManager;
    audioDeviceManager = nullptr;

    delete audioSourcePlayer;
    audioSourcePlayer = nullptr;

    delete audioTransportSource;
    audioTransportSource = nullptr;

    // The reader source owns the reader (it was created with
    // deleteReaderWhenThisIsDeleted == true), so releasing the
    // ScopedPointer frees both; do not delete audioFormatReader itself.
    audioFormatReaderSource = nullptr;
    audioFormatReader = nullptr;

    delete audioFormatManager;
    audioFormatManager = nullptr;
}[/code]


And finally, the player code:

[code]void MyComponent::playCameraShutterSound()
{
    WavAudioFormat wavFormat;
    MemoryInputStream* mis = new MemoryInputStream (BinaryData::cello_wav, BinaryData::cello_wavSize, false);
    audioFormatReader = wavFormat.createReaderFor (mis, true);
    audioFormatReaderSource = new AudioFormatReaderSource (audioFormatReader, true);

    // Note: Mac OS X allows reuse of the audioTransportSource object,
    // but iOS forces creating a new one every time the sound is played.
    delete audioTransportSource; // deleting nullptr is safe, so no check needed
    audioTransportSource = new AudioTransportSource();
    audioTransportSource->setSource (audioFormatReaderSource, 0, nullptr, 44100.0, 2);

    delete audioDeviceManager;
    audioDeviceManager = new AudioDeviceManager();
    audioDeviceManager->initialise (2, 2, nullptr, true);

    // These last steps were missing above: connect the chain
    // (transport -> player -> device) and start playback.
    audioSourcePlayer->setSource (audioTransportSource);
    audioDeviceManager->addAudioCallback (audioSourcePlayer);
    audioTransportSource->start();
}[/code]


I'm struggling to play a sound as well. I think it would be good to add a facade class to JUCE that sets up the various manager/reader/player classes for the client, avoiding the need to code at this rather low level of abstraction.
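For what it's worth, such a facade is not hard to sketch. The class below is purely hypothetical (it is not part of JUCE v2.0.40, and the name ResourceSoundPlayer is invented); it bundles the device manager, source player, and transport source behind a single play() call for BinaryData resources:

```cpp
// Hypothetical facade: wraps the device/player/transport plumbing.
// All names here are invented for illustration; this is not a JUCE class.
class ResourceSoundPlayer
{
public:
    ResourceSoundPlayer()
    {
        formatManager.registerBasicFormats();            // WAV, AIFF, etc.
        deviceManager.initialise (0, 2, nullptr, true);  // output only
        sourcePlayer.setSource (&transportSource);
        deviceManager.addAudioCallback (&sourcePlayer);
    }

    ~ResourceSoundPlayer()
    {
        transportSource.setSource (nullptr);
        deviceManager.removeAudioCallback (&sourcePlayer);
    }

    // Plays an audio resource held in BinaryData,
    // e.g. play (BinaryData::cello_wav, BinaryData::cello_wavSize);
    void play (const void* data, size_t dataSize)
    {
        AudioFormatReader* reader = formatManager.createReaderFor (
            new MemoryInputStream (data, dataSize, false));

        if (reader != nullptr)
        {
            readerSource = new AudioFormatReaderSource (reader, true); // source owns reader
            transportSource.setSource (readerSource, 0, nullptr, reader->sampleRate);
            transportSource.setPosition (0);
            transportSource.start();
        }
    }

private:
    AudioFormatManager formatManager;
    AudioDeviceManager deviceManager;
    AudioSourcePlayer sourcePlayer;
    AudioTransportSource transportSource;
    ScopedPointer<AudioFormatReaderSource> readerSource;
};
```

One client-facing object instead of six members, and the device manager and audio callback are set up exactly once rather than on every playback.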