Reading multitrack audio file


#1

Hi all, I’m a newbie to JUCE and I have an unresolved question:
is there a way to get the individual buffers (4) of a multitrack (4-track) WAV audio file? If I load such a file in the demo player project, bufferToFill.buffer->getNumChannels() in getNextAudioBlock is always 2, and if I try to read channel 2 (0-indexed), I get an error.
Is there a tutorial or some code I could learn from?
Thanks


#2

The audio is pulled, so the flow begins at the AudioIODevice, which calls the AudioIODeviceCallback (in this example probably an AudioSourcePlayer). If you opened your device with 2 outputs, it will call getNextAudioBlock() with a buffer containing two channels. The AudioFormatReaderSource will then try to accommodate this and discard the remaining channels.

You can change the number of channels in the pipeline by passing the audio through a ChannelRemappingAudioSource (see the sketch below).

TL;DR: open your device in the AudioDeviceManager with 4 channels, and you should see/hear all four channels.
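If the hardware only has two outputs, the remapping route could look roughly like this. This is only a sketch, assuming a member std::unique_ptr<ChannelRemappingAudioSource> channelMapper alongside the usual transportSource and audioSourcePlayer members from the tutorial player; the names are illustrative and it isn't tested:

// Route a 4-channel transport source to a stereo device by summing
// channels 0+2 into the left output and 1+3 into the right output.
// 'false' = the remapper does not take ownership of transportSource.
channelMapper.reset (new ChannelRemappingAudioSource (&transportSource, false));
channelMapper->setNumberOfChannelsToProduce (4);  // pull all 4 file channels from the source
channelMapper->setOutputChannelMapping (0, 0);    // source ch 0 -> output 0 (left)
channelMapper->setOutputChannelMapping (1, 1);    // source ch 1 -> output 1 (right)
channelMapper->setOutputChannelMapping (2, 0);    // source ch 2 summed into left
channelMapper->setOutputChannelMapping (3, 1);    // source ch 3 summed into right
audioSourcePlayer.setSource (channelMapper.get());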


#3

Hi daniel, thanks for the quick reply and sorry for the delay in answering.

I did some tests

OK, if I use an audio interface with 4 channels, in the getNextAudioBlock method I’m able to fill 4 different buffers with the content of the 4 channels of my audio file.
I do some math on the buffers and, with the results, I fill the buffers of channel 0 and channel 1, so I can listen to my work on the left and right output channels. I’m not interested in ch 3 and ch 4.

If I use a sound card with only 2 channels, my app crashes.
So I set up an AudioDeviceManager:

deviceManager->initialise (4, 4, nullptr, true, String::empty, nullptr);

deviceManager->addAudioCallback(&audioSourcePlayer);

Now the app doesn’t crash, but the getNextAudioBlock method isn’t called.
My player setup is as follows:

auto* reader = formatManager.createReaderFor (file);

std::unique_ptr<AudioFormatReaderSource> newSource (new AudioFormatReaderSource (reader, true));
                
transportSource.setSource (newSource.get(), 0, nullptr, reader->sampleRate, 4);

readerSource.reset (newSource.release());                                                        

audioSourcePlayer.setSource (&transportSource);

What am I doing wrong?
What is the right direction to follow?

Thanks again, and sorry for my bad English.


#4

Use your own AudioBuffer(s) for reading and processing the file, not the buffer handed to you by getNextAudioBlock. Then you can read and process as many channels as you like, independently of how many channels the audio interface hardware has.
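Roughly what that might look like inside getNextAudioBlock, assuming a member AudioBuffer<float> fileBuffer that already holds the 4 file channels and an int readPosition playback counter (names are only illustrative, not a drop-in implementation):

// Mix the 4 pre-read file channels down into whatever the device hands us.
void getNextAudioBlock (const AudioSourceChannelInfo& bufferToFill) override
{
    bufferToFill.clearActiveBufferRegion();

    auto numSamples = jmin (bufferToFill.numSamples,
                            fileBuffer.getNumSamples() - readPosition);
    if (numSamples <= 0)
        return;

    auto numOuts = bufferToFill.buffer->getNumChannels();    // 2 on a stereo interface

    for (int fileCh = 0; fileCh < fileBuffer.getNumChannels(); ++fileCh)
    {
        auto outCh = fileCh % numOuts;                        // ch 0,2 -> left, ch 1,3 -> right
        bufferToFill.buffer->addFrom (outCh, bufferToFill.startSample,
                                      fileBuffer, fileCh, readPosition,
                                      numSamples, 0.5f);      // 0.5 gain to avoid clipping when summing
    }

    readPosition += numSamples;
}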


#5

@daniel in the ‘audioDeviceIOCallback’ the number of in/out channels reflects the number of channels of the audio interface (not the audio file), even if I open a 4-channel audio device (as in the code above).

@Xenakios in which method can I get the audio buffers?


#6

Just declare a member or members like :

 AudioBuffer<float> diskReadBuffer;

in your audio processing class. Then set a suitably large number of channels and size, for example in the prepareToPlay method (see the sketch below). There may be other details to think about when using the JUCE helper classes that involve the AudioSources and AudioSourcePlayer, though… By the way, why are you using AudioSourcePlayer if you have implemented the getNextAudioBlock method yourself anyway?
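For example, a sketch of loading the whole file into that member buffer in prepareToPlay, assuming a reader member created with the AudioFormatManager (for long files you would read smaller chunks instead, ideally on a background thread):

void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
{
    if (reader != nullptr)
    {
        // One channel per file channel, long enough for the whole file.
        diskReadBuffer.setSize ((int) reader->numChannels, (int) reader->lengthInSamples);
        reader->read (&diskReadBuffer, 0, (int) reader->lengthInSamples, 0, true, true);
    }
}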


#7

Yes, that’s what I tried to explain above.


#8

That is more than likely to simply fail if the hardware doesn’t actually have 4 channels available. The poster’s use case is that he has a 4-channel file that somehow needs to be mapped so it can be played through 2 hardware channels. (Which can obviously be done in a multitude of ways, but I think all of them will involve an additional helper AudioBuffer in the client code.)


#9

I see, in that case the ChannelRemappingAudioSource is an option.

However, especially after looking into the sources of ChannelRemappingAudioSource, I think a bespoke down-mix audio source is beneficial, also to have more flexibility in how the summed channels are mixed.

And I agree, the OP will need to allocate a buffer to read all the channels from the source in one go.
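A bespoke down-mix source could look roughly like this. It is only a sketch with made-up names, not tested:

// Pulls all channels from a wrapped source into its own buffer,
// then mixes them into however many output channels the callback provides.
class DownMixAudioSource  : public AudioSource
{
public:
    DownMixAudioSource (AudioSource& sourceToUse, int numSourceChannels)
        : source (sourceToUse), numInputChannels (numSourceChannels) {}

    void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
    {
        tempBuffer.setSize (numInputChannels, samplesPerBlockExpected);
        source.prepareToPlay (samplesPerBlockExpected, sampleRate);
    }

    void releaseResources() override    { source.releaseResources(); }

    void getNextAudioBlock (const AudioSourceChannelInfo& info) override
    {
        tempBuffer.setSize (numInputChannels, info.numSamples, false, false, true);

        AudioSourceChannelInfo tempInfo (&tempBuffer, 0, info.numSamples);
        source.getNextAudioBlock (tempInfo);          // pull all source channels

        info.clearActiveBufferRegion();
        auto numOuts = info.buffer->getNumChannels();

        for (int in = 0; in < numInputChannels; ++in)
            info.buffer->addFrom (in % numOuts, info.startSample,
                                  tempBuffer, in, 0, info.numSamples,
                                  1.0f / (float) numInputChannels);   // simple equal-gain mix
    }

private:
    AudioSource& source;
    int numInputChannels;
    AudioBuffer<float> tempBuffer;
};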