I’m in the process of converting my app from Core Audio to JUCE and I’d like some advice to get started.
- Output routing
My app currently has a rather sophisticated routing matrix which lets you create multichannel output busses and assign them to the currently selected audio device’s outputs, similar to how Pro Tools does it. Each output bus is currently implemented by a class that internally holds an AUGraph with this setup:
Output Unit <-- Mixer Unit <-- Audio Source
The mixer unit is there to add multiple audio sources to the same output bus.
Each audio output bus has a channel map and therefore only receives callbacks for the channels it actually uses.
Is a similar behavior doable with JUCE? Is there any overhead involved if I just use the standard AudioIODeviceCallback with all the channels the device provides and simply discard the channels I don’t need? I’m not sure whether this is effectively what the AUGraph does internally when a channel map is set.
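To make the question concrete, here’s roughly what I have in mind: a bus that owns a channel map and writes its audio only into the mapped device channels, zeroing the rest. It’s a plain C++ sketch so it compiles standalone; the signature just mirrors JUCE’s audioDeviceIOCallback, and the ChannelMappedBus name and the render callback are only illustrative.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of an output bus with a channel map.
// busCh -> channelMap[busCh] gives the device output channel to write to.
struct ChannelMappedBus
{
    std::vector<int> channelMap; // bus channel index -> device output channel

    // 'render' stands in for pulling a block of audio from the mixer/source
    // chain for one bus channel. Signature mirrors the JUCE device callback.
    template <typename RenderFn>
    void audioDeviceIOCallback (float** outputChannelData, int numOutputChannels,
                                int numSamples, RenderFn render) const
    {
        // Zero every device channel first, then render only the mapped ones,
        // i.e. "use all channels, discard the unused" done by hand.
        for (int ch = 0; ch < numOutputChannels; ++ch)
            for (int i = 0; i < numSamples; ++i)
                outputChannelData[ch][i] = 0.0f;

        for (std::size_t busCh = 0; busCh < channelMap.size(); ++busCh)
        {
            const int deviceCh = channelMap[busCh];
            if (deviceCh >= 0 && deviceCh < numOutputChannels)
                render ((int) busCh, outputChannelData[deviceCh], numSamples);
        }
    }
};
```

The zeroing loop is the part I’m unsure about performance-wise compared to the AUGraph channel map.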
- Is it common to start/stop the audio device when no audio is needed, or should I just leave the device running and pass zeroed audio buffers? I know that starting up the device can take some time, so I keep wondering whether I should simply keep the output loop running.
Thanks and best,