JUCE Signal Path / Structure

I recently browsed through the Tracktion Engine and wondered whether there is a 'general' signal path / structure when using JUCE. I've had a look through various previous discussions and couldn't find a definitive answer. Perhaps there isn't one.

I found this thread from several years ago by @kraken, which describes a signal path:
(AudioSources and playing wavefiles at specific offset)

Is this the general path most developers use?

Say one was designing a small audio mixer application ->

  • In general, an audio track will have multiple audio clips in sequence… how do you line up each clip's buffer?

  • Is an AudioFormatReaderSource generally used per track? Or does each track consist of several, each reading one audio clip and passing its AudioSampleBuffer to
    `void MainComponent::getNextAudioBlock (const AudioSourceChannelInfo& bufferToFill)`?

  • Assume each audio file is wrapped via an AudioFormatManager and an AudioFormatReaderSource (e.g. in an AudioFile class).

  • Does anyone use a 'master' AudioFormatReaderSource that reads ahead of the playback head?

  • Should there only be ONE AudioTransportSource for the application?
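On the clip-alignment question above, one common approach (a sketch, not *the* JUCE way) is for each track's `getNextAudioBlock()` to work out which clips intersect the requested block and mix them in at the right offsets. The arithmetic is shown below in plain, framework-free C++ with mono `std::vector<float>` buffers standing in for an AudioSampleBuffer; `Clip` and `renderTrackBlock` are illustrative names, not JUCE classes:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// A clip's audio plus its position on the track timeline (all in samples).
struct Clip
{
    int64_t startOnTimeline;     // where the clip begins on the track
    std::vector<float> samples;  // mono audio, for simplicity
};

// Mix every clip that overlaps the block [playhead, playhead + out.size())
// into 'out' at the correct offset. This is the alignment a track would
// perform before summing into the AudioSourceChannelInfo buffer.
void renderTrackBlock (const std::vector<Clip>& clips,
                       int64_t playhead,
                       std::vector<float>& out)
{
    std::fill (out.begin(), out.end(), 0.0f);
    const int64_t blockEnd = playhead + (int64_t) out.size();

    for (const auto& clip : clips)
    {
        const int64_t clipEnd = clip.startOnTimeline + (int64_t) clip.samples.size();

        // Intersection of the clip with this block, in timeline samples.
        const int64_t from = std::max (playhead, clip.startOnTimeline);
        const int64_t to   = std::min (blockEnd, clipEnd);

        if (from >= to)
            continue;  // clip doesn't sound during this block

        const int64_t destOffset = from - playhead;             // index into 'out'
        const int64_t srcOffset  = from - clip.startOnTimeline; // index into the clip

        for (int64_t i = 0; i < to - from; ++i)
            out[(size_t) (destOffset + i)] += clip.samples[(size_t) (srcOffset + i)];
    }
}
```

With this layout, a single AudioTransportSource can drive playback while each track only needs the per-block intersection arithmetic; whether each clip holds its own reader source or the track reads ahead with one reader is an orthogonal choice.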

Just wondering how others have structured their applications. Any insight would be appreciated :slight_smile:


Anyone?