How does the audio plugin host example actually tick

I’ve been poking around the audio plugin host example to see what it’s made of, out of interest in building a simple DAW for personal use and for testing user interface ideas. I’m now at the point where I’d like to add a rendering feature for the graph that works in a non-realtime fashion. But here’s the thing: I haven’t been able to figure out where or what actually runs the whole system. When nodes are added to the graph, it doesn’t even matter whether they are connected to any other node or not; processBlock is always called for each and every node on the graph “per frame”.

So my question is: how can I stop this? I’d like to have a play/pause feature for this graph, so that I could create my own thread for offline rendering that would call processBlock on the nodes as fast as they can process the audio, the way it was suggested in these other threads: Offline Rendering? and Offline Rendering

I’m assuming that my to-be rendering thread could not operate on the graph at the same time as the example is running it in realtime. And in any case, a play/pause/stop feature is pretty much a must-have when moving forward :slight_smile:

As I understand it, it’s AudioIODeviceCallback::audioDeviceIOCallback that is called by the audio driver to fill an AudioBuffer with sample data.
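For reference, here is a sketch of the relevant part of that interface, as it looks in the JUCE versions this example targets (trimmed down; treat it as an outline, not the verbatim header):

```cpp
// Sketch of juce::AudioIODeviceCallback (pre-JUCE-6 signature)
class AudioIODeviceCallback
{
public:
    virtual ~AudioIODeviceCallback() = default;

    // Called on the audio thread by the device driver, once per hardware block.
    // Whatever the callback writes into outputChannelData is what gets played.
    virtual void audioDeviceIOCallback (const float** inputChannelData,
                                        int numInputChannels,
                                        float** outputChannelData,
                                        int numOutputChannels,
                                        int numSamples) = 0;

    virtual void audioDeviceAboutToStart (AudioIODevice* device) = 0;
    virtual void audioDeviceStopped() = 0;
};
```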

If the inheritance diagram were available in the API docs (it is in Samuel’s alternative docs), you could see that either AudioProcessorPlayer or AudioSourcePlayer can act as that callback for the audio driver.
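A minimal sketch of that wiring, assuming an AudioProcessorGraph as the processor (variable names are illustrative):

```cpp
AudioDeviceManager deviceManager;
AudioProcessorGraph graph;
AudioProcessorPlayer graphPlayer;

deviceManager.initialiseWithDefaultDevices (2, 2); // 2 inputs, 2 outputs

graphPlayer.setProcessor (&graph);             // the player pulls blocks from the graph
deviceManager.addAudioCallback (&graphPlayer); // the driver pushes blocks through the player
```

From that point on, the device’s audio thread calls the player, which calls the graph’s processBlock, which in turn calls processBlock on every node.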

The problem is that if you have connected it to an AudioSource, each call to getNextAudioBlock advances the playback position, so you cannot call it from two separate threads without creating a mess.

In this case, be sure not to use any BufferingAudioSource, because it will deliver silence if it didn’t get the chance to buffer new data while the last block was being played back.

I once suggested a wait method, which I call like this:

```cpp
if (isNonRealtime())
    waitForNextAudioBlockReady (info);

getNextAudioBlock (info);
```

That way I can use the same sources for realtime and offline rendering, and spread the loading across different cores via TimeSliceThreads.

Here is a sketch of the wait method (member names like bufferValidStart, bufferValidEnd and bufferReadyEvent follow the internals of BufferingAudioSource; treat this as an outline rather than exact code):
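```cpp
// Block until the buffered range covers the next read, or until the timeout.
// Assumes a BufferingAudioSource-style class whose background TimeSliceThread
// updates bufferValidStart/bufferValidEnd under bufferRangeLock and signals
// bufferReadyEvent after each fill. All member names are illustrative.
bool waitForNextAudioBlockReady (const AudioSourceChannelInfo& info,
                                 uint32 timeoutMs = 500)
{
    const uint32 deadline = Time::getMillisecondCounter() + timeoutMs;

    for (;;)
    {
        {
            const ScopedLock sl (bufferRangeLock);

            // Is [nextPlayPos, nextPlayPos + numSamples) fully buffered?
            if (nextPlayPos >= bufferValidStart
                 && nextPlayPos + info.numSamples <= bufferValidEnd)
                return true;
        }

        const uint32 now = Time::getMillisecondCounter();

        if (now >= deadline)
            return false;  // gave up: the caller will get silence for this block

        bufferReadyEvent.wait ((int) (deadline - now)); // woken by the fill thread
    }
}
```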

Hmm, I’m a bit confused by your answer, as I think it goes a bit further than what I’m currently trying to figure out. Eventually I do want to build the offline rendering pipeline, but for now I’m mainly wondering what it is in the plugin host example that runs the whole show; i.e. which lines in the source code make this engine run in a realtime fashion?

As I understand it, it’s AudioIODeviceCallback::audioDeviceIOCallback that is called by the audio driver to fill an AudioBuffer with sample data.

However, I have a configuration in my graph in which there are no AudioIODevices:

Both of those nodes are classes I have built into the plugin host example, derived from the AudioProcessor class. The most important methods I implemented for both of them are of course prepareToPlay and processBlock, and their details are this simple:

```cpp
void FileWriter::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    if (filePath != "")
    {
        File file (filePath);

        if (file.existsAsFile())
            file.deleteFile();

        // NB: createWriterFor() takes ownership of the stream if it succeeds
        outputStream = new FileOutputStream (file);
        fileWriter = wavAudioFormat.createWriterFor (outputStream, 44100, 2, 24,
                                                     wavAudioFormat.createBWAVMetadata ("", "", "", Time::getCurrentTime(), 0, ""),
                                                     0);
        if (fileWriter != nullptr)
            outputStream->flush();
    }

    node->properties.set ("file", filePath);
}

void FileWriter::processBlock (AudioSampleBuffer& buffer, MidiBuffer& midiMessages)
{
    const int numSamples = buffer.getNumSamples();

    if (fileWriter != nullptr)   // guard against a failed writer creation
    {
        fileWriter->writeFromAudioSampleBuffer (buffer, 0, numSamples);
        filePosition += numSamples;

        // flush roughly once per second of audio
        if (filePosition % 44100 < numSamples)
            outputStream->flush();
    }
}
```

These two methods get called by something in the plugin host example, but what exactly? And they get called in such a fashion that the file gets read, written and played back in realtime, not as fast as possible. I haven’t found a line of code where those methods get called, so it must be happening somewhere inside the JUCE classes and be inherent to their functioning, right? And if not, I’d gladly learn what or where it is that runs that thread, so I could turn it off and replace it with the kind of solution you suggested :slight_smile:

I’ll most probably need some clarification on that as well, but for now I’m stuck at the very beginning, in understanding how this plugin host example / JUCE does its thing of realtime rendering of the audio from any given graph; even ones that don’t contain any AudioIODevices or VST/AUs, just custom-made / built-in AudioProcessors.

I’ve tried calling setNonRealtime on each and every one of the nodes in the graph, as well as on the main AudioProcessorGraph that holds all the nodes in FilterGraph.cpp, but to no avail. After setting it, the nodes and the graph inform me that they’re in offline mode when I call the isNonRealtime function, but they still run in realtime as usual. I wasn’t expecting setNonRealtime/isNonRealtime to actually do the magic by themselves, without any user implementation of the actual rendering part; I’m just saying I’ve tried that too, out of curiosity / desperation :joy:

Hi zax,

If you look at the code in the constructor of MainHostWindow, you’ll see that we initialise an AudioDeviceManager and pass it to the constructor of GraphDocumentComponent. There we add an audio callback to the device manager, and that callback will be called repeatedly by the default audio device of your system.

…and the GraphDocumentComponent, in GraphEditorPanel.cpp at line 1110, hooks the graphPlayer up to the audio device:

```cpp
deviceManager->addAudioCallback (&graphPlayer);
deviceManager->addMidiInputCallback (String::empty, &graphPlayer.getMidiMessageCollector());
```
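That call is what makes everything tick: the device pulls blocks from the graphPlayer, which in turn runs processBlock on every node in the graph. So to pause, simply remove the callback again; while it is removed, nothing drives the graph, and a thread of your own can pump it offline. A rough sketch (not code from the example; blockSize, the channel count and numBlocksToRender are placeholders):

```cpp
// Pause: detach the player so the device stops driving the graph.
deviceManager->removeAudioCallback (&graphPlayer);

// Offline render: our own thread may now pump the graph as fast as it can.
const int blockSize = 512;
const int numBlocksToRender = 1000; // however much material you want to render

graph.setNonRealtime (true);
graph.prepareToPlay (44100.0, blockSize);

AudioSampleBuffer buffer (2, blockSize);
MidiBuffer midi;

for (int i = 0; i < numBlocksToRender; ++i)
{
    buffer.clear();
    midi.clear();
    graph.processBlock (buffer, midi); // e.g. a FileWriter node writes to disk here
}

graph.releaseResources();
graph.setNonRealtime (false);

// Resume realtime playback.
deviceManager->addAudioCallback (&graphPlayer);
```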

Good luck