AudioProcessorGraph used for rendering files

Hi,

I’m experimenting with using the AudioProcessorGraph to render files offline. That is, take one or more input files, process them through an AudioProcessorGraph, and then write the output to one or more output files.

My thought was to create an AudioProcessorGraph, load a filter graph with AudioProcessorGraph::loadFrom, and then call AudioProcessorGraph::processBlock while providing input samples from a buffer (read from the file).

{
    AudioProcessorGraph graph;
    
    graph.loadFrom (fileToOpen, true);

    graph.setPlayConfigDetails(2, 2, 44100, 512);
    graph.releaseResources();

    graph.setNonRealtime(true);     // set before prepareToPlay so the hosted plugins prepare for offline rendering
    graph.prepareToPlay(44100, 512);

    juce::AudioBuffer<float> audio_buffers;
    audio_buffers.setSize(2, 512);  // match the 2-in/2-out play config above
    juce::MidiBuffer midi_buffers;

    for (int32_t currentSample = 0; currentSample < 220000; currentSample += 512)
    {
        /*
         Read data into the audio_buffers buffer
         */

        // Call processBlock
        //
        graph.processBlock(audio_buffers, midi_buffers);

        /*
         Write data from the audio_buffers to a file
         */
    }
}

What I’m finding is that the input buffer I pass into AudioProcessorGraph::processBlock doesn’t get processed through to the plugins (I’m using the demo gain plugin to check this), and there’s no audio in my buffer after AudioProcessorGraph::processBlock returns.

The filter graph was created using the AudioPluginHost application and only contains a gain plugin.

I’ve seen posts about using a graph in a plugin and calling graph.processBlock directly. I would have thought this would work in my simple example.

Is there something required in the AudioProcessorGraph’s configuration to get it to process the data coming into processBlock?

Thanks
Bob

Did you check that the graph actually has the plugins instantiated and the connections set up between them? I don’t think you can directly use the AudioPluginHost project files as the state of the AudioProcessorGraph.
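
One quick way to check (just a debugging sketch; DBG output only appears in a debug build) is to dump what the graph contains right after loading it:

DBG ("nodes: " << graph.getNumNodes()
     << ", connections: " << (int) graph.getConnections().size());

for (auto* node : graph.getNodes())
    DBG (node->getProcessor()->getName());   // should list the gain plugin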

Yes, the plugins are getting instantiated via addNode, and the connections are being set up between them.

I’m using the AudioPluginHost project files as the state of the AudioProcessorGraph.

A question in my mind: if the graph state (filter graph) doesn’t have any inputs or outputs assigned, does calling AudioProcessorGraph::processBlock feed the buffer into the first plugin’s inputs directly, or are those inputs left “dangling”, or something like that?

The graph does need the special input and output nodes set up and connected into the actual processing nodes.
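
Something along these lines, as a minimal sketch assuming a recent JUCE version and stereo I/O (inputNode, outputNode, gainNode and gainPlugin are placeholder names for whatever your graph actually contains):

using IOProcessor = juce::AudioProcessorGraph::AudioGraphIOProcessor;

// The graph's own audio input and output nodes
auto inputNode  = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::audioInputNode));
auto outputNode = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::audioOutputNode));

// gainNode stands in for however you obtain the plugin node the graph already holds
auto gainNode = graph.addNode (std::move (gainPlugin));

// Wire input -> gain -> output for both channels, then prepareToPlay as before
for (int channel = 0; channel < 2; ++channel)
{
    graph.addConnection ({ { inputNode->nodeID, channel }, { gainNode->nodeID,   channel } });
    graph.addConnection ({ { gainNode->nodeID,  channel }, { outputNode->nodeID, channel } });
}

With those in place, the buffer you pass to processBlock is fed in through the audioInputNode, and whatever reaches the audioOutputNode is written back into that same buffer.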

Could you clarify this statement?

What do you mean by special input and output nodes?
Are these derived from a specific type?
Or are they just AudioProcessors with only outputs (for my source files) and inputs (for my destination files)?

Here are screenshots of 2 filter graphs…

[Screenshot: Screen Shot 2022-01-12 at 6.55.15 PM]

One shows a graph with just my Gain Plugin. The other shows a graph with my file player (reader) and recorder (writer).

Are you saying that you must have the one with the player and recorder for processing to work properly?

If you get the connections sorted, one more caveat:

If you process offline, make sure any pre-buffering sources run in blocking mode. Otherwise the graph pulls samples faster than the buffering audio source can supply them, and the source will hand back empty buffers or worse.
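
If the file player is built on something like juce::BufferingAudioSource, one way to do that is to wait for each block to be ready before pulling it. A rough sketch (bufferingSource and blockSize are assumed names; audio_buffers is the buffer from the render loop above):

juce::AudioSourceChannelInfo info (&audio_buffers, 0, blockSize);

// Block until the background thread has filled the buffer instead of
// racing ahead of it during the offline render.
if (bufferingSource.waitForNextAudioBlockReady (info, 500 /* ms timeout */))
    bufferingSource.getNextAudioBlock (info);
else
    audio_buffers.clear();    // timed out: clear rather than process stale data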
