Hmm, I’m a bit confused by your answer, as I think it goes a bit further than what I’m currently trying to figure out. Eventually I do want to build the offline rendering pipeline, but for now I’m mainly wondering what it is in the plugin host example that runs the whole show; i.e. which lines in the source code make this engine run in a realtime fashion?
As I understand it, it’s AudioIODeviceCallback::audioDeviceIOCallback that is called by the audio driver to fill an AudioBuffer with sample data.
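If I’ve read the docs right, the callback interface looks roughly like this (a sketch of my own understanding, not code from the example, and assuming the classic pre-JUCE-7 callback signature; I’m just outputting silence to show the shape):

```cpp
// Sketch only: an AudioIODeviceCallback that the driver pulls on.
class SilenceCallback : public AudioIODeviceCallback
{
public:
    void audioDeviceIOCallback (const float** inputChannelData, int numInputChannels,
                                float** outputChannelData, int numOutputChannels,
                                int numSamples) override
    {
        // The driver calls this whenever it needs the next buffer,
        // which is what paces the whole graph at realtime speed.
        for (int ch = 0; ch < numOutputChannels; ++ch)
            FloatVectorOperations::clear (outputChannelData[ch], numSamples);
    }

    void audioDeviceAboutToStart (AudioIODevice* device) override {}
    void audioDeviceStopped() override {}
};
```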
However, my graph is configured in such a way that there are no AudioIODevices:
Both of those nodes are classes built into the plugin host example that I have derived from the AudioProcessor class. The most important methods I implemented for both of them are, of course, prepareToPlay and processBlock, and their details are this simple:
```cpp
void FileWriter::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    if (filePath != "")
    {
        File file (filePath);

        if (file.existsAsFile())
            file.deleteFile();

        outputStream = new FileOutputStream (file);

        // On success, the writer takes ownership of the stream
        fileWriter = wavAudioFormat.createWriterFor (outputStream, 44100, 2, 24,
                                                     wavAudioFormat.createBWAVMetadata ("", "", "", Time::getCurrentTime(), 0, ""),
                                                     0);
        if (fileWriter != nullptr)
            outputStream->flush();
    }

    node->properties.set ("file", filePath);
}

void FileWriter::processBlock (AudioSampleBuffer& buffer, MidiBuffer& midiMessages)
{
    if (fileWriter == nullptr)   // guard against a failed writer creation
        return;

    const int numSamples = buffer.getNumSamples();

    fileWriter->writeFromAudioSampleBuffer (buffer, 0, numSamples);
    filePosition += numSamples;

    // Flush roughly once per second of audio
    if (filePosition % 44100 == 0)
        outputStream->flush();
}
```
These two methods get called by something in the plugin host example, but what exactly? And they get called in a fashion where the file gets read, written, and played in realtime, not as fast as possible. I haven’t found a line of code where those methods get called, so it must be something happening somewhere inside the JUCE classes that is inherent to their functioning, right? And if not, I’d gladly learn what or where the thing that runs that thread is, so I could turn it off and replace it with the kind of solution you suggested.
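For reference, this is the kind of wiring I’d expect to find somewhere in the example, sketched from the JUCE class docs (AudioProcessorPlayer and AudioDeviceManager are real JUCE classes, but whether the plugin host hooks them up exactly like this is precisely what I’m asking):

```cpp
// Sketch: how I imagine the graph gets driven by the device callback.
AudioDeviceManager deviceManager;
AudioProcessorGraph graph;
AudioProcessorPlayer graphPlayer;

deviceManager.initialiseWithDefaultDevices (2, 2);   // 2 inputs, 2 outputs
graphPlayer.setProcessor (&graph);                   // player forwards device callbacks to the graph's processBlock
deviceManager.addAudioCallback (&graphPlayer);       // driver now pulls blocks through the player

// If so, removing the callback should be the "off switch" for realtime pulling:
// deviceManager.removeAudioCallback (&graphPlayer);
```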
I’ll most probably need some clarification on that as well, but for now I’m stuck at the very beginning: understanding how this plugin host example / JUCE does its thing of realtime rendering of the audio from any given graph; even ones that contain no AudioIODevices or VSTs/AUs, just custom-made / built-in AudioProcessors.
I’ve tried calling setNonRealtime on each and every node in the graph, as well as on the main AudioProcessorGraph that holds all the nodes in FilterGraph.cpp, but to no avail. After setting them, the nodes and the graph report that they’re in offline mode when I call isNonRealtime, but they still run in realtime as usual. I wasn’t expecting setNonRealtime/isNonRealtime to actually do the magic on their own, without any user implementation of the actual rendering part; I’m just saying I tried that too, out of curiosity / desperation.