AudioProcessor::prepareToPlay() is called with sampleRate=0.0 and estimatedSamplesPerBlock=0

Hello,

I am implementing a custom AudioProcessor to use as part of a special audio channel. I need to set up a number of resources to make my processor capable of sourcing audio data, and I tried to use AudioProcessor::prepareToPlay() as the place to do that. However, I discovered that AudioProcessor::prepareToPlay() is always called with sampleRate=0.0 and estimatedSamplesPerBlock=0. I debugged the code a bit and found that the cause is in AudioProcessorGraph::AudioGraphIOProcessor::setParentGraph(). The following call is applied to the object that instantiates the audio channel in which I aggregate my own processor:


        setPlayConfigDetails (type == audioOutputNode ? graph->getMainBusNumOutputChannels() : 0,
                              type == audioInputNode  ? graph->getMainBusNumInputChannels()  : 0,
                              getSampleRate(),
                              getBlockSize());


Simply put, passing getSampleRate() and getBlockSize() here leaves the relevant values unchanged, which ultimately results in sampleRate=0.0 and estimatedSamplesPerBlock=0 being passed to my processor every time.


Is this a bug, or should I use a different way to get the properly reported sample rate and buffer size?

Thanks

Doesn't sound like a bug to me. If the sample rate/block size aren't known, then it's OK for things to set them to 0.

The problem is that the parent graph has in fact already reported non-zero values, e.g. SR=44100 and BS=1024, and I can never get these in my AudioProcessor. Is there any way to get the values before they are used in processBlock? Thanks


Sorry, really struggling to see what you mean. The graph must be reporting the correct sample rate to plugins, otherwise the demo host wouldn't be able to run plugins correctly (?)

Yes, I see things exactly the way you describe. However, in my case I have SR = 0 and BS = 0. What I do is:

I have an AudioProcessorGraph (let's call it "A"), which has a number of other AudioProcessorGraphs (let's call them "B") added as nodes. In each "B" graph I have a processor (let's call it "C").

The "A" graph is set as the processor of an AudioProcessorPlayer, and the AudioProcessorPlayer is set as the audio callback of an AudioDeviceManager.

The "A" graph receives correct values in AudioProcessorGraph::prepareToPlay(double newSampleRate, int estimatedSamplesPerBlock), and everything is fine there. However, when I add "C" to "B" and then add "B" to "A", I get sampleRate=0 and estimatedSamplesPerBlock=0 in the prepareToPlay() called for "C". The call stack looks like this:

WTMMClipLaunch (this is "C" in my description) ::prepareToPlay(double sampleRate, int estimatedSamplesPerBlock) Line 129    C++
juce::AudioProcessorGraph::Node::prepare(double newSampleRate, int newBlockSize, juce::AudioProcessorGraph * graph, juce::AudioProcessor::ProcessingPrecision precision) Line 973    C++
juce::AudioProcessorGraph::buildRenderingSequence() Line 1337    C++
juce::AudioProcessorGraph::handleAsyncUpdate() Line 1374    C++
juce::AsyncUpdater::AsyncUpdaterMessage::messageCallback() Line 34    C++
juce::WindowsMessageHelpers::dispatchMessageFromLParam(__int64 lParam) Line 49    C++

It happens when AudioProcessorGraph::AudioGraphIOProcessor::setParentGraph() calls setPlayConfigDetails() in the way mentioned in my first post (with SR=0 and BS=0). I may be wrong in thinking that call to setPlayConfigDetails() is the relevant one, though.

In addition, I would mention that "C"::prepareToPlay is called prior to "A"::prepareToPlay, and "A"::prepareToPlay receives correct values.