As the title says. Here are the relevant code snippets. This was originally supposed to be an all-pass filter, but I’m using coefficients for a band-pass filter just for the sake of more obvious results. I should also clarify that I’m not getting any errors or messages from VS; it’s just passing audio through unaffected.
setup in PluginProcessor.h:
dsp::ProcessorDuplicator<dsp::IIR::Filter<float>, dsp::IIR::Coefficients<float>> apf;
apf.state->dsp::IIR::Coefficients<float>::makeBandPass(sr, 500); //arbitrary frequency for debug
void RubberAudioProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
sr = sampleRate;
bufferSize = samplesPerBlock;
numChannels = getMainBusNumInputChannels();
spec.sampleRate = sr;
spec.maximumBlockSize = bufferSize;
spec.numChannels = numChannels;
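One thing worth noting: filling in the spec struct has no effect on its own; in JUCE’s dsp module the values only take effect once the spec is passed to each processor’s prepare() (e.g. something like apf.prepare(spec)). A minimal plain-C++ sketch of that idea, using hypothetical stand-in types rather than the real JUCE classes:

```cpp
#include <cassert>

// Hypothetical stand-in for a dsp processor: filling a spec struct does
// nothing by itself; the processor only picks it up via prepare().
struct ProcessSpec
{
    double sampleRate = 0.0;
    int maximumBlockSize = 0;
    int numChannels = 0;
};

class Processor
{
public:
    void prepare(const ProcessSpec& s) { spec = s; prepared = true; }
    bool isPrepared() const { return prepared; }

private:
    ProcessSpec spec;
    bool prepared = false;
};
```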
void RubberAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages)
auto totalNumInputChannels = getTotalNumInputChannels();
auto totalNumOutputChannels = getTotalNumOutputChannels();
// In case we have more outputs than inputs, this code clears any output
// channels that didn't contain input data, (because these aren't
// guaranteed to be empty - they may contain garbage).
for (auto i = totalNumInputChannels; i < totalNumOutputChannels; ++i)
buffer.clear (i, 0, buffer.getNumSamples());
//create AudioBlock for output buffer
//I recognize that this is probably redundant,
//I just wanted to be extra sure that the processed audio
//is making its way to the output.
//as it is now, there is no difference if one or both of these lines is removed
Well, there is a simple reason: after processing you clear your results…
“buffer” contains the samples to process (= storage) AND is the destination to return them after processing. This is often called in-place processing.
By calling buffer.clear () you clear the results from your DSP processing.
What is probably confusing you is that a block is not a kind of buffer… It is a way of referring to samples in a buffer.
In your declaration
you are essentially mixing up the concepts of storing and referencing.
A tip: try to use the class name in the instance name, if possible. I might simply use “block” for the instance here. And on naming conventions: the name “buffer” is not helping you to recognize its function. If you use “src” for input buffers, “dest” for output buffers and “srcDest” or “inOut” for in-place work, you make your code more clear, even to yourself.
(I find this class name a very bad choice and avoid its entire concept from dsp:: wherever I can… and I am not the only one).
And leave out those last two lines.
By using the block you have already processed the buffer…
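To illustrate the in-place idea in plain C++ (no JUCE; Block here is a hypothetical stand-in for an AudioBlock-style non-owning view):

```cpp
#include <cstddef>
#include <vector>

// The "buffer" owns the samples; a "block" is just a non-owning view into it.
struct Block
{
    float* data;
    std::size_t size;
};

// In-place processing: reads from and writes to the same storage.
void applyGain(Block block, float gain)
{
    for (std::size_t i = 0; i < block.size; ++i)
        block.data[i] *= gain;
}

std::vector<float> processInPlace(std::vector<float> buffer)
{
    Block block { buffer.data(), buffer.size() };
    applyGain(block, 0.5f); // result is written straight back into 'buffer'
    // buffer.assign(buffer.size(), 0.0f); // <- a clear() here would wipe the result
    return buffer;
}
```

Because the block only references the buffer’s storage, clearing the buffer afterwards throws away exactly what was just computed.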
PS: the location of the updateAPF() call is not OK, but first things first: the “no sound” issue
Thanks for the response. So removing those last two lines in the processBlock should solve the issue, right? The only reason I added those lines in the first place was to try to solve the same problem, i.e. the audio still passes through unaffected whether those lines are there or not.
I have only commented on the errors contained in these lines. First steps first. If you clear results and there was also an issue with those results, that is another topic.
Sorry, I misunderstood. That is the case, though; removing those lines still leaves me in the same place. Is it perhaps related to what you mentioned about updateAPF() being in a bad location?
As long as you show no code where the coefficients are changed, there is no need to call a filter update while using the filter. You are just (at top speed) recalculating the filter coefficients again and again…
But do you now understand the difference between an AudioBlock and an AudioBuffer?
I think I do understand the difference, yes. An AudioBuffer is exactly that, while an AudioBlock is a reference to the sample/channel information contained in an AudioBuffer, treated as a single object. Is that correct?
I do intend to allow coefficients to be updated in real-time, I’m just trying to get the filter working on its own first. That’s why I mentioned that the 500 argument is an arbitrary frequency for the sake of debugging
OK with the block/buffer, I just wanted to be sure it is clear (because of your naming mix-up).
The filter calculations should not be called in processBlock, real-time requirements or not; you will be recalculating them far too often and wasting CPU time.
Filters should be updated when their parameters (say Freq and Gain) change, triggered either from a host or a GUI, typically in another thread. The processBlock must apply the filter coefficients to the input.
Maybe you can explain your “real-time” requirement more in detail?
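A sketch of that recompute-on-change idea in plain C++, with hypothetical names (not the JUCE classes): the coefficient maths runs once per parameter change, never once per block.

```cpp
// Hypothetical filter wrapper: coefficients are recomputed only when a
// parameter actually changes, never per processed block.
class BandPassState
{
public:
    void setFrequency(double freqHz, double sampleRate)
    {
        if (freqHz == freq)
            return;               // unchanged -> keep current coefficients
        freq = freqHz;
        update(sampleRate);       // recompute once, on the change
    }

    int updateCount() const { return updates; }

private:
    void update(double sampleRate)
    {
        // placeholder for the real coefficient maths
        omega = 2.0 * 3.141592653589793 * freq / sampleRate;
        ++updates;
    }

    double freq = 0.0, omega = 0.0;
    int updates = 0;
};
```

A GUI or host callback would call setFrequency(); the audio thread just applies whatever coefficients are current.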
Ah I see what you mean. So originally that function was this
apf.state->dsp::IIR::Coefficients<float>::makeBandPass(sr, *freq, *q);
with *freq and *q pointing to the relevant raw parameter values.
But it sounds like that would be better to do in sliderValueChanged() (or something similar)? I’m not certain
The freq and q pointers will be changed/updated in the sliderChanged, parameterChanged, etc methods. When changed, you update the filter and the changed/updated coefficients will be used by the filter in the processBlock method.
So you call the updateAPF method only when one of those parameters has changed.
You may need a very short lock when the filter state / coefficients are changed; otherwise process() may run while they are being changed.
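A minimal plain-C++ sketch of that short-lock idea, with a stand-in Coefficients type (not the JUCE classes):

```cpp
#include <mutex>

// Stand-in for a set of biquad coefficients.
struct Coefficients { double b0, b1, b2, a1, a2; };

class FilterState
{
public:
    void setCoefficients(Coefficients newCoeffs)
    {
        std::lock_guard<std::mutex> lock(mutex);
        coeffs = newCoeffs;          // very short critical section
    }

    Coefficients getCoefficients() const
    {
        std::lock_guard<std::mutex> lock(mutex);
        return coeffs;               // audio thread takes a copy
    }

private:
    mutable std::mutex mutex;
    Coefficients coeffs {};
};
```

The critical section only covers the assignment itself, so the audio thread is never blocked for long; lock-free schemes (e.g. atomically swapping a shared pointer) are another common option.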
Thanks, that’s really helpful to know.
That still leaves me in the same place, though, with the filter ostensibly not doing any processing (regardless of whether any parameters are involved or not).
I’m having the same issue. dsp::IIR::Filter isn’t affecting the audio signal.
Every other processor in the same dsp::ProcessorChain is working as intended. This includes dsp::Compressor, dsp::Limiter and multiple instances of dsp::Gain. The parameters update properly and the sound is affected through these.
The dsp::IIR::Filter gets its coefficients set up. I even added a debug print to verify that my coefficients are applied properly:
if (sampleRate > 0)
newCoefficients = juce::dsp::IIR::Coefficients<float>::makePeakFilter(sampleRate, bandSettings.freq, bandSettings.quality, Decibels::decibelsToGain( bandSettings.gain ));
mainChain.get<ChainPositions::EQ>().get<0>().state = *newCoefficients;
auto dgbCoefPost = String(mainChain.get<ChainPositions::EQ>().get<0>().state->getMagnitudeForFrequency(2000, sampleRate));
DBG("Coefficients: " << dgbCoefPre << " State: " << dgbCoefPost);
Debug outputs the proper getMagnitudeForFrequency() before and after I assign the value; it’s just that the actual audio isn’t affected. I do 24 dB boosts at 1000 to 2000 Hz for testing purposes.
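For a sanity check outside JUCE: peaking-EQ coefficients built from the RBJ Audio EQ Cookbook formulas (which, as far as I know, is essentially what makePeakFilter implements) should give a magnitude at the centre frequency equal to the linear gain. A self-contained sketch, with magnitudeAt() playing the role of getMagnitudeForFrequency():

```cpp
#include <cmath>
#include <complex>

struct Biquad { double b0, b1, b2, a1, a2; };

// Peaking-EQ coefficients per the RBJ Audio EQ Cookbook (normalised by a0).
Biquad makePeakBiquad(double sampleRate, double freq, double q, double gainDb)
{
    const double pi    = 3.141592653589793;
    const double A     = std::pow(10.0, gainDb / 40.0);
    const double w0    = 2.0 * pi * freq / sampleRate;
    const double alpha = std::sin(w0) / (2.0 * q);
    const double a0    = 1.0 + alpha / A;
    return { (1.0 + alpha * A)     / a0,
             (-2.0 * std::cos(w0)) / a0,
             (1.0 - alpha * A)     / a0,
             (-2.0 * std::cos(w0)) / a0,
             (1.0 - alpha / A)     / a0 };
}

// Magnitude of H(e^{jw}) at the given frequency.
double magnitudeAt(const Biquad& c, double freq, double sampleRate)
{
    const double pi = 3.141592653589793;
    const std::complex<double> zInv = std::polar(1.0, -2.0 * pi * freq / sampleRate);
    const std::complex<double> num  = c.b0 + zInv * (c.b1 + zInv * c.b2);
    const std::complex<double> den  = 1.0  + zInv * (c.a1 + zInv * c.a2);
    return std::abs(num / den);
}
```

If the computed magnitude matches what the filter state reports but the audio is still untouched, the coefficients themselves are fine and the problem is more likely in how the process context / chain is wired up.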
Where should I take a look to understand what’s wrong?