I’m new to creating audio plugins and I need help setting up a reverb. I looked up the dsp::Reverb class documentation, but it doesn’t include details of how to implement it or what the parameters are. Any help or advice will be greatly appreciated.
Not an expert here, but as far as I can tell, dsp::Reverb is just a wrapper around an older class, Reverb, to make it easy to include inside a dsp ProcessorChain.
Looking at the documentation of the juce Reverb class, you can find a reference to the Reverb::Parameters struct. All the parameters you can set are listed there.
So you create an instance of Reverb::Parameters, set its values, and pass it to your reverb via setParameters().
maybe that helps!
// declare as members:
juce::Reverb theReverb;
juce::Reverb::Parameters theReverbParameters;

// initialise once in your setup
theReverb.setSampleRate (44100.0); // of course use the real sample rate here!
theReverbParameters.dryLevel = 0.2f;
theReverbParameters.wetLevel = 1.0f;
theReverbParameters.roomSize = 1.0f; // 0 = small … 1 = big
theReverbParameters.damping  = 0.1f; // 0 … 1, 1 = fully damped
theReverb.setParameters (theReverbParameters); // don’t forget this, or the values won’t be used!

// now you can process float arrays l & r like this:
theReverb.processStereo (l, r, numSamples);
Hello, a newbie here. I have the same problem. As I understand it, the dsp::Reverb class is a wrapper for the juce::Reverb class, created in order to use juce::Reverb in a ProcessorChain. Should I somehow put a juce::Reverb into dsp::Reverb, or are they already connected?
If you are using JUCE 6, you would have to put juce:: in front of Reverb and Reverb::Parameters.
Just so no “newbie” gets confused: you don’t need to create a Reverb on your own. A dsp::Reverb already contains a Reverb object!
So just create a dsp::Reverb and work with it in your dsp chain (or however you want to use it).
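For example, a minimal sketch of putting it in a ProcessorChain (the member name `chain`, the chain index and the parameter values are just illustrative assumptions):

```cpp
#include <juce_dsp/juce_dsp.h>

// a chain containing just the reverb (you'd normally have more processors)
juce::dsp::ProcessorChain<juce::dsp::Reverb> chain;

void prepare (const juce::dsp::ProcessSpec& spec)
{
    chain.prepare (spec);

    // reach the reverb inside the chain by its index to set its parameters
    auto& reverb = chain.get<0>();
    juce::Reverb::Parameters params;
    params.roomSize = 0.8f;
    params.wetLevel = 0.5f;
    reverb.setParameters (params);
}
```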
If it’s an AudioApp, then you have to go into prepareToPlay, make a juce::dsp::ProcessSpec from the dsp module, fill it with the values from prepareToPlay and from the audio device manager, and then pass that spec into the juce::dsp::Reverb’s prepare method.
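As a sketch of that step inside an AudioAppComponent (the member name `reverb` and the fixed channel count are assumptions):

```cpp
#include <juce_dsp/juce_dsp.h>

// member: juce::dsp::Reverb reverb;

void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
{
    juce::dsp::ProcessSpec spec;
    spec.sampleRate       = sampleRate;
    spec.maximumBlockSize = (juce::uint32) samplesPerBlockExpected;
    spec.numChannels      = 2; // stereo here; query your device setup if unsure

    reverb.prepare (spec);
}
```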
Then in getNextAudioBlock you’ll want to create a juce::dsp::AudioBlock and wrap the buffer from the bufferToFill argument in it.
Then you can implement a process method in your AudioApp’s MainComponent, but be sure to declare it in the header file as well.
When you implement the process method, you’ll want to use juce::dsp::ProcessContextReplacing and pass it the audio block from getNextAudioBlock.
Then in the process method you can set the reverb parameters, but you need to create the parameters object itself first:
juce::Reverb::Parameters params;
params.damping    = dampSlider.getValue();
params.dryLevel   = drySlider.getValue();
params.freezeMode = freezeSlider.getValue();
params.roomSize   = roomSlider.getValue();
params.wetLevel   = wetSlider.getValue();
params.width      = widthSlider.getValue();
like that if you’re using sliders; if you’re doing manual input, just use the numerical values directly.
Then call your reverb’s process method and pass it the ProcessContextReplacing object:
reverbObject.process (context); // 'context' is the ProcessContextReplacing you created in your process override
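Putting those steps together, here’s a hedged sketch of the getNextAudioBlock side (the member names `reverb`, `roomSlider` etc. are assumptions, and only a few parameters are shown):

```cpp
#include <juce_dsp/juce_dsp.h>

void getNextAudioBlock (const juce::AudioSourceChannelInfo& bufferToFill) override
{
    // wrap the buffer in an AudioBlock, restricted to the region we must fill
    juce::dsp::AudioBlock<float> block (*bufferToFill.buffer);
    auto subBlock = block.getSubBlock ((size_t) bufferToFill.startSample,
                                       (size_t) bufferToFill.numSamples);

    process (juce::dsp::ProcessContextReplacing<float> (subBlock));
}

void process (const juce::dsp::ProcessContextReplacing<float>& context)
{
    juce::Reverb::Parameters params;
    params.roomSize = (float) roomSlider.getValue();
    params.wetLevel = (float) wetSlider.getValue();
    params.dryLevel = (float) drySlider.getValue();

    reverb.setParameters (params); // without this the reverb won't pick up the values
    reverb.process (context);
}
```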
Also, if you’re doing this in a plugin, it will be similar, except you’ll pass the buffer from processBlock into the ProcessContextReplacing instead of bufferToFill, and in prepareToPlay you’ll use getTotalNumOutputChannels() when you fill in the ProcessSpec fields, which are sampleRate, maximumBlockSize and numChannels.
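A sketch of the plugin variant, inside an AudioProcessor (again, the member name `reverb` is an assumption):

```cpp
#include <juce_dsp/juce_dsp.h>

// member: juce::dsp::Reverb reverb;

void prepareToPlay (double sampleRate, int samplesPerBlock) override
{
    juce::dsp::ProcessSpec spec;
    spec.sampleRate       = sampleRate;
    spec.maximumBlockSize = (juce::uint32) samplesPerBlock;
    spec.numChannels      = (juce::uint32) getTotalNumOutputChannels();

    reverb.prepare (spec);
}

void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&) override
{
    juce::dsp::AudioBlock<float> block (buffer);
    reverb.process (juce::dsp::ProcessContextReplacing<float> (block));
}
```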
I forgot to say: you also need to hand the parameters to the reverb object itself using the reverb’s setParameters method, otherwise it won’t know to pick up the new values.
And you can call the reverb’s process directly in getNextAudioBlock or processBlock without creating a separate process method; the steps are essentially the same either way.