Smooth Attack/Release Gain Changes for Compression

Hi everyone, I am working on a compressor using the juce::dsp::BallisticsFilter<float> for level detection, and the juce::SmoothedValue<float, juce::ValueSmoothingTypes::Linear> for implementation of gain changes.

It seems, however, that regardless of what I do to change the attack and release times in the BallisticsFilter, the changing of the attack and release times do not affect the timing of the onset and offset of gain reduction, as confirmed by DDMF’s PluginDoctor.

The only thing that seems to change the timing of the onset and offset of gain reduction is the time constant in this line of code:

gain.reset(lastSampleRate, 0.400); //The 400 ms, specifically

When I switch the gain reduction method to sample-by-sample processing, then I get a zippering distortion, which is unpleasant.

Am I implementing the ballistics filtering incorrectly, or am I using the smooth gain adjuster incorrectly for my intended purpose?

Here is an example of the code I am using:

//**Prepare To Play**
juce::dsp::ProcessSpec spec;
spec.sampleRate = sampleRate;
spec.maximumBlockSize = samplesPerBlock;
spec.numChannels = getTotalNumInputChannels();

gain.reset(lastSampleRate, 0.400);

//**Process Block**

float inputPeakLevel = 0.0f;
int numSamples = buffer.getNumSamples();
for (int channel = 0; channel < totalNumInputChannels; ++channel) {
    auto* channelData = buffer.getWritePointer(channel);
    for (int sample = 0; sample < numSamples; ++sample) {
        inputPeakLevel += ballisticsFilt.processSample(channel, channelData[sample]);

float envelopeDB = 20.0f * std::log10f(inputPeakLevel / numSamples);
float makeup = 1.0f;

//Gain Reduction Calculator... which adjusts makeup based on envelopeDB

//Final Gain Change
gain.applyGain(buffer, numSamples);

Thank you for your help!