Problem with finding magnitude of samples

Hello everyone. First post! I’m trying to make a simple plugin that creates a scrolling waveform, or histogram. I do this by getting the magnitude of the buffer in my processBlock and using it to determine the heights of an array of rectangles (each 1px wide) in the PluginEditor. It works smoothly in JUCE’s AudioPluginHost, but problems emerge when I open it in my DAW (FL Studio):

The histogram looks really blocky, and the amplitude values I’m getting in my processBlock are interrupted by (seemingly) random values that do not reflect the sound that’s actually passing through.

Here’s the original code I wrote in my PluginProcessor:

void JUCE008_PathAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages)
{
    juce::ScopedNoDenormals noDenormals;
    auto totalNumInputChannels  = getTotalNumInputChannels();
    auto totalNumOutputChannels = getTotalNumOutputChannels();

    for (auto i = totalNumInputChannels; i < totalNumOutputChannels; ++i)
        buffer.clear (i, 0, buffer.getNumSamples());

    for (int channel = 0; channel < totalNumInputChannels; ++channel)
    {
        auto* channelData = buffer.getWritePointer (channel);

        for (int sample = 0; sample < buffer.getNumSamples(); ++sample)
        {

            mAmplitude = buffer.getMagnitude(0, buffer.getNumSamples());//get the amplitude for use by the visualizer

        }
    }
}

I suspect that the problem is due to the fact that the DAW sometimes passes very few values to my buffer. I also suspect that the way to fix this is to create a circular buffer—but this is where my lack of experience gets the best of me.

Here is my unsuccessful attempt at fixing the problem with a circular buffer:

void JUCE008_PathAudioProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    const int numInputChannels = getTotalNumInputChannels();
    const int delayBufferSize = 2 * (sampleRate + samplesPerBlock);//= 2 seconds of playback history

    mDelayBuffer.setSize(numInputChannels, delayBufferSize);
}

void JUCE008_PathAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages)
{
    juce::ScopedNoDenormals noDenormals;
    auto totalNumInputChannels  = getTotalNumInputChannels();
    auto totalNumOutputChannels = getTotalNumOutputChannels();

    for (auto i = totalNumInputChannels; i < totalNumOutputChannels; ++i)
        buffer.clear (i, 0, buffer.getNumSamples());

    const int bufferLength = buffer.getNumSamples();
    const int delayBufferLength = mDelayBuffer.getNumSamples();

    for (int channel = 0; channel < totalNumInputChannels; ++channel)
    {
        const float* bufferData = buffer.getReadPointer(channel);
        const float* delayBufferData = mDelayBuffer.getReadPointer(channel);

        fillDelayBuffer(channel, bufferLength, delayBufferLength, bufferData, delayBufferData);
    }

    mWritePosition += bufferLength;
    mWritePosition %= delayBufferLength;

    mAmplitude = mDelayBuffer.getMagnitude(0, mDelayBuffer.getNumSamples());//this results in catastrophic failure!
}

void JUCE008_PathAudioProcessor::fillDelayBuffer(int channel, const int bufferLength, const int delayBufferLength, const float* bufferData, const float* delayBufferData)
{
    //copy data from main buffer to delayBuffer
    if (delayBufferLength > bufferLength + mWritePosition)
    {
        mDelayBuffer.copyFromWithRamp(channel, mWritePosition, bufferData, bufferLength, /*startGain!*/1.0, 1.0);
    }
    else
    {
        const int bufferRemaining = delayBufferLength - mWritePosition;

        mDelayBuffer.copyFromWithRamp(channel, mWritePosition, bufferData, bufferRemaining, 1.0, 1.0);

        mDelayBuffer.copyFromWithRamp(channel, 0, bufferData, bufferLength - bufferRemaining, 1.0, 1.0);
    }
}

Sorry for the really lengthy post—it just goes to show how dumbfounded I am by this problem. I’m still very much a novice, especially to the idea of buffers—so any help would be truly appreciated.

In a word, no. The DAW will pass exactly as many samples in each buffer as the audio buffer size in the DAW is set for (512, 256, 128, etc). Now it could be that on your system, AudioPluginHost and FL Studio are set for different buffer sizes – there’s no reason they would necessarily be the same – and that difference could account for the different graph outputs you are seeing.
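If you want to see what each host is actually handing you, a quick debug-only check is to print the block size from processBlock, something like the sketch below (DBG compiles away in release builds, and logging from the audio thread is only acceptable for a quick test like this):

void JUCE008_PathAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    // Temporary diagnostic: how many samples is this host delivering per block?
    DBG ("block size: " << buffer.getNumSamples());

    // ... rest of processBlock ...
}

Run that in both AudioPluginHost and FL Studio and compare the numbers.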

This bit of code is a red flag:

        for (int sample = 0; sample < buffer.getNumSamples(); ++sample)
        {
            mAmplitude = buffer.getMagnitude(0, buffer.getNumSamples());//get the amplitude for use by the visualizer
        }

The AudioBuffer method getMagnitude is meant to be called once on a buffer, but here you are looping over the buffer sample-by-sample, and calling getMagnitude each time!
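In case a concrete example helps, here is a minimal sketch of the once-per-block version. Turning mAmplitude into a std::atomic&lt;float&gt; is my assumption, so the GUI thread can read it without a data race:

// In PluginProcessor.h (assumption):
// std::atomic<float> mAmplitude { 0.0f };

void JUCE008_PathAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    juce::ScopedNoDenormals noDenormals;

    for (auto i = getTotalNumInputChannels(); i < getTotalNumOutputChannels(); ++i)
        buffer.clear (i, 0, buffer.getNumSamples());

    // One call per block: the peak absolute sample value across all channels.
    mAmplitude.store (buffer.getMagnitude (0, buffer.getNumSamples()));
}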

Also I would guess that mAmplitude is a member of your Processor class, but I don’t see where it’s getting reset to zero between buffers. And without seeing what’s happening in the Editor that’s drawing the graph, it’s unclear how that amplitude data is being passed to the graphing code.
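For what it’s worth, the usual pattern is for the Editor to poll that value from a juce::Timer and repaint. Here is a rough sketch, with the editor class name guessed from your processor’s name and mAmplitude assumed to be a public std::atomic&lt;float&gt; on the processor:

#include <JuceHeader.h>
#include "PluginProcessor.h"   // assumed to declare JUCE008_PathAudioProcessor

class JUCE008_PathAudioProcessorEditor : public juce::AudioProcessorEditor,
                                         private juce::Timer
{
public:
    explicit JUCE008_PathAudioProcessorEditor (JUCE008_PathAudioProcessor& p)
        : juce::AudioProcessorEditor (p), processor (p)
    {
        setSize (400, 200);
        startTimerHz (30);   // ~30 repaints per second is plenty for a meter
    }

    void paint (juce::Graphics& g) override
    {
        g.fillAll (juce::Colours::black);

        const float amp = processor.mAmplitude.load();   // latest block magnitude
        const float barHeight = amp * (float) getHeight();

        g.setColour (juce::Colours::lime);
        g.fillRect (0.0f, (float) getHeight() - barHeight, 2.0f, barHeight);
    }

private:
    void timerCallback() override { repaint(); }

    JUCE008_PathAudioProcessor& processor;
};

The Timer keeps the GUI thread in charge of repainting, so the audio thread never touches any Component.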

prepareToPlay gives you the maximum block size as its second argument; every audio block sent to processBlock will contain that many samples or fewer. If you want to capture information over a specific, constant time interval, you have to roll your own indices or ring buffers, etc., like you already assumed.
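To make that concrete, here is one possible sketch (not the only way) of a fixed-length history buffer: processBlock copies each block into a circular buffer and then measures the magnitude over a constant window, so the result no longer depends on how many samples the host sends per block. The member names (mHistory, mWritePos, mAmplitude) and the window length are my own choices:

// In PluginProcessor.h (assumption):
// juce::AudioBuffer<float> mHistory;
// int mWritePos = 0;
// std::atomic<float> mAmplitude { 0.0f };

void JUCE008_PathAudioProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    juce::ignoreUnused (samplesPerBlock);

    mHistory.setSize (1, (int) sampleRate);   // 1 second of mono history
    mHistory.clear();
    mWritePos = 0;
}

void JUCE008_PathAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    if (buffer.getNumChannels() == 0)
        return;

    const int numSamples = buffer.getNumSamples();
    const int historyLen = mHistory.getNumSamples();
    const float* in      = buffer.getReadPointer (0);   // channel 0 only, for simplicity

    // Copy this block into the circular history buffer, wrapping at the end.
    for (int i = 0; i < numSamples; ++i)
    {
        mHistory.setSample (0, mWritePos, in[i]);
        mWritePos = (mWritePos + 1) % historyLen;
    }

    // Measure the peak over a constant window (here the most recent 1024 samples),
    // which may span the wrap point of the circular buffer.
    const int window = juce::jmin (1024, historyLen);
    int start = mWritePos - window;
    if (start < 0)
        start += historyLen;

    float peak = 0.0f;
    if (start + window <= historyLen)
        peak = mHistory.getMagnitude (0, start, window);
    else
        peak = juce::jmax (mHistory.getMagnitude (0, start, historyLen - start),
                           mHistory.getMagnitude (0, 0, window - (historyLen - start)));

    mAmplitude.store (peak);
}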