Delay issue with certain millisecond values

Hey there, I’m quite new to the realm of audio development and still trying to learn the basics. I made this simple delay plugin in JUCE; here is the core processBlock code:

for (int channel = 0; channel < totalNumInputChannels; ++channel)
{
    float* channelData = buffer.getWritePointer(channel);

    const int bufferLength = buffer.getNumSamples();
    const int circBuffLength = circBuff.getNumSamples();

    const float* mainReadPointer = buffer.getReadPointer(channel);
    const float* circBufferReadPointer = circBuff.getReadPointer(channel);

    float* circBuffWrite = circBuff.getWritePointer(channel);
    const float delayInSamples = delayTime * getSampleRate() / 1000;

    for (int i = 0; i < bufferLength; ++i)
    {
        // two read positions straddling the fractional delay
        float fIndx = writePosish - delayInSamples;
        if (fIndx < 0)
            fIndx += circBuffLength;

        float sIndx = writePosish - (delayInSamples + 1);
        if (sIndx < 0)
            sIndx += circBuffLength;

        const float frac = delayInSamples - int(delayInSamples);

        // feed the linearly interpolated delayed signal (at 50% gain) plus the dry input back into the circular buffer
        circBuffWrite[writePosish] = (float(50.0 / 100.0) * ((frac * circBufferReadPointer[int(sIndx)]) + (float(1.0 - frac) * circBufferReadPointer[int(fIndx)]))) + mainReadPointer[i];

        ++writePosish; // advance the write head (assumed; without this increment the delay would never move)
        if (writePosish == delayBuffLength)
            writePosish = 0;

        channelData[i] = (float(50.0 / 100.0) * ((frac * circBufferReadPointer[int(sIndx)]) + (float(1.0 - frac) * circBufferReadPointer[int(fIndx)]))) + mainReadPointer[i];
    }
}

and prepareToPlay:

void PluginPrototypeAudioProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    // Use this method as the place to do any pre-playback
    // initialisation that you need..
    const int numInputChannels = getTotalNumInputChannels();

    writePosish = 0;
    delayBuffLength = (int) (2 * sampleRate); // two seconds of buffer
    circBuff.setSize(numInputChannels, delayBuffLength);
}


The delay seems to work fine at some millisecond values but not at others. For instance, 720 works fine, but 714 gives me bit-reduced, filthy-noise-like delay tails. Also, every value seems to produce half the delay it shows; 200 milliseconds, for example, produces a 100 millisecond delay. I was wondering if anyone would be kind enough to point out the errors and bugs in my poorly written beginner code. Thanks in advance!

When using float constants, always finish with an f suffix to tell the compiler the number should be a float and not a double or an int. Relying on a cast later on can cause issues, because the compiler may have already done the arithmetic on ints or doubles by then.
Write 50.0f / 100.0f instead of float(50.0 / 100.0).
When I made a delay effect I used the JUCE DelayLine class. The stuff you do with the circBuff object seems unnecessarily complicated, imo. Maybe switching to that already solves your issues.

My personal guess is that it’s just because you use linear interpolation. There are a lot of different interpolation methods that try to make those fractional buffer accesses round rather than straight, resulting in way less edgy transitions and therefore a smoother sound. Google cubic Hermite spline for a really easy interpolation method, and check out Lanczos sinc for a really smooth one. You can also find interpolation helpers in JUCE’s dsp classes.


You can find out whether it’s really your interpolation method by adding the line
DBG(frac); right after the line where you calculate frac. If those artefacts become worse as the value approaches 0.5, it really is the interpolation, because the further the value is from 0 or 1, the more the interpolation has to guess.

Of course I could always use the JUCE dsp classes for much better performance, but I’m trying to learn the very basics for now, so I wanted to get the hang of how a circular buffer works. Thanks for the reply!

Thank you! I changed the interpolation to a cubic Hermite spline and still had the same issue. After much trial and error, it turned out the outer for loop (the one iterating over the left and right channels) was the problem! I ended up making write and read pointers for both channels, used them inside the sample loop, and got rid of the outer loop, and the problem was solved. I can’t quite wrap my head around why the outer loop would cause such a problem, but I guess something weird was going on with how the write index and the read index got calculated inside it. Thanks for mentioning the interpolation methods, though; that was informative and helpful.

Oops, I should have seen that. Yeah, of course: your writePosish was shared between channels but got advanced inside each channel’s loop, so it moved twice per sample, which is also why every delay time came out at half its value. And you can improve it further by having only one writeHead but still individual readHeads for each channel. If you think about it, the write head should be the same for both channels’ buffers anyway; only the read head defines the delay’s length. You could fill a vector with writeHead positions before going into the channel/sample loops; inside those loops you’d just access the already-baked writeHead data to calculate the readHead from each channel’s delay time and then apply the delay with it.
