Delay output is distorting

I’m trying to create a simple delay plugin. It has working feedback, input and output gain, and a dry/wet knob, but for some reason the sound distorts. Even when I turn either gain knob down, the signal becomes quieter but it still has that digital distortion sound.
This is my process block. Thanks in advance!

    for (int channel = 0; channel < totalNumInputChannels; ++channel)
    {
        auto* channelData = buffer.getWritePointer (channel);
        auto* delayData = delayBuffer.getWritePointer (juce::jmin (channel, delayBuffer.getNumChannels() - 1));

        dpr = delayReadPosition;
        dpw = delayWritePosition;

        for (int i = 0; i < numSamples; ++i)
        {
            const float dry = channelData[i];
            const float in = applyGain (dry, inGain);
            float wetSignal = 0.0f;
            float out = 0.0f;

            long rpi = (long) floor (delayData[dpr]); // linear interpolation
            wetSignal = delayData[dpr] - (double) rpi;
            double dryOut = dry * (1.0 - dryWet);
            double wetOut = wetSignal * dryWet;
            out = applyGain ((dryOut + wetOut), outGain);

            delayData[dpw] = in + (wetSignal * feedback);

            if (++dpr >= delayBufferLength)
                dpr = 0;
            if (++dpw >= delayBufferLength)
                dpw = 0;

            channelData[i] = out;
        }
    }

Where do you set the value of numSamples in your for loop? The block length can and will change from callback to callback (the number given in prepareToPlay is only the maximum size). Always use buffer.getNumSamples() when processing like that, to ensure you process the correct number of samples.
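To illustrate the point, here is a minimal sketch in plain C++ rather than the real JUCE API (`Processor`, `prepareToPlay`, and `processBlock` here are hypothetical stand-ins): the loop bound must come from the block actually delivered, not from the maximum announced up front.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for an audio processor: the host announces a
// maximum block size once, then delivers blocks of varying length.
struct Processor
{
    std::size_t maxBlockSize = 0;

    void prepareToPlay (std::size_t maxSize) { maxBlockSize = maxSize; }

    void processBlock (std::vector<float>& block)
    {
        // The host never exceeds the announced maximum...
        assert (block.size() <= maxBlockSize);

        // ...but the loop bound is taken from *this* block
        // (the equivalent of buffer.getNumSamples() in JUCE),
        // never from the maximum cached in prepareToPlay().
        for (std::size_t i = 0; i < block.size(); ++i)
            block[i] *= 0.5f; // trivial gain, just to touch every sample
    }
};
```

Using the cached maximum as the loop bound instead would read and write past the end of any block shorter than the maximum.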

I use buffer.getNumSamples() before the for loop.

If I understood your code correctly, then I believe you should make sure that your delayBuffer has at least as many channels as your buffer, instead of clamping with juce::jmin when you obtain the delayData pointer.

Consider this scenario: buffer is stereo and delayBuffer is mono.

  • On the first pass of your channel loop, delayData points to the first channel of delayBuffer, and you fill it with samples produced while processing the first channel of buffer.
  • On the second pass, for the second channel of buffer, delayData again points to the first (and only) channel of delayBuffer, whose samples were already overwritten on the first pass. The second channel of buffer is therefore processed against delay data produced by the first channel in the current callback, not against its own delayed data from previous callbacks as it should be.
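To make the cross-channel bleed concrete, here is a toy model in plain C++ (no JUCE; `processSample` and the one-sample delay line are simplifications I’m assuming for illustration):

```cpp
#include <cstddef>
#include <vector>

// Toy delay line: read back the stored sample, then overwrite it with
// the current input. Each channel should own one of these; the bug in
// the scenario above is two channels sharing a single line.
float processSample (std::vector<float>& delayLine, std::size_t& pos, float in)
{
    const float delayed = delayLine[pos]; // should be *this* channel's history
    delayLine[pos] = in;                  // store for a later callback
    pos = (pos + 1) % delayLine.size();
    return delayed;
}
```

With a shared line, and the read/write positions reset at the start of each channel (as in the code above), the second channel reads back the sample the first channel wrote a moment earlier in the same callback, instead of its own history.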

This scenario is an instance of the #1 most common programming mistake we see on the forum.