Delay Line artifacts

Hello, I have been trying to make a variable delay line in JUCE, but for some reason I get artifacts when I change the delay time. I added an interpolation method and smoothed the incoming values, but that didn’t seem to work. Please help!

//Init before processing starts
void JaneDelay::init(float sampleRate)
{
    //Store the sample rate
    SampleRate = sampleRate;
    
    //Maximum delay length in samples
    SIZE = MAXDELAY * sampleRate;
    
    //Allocate the delay buffer
    delayBuffer = new float[SIZE];
    
    setDelay(0.25f, 0.0f, 0.0f);
}

void JaneDelay::setDelay(float time, float width, float mod)
{
    //Convert time from seconds to samples
    time *= SampleRate;
    
    //Make sure time isn't greater than the max size
    Time = time < SIZE ? time : SIZE - 1;

    //Convert width from seconds to samples
    width *= SampleRate;

    //Fractional part of the delay, used for interpolation
    frac = time - (long)time;
}

void JaneDelay::process(float *inbuffer, int numSamples)
{
    
    for (int i = 0; i < numSamples; i++)
    {
        delayBuffer[writePointer++] = inbuffer[i] + output * feedBack;
        
        //Set Read Pointer
        readPointer = writePointer - Time;
        
        if (readPointer < 0)
            readPointer += SIZE;
        
        
        float a = delayBuffer[readPointer];
        float b;
        if (readPointer + 1 >= SIZE)   //>= so we never read past the end
            b = delayBuffer[0];
        
        else
            b = delayBuffer[readPointer + 1];
        
        output =  a + (b - a) * frac;
        
        inbuffer[i] = output;
        
        if (writePointer >= SIZE)
            writePointer -= SIZE;
        
    }
}

Sorry for the block of text; I just can’t figure out why this isn’t working. If you want me to post any more of the code, please let me know!

You get artefacts because you introduce discontinuities into the signal whenever you change the delay time. I’m not sure what the best approach here is, but I guess smoothing the changes to the delay time might help, although I fear it might only make things a little more bearable rather than offer a ‘fix’.

You could also quickly fade the output of the delay to zero just before you change the delay time. tbh, I’m not sure how commercial plugins do this.

IMHO there are two strategies:

  • jumping: if the time changed, first pull with the old time and copy it with a fade-out, then fetch again with the new time, adding it to the signal with a fade-in (sketched below)

  • resampling: you keep a smoothed value of the actual delay. When the value differs from the actual delay, you resample the samples so they fit the requested number of samples. That way the audio is squeezed (pitched), like a tape delay when the read head is moving

Which one to choose depends on the use case, or you can let the user decide.
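
To make the jumping idea concrete, here is a rough sketch. Everything in it is illustrative: pendingTime and currentTime are placeholder names, read(delayInSamples) is assumed to be a helper like the one implied by the original post, and writing the input into the delay buffer is omitted for brevity.

//Crossfade sketch for the 'jumping' strategy: on a delay-time change,
//fade out the tap at the old time while fading in the tap at the new time.
if (pendingTime != currentTime)
{
    for (int i = 0; i < numSamples; ++i)
    {
        float fade = (float) i / (float) numSamples;  //0 -> 1 over the block
        float oldTap = read (currentTime);            //tap at the old delay
        float newTap = read (pendingTime);            //tap at the new delay
        inbuffer[i] = oldTap * (1.0f - fade) + newTap * fade;
    }
    currentTime = pendingTime;
}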

Ok, I’ll look into both of these ideas. Thanks for all the support from both of you!

Actually it’s quite simple. You have a line saying:

readPointer = writePointer - Time;

You just have to apply a really strong lowpass filter to Time before it goes into that line and you’re done. Make sure Time and readPointer are float values so that you can use interpolation instead of reading directly from delayBuffer with it.
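
For illustration, a one-pole lowpass on the delay time might look something like this (smoothedTime and coeff are made-up names, and the coefficient value is just a placeholder to tune by ear):

//One-pole lowpass ("leaky integrator") on the target delay time
float smoothedTime = 0.0f;     //member variable, delay time in samples
const float coeff = 0.0005f;   //smaller = stronger smoothing

//per sample, before computing the read position:
smoothedTime += coeff * ((float) Time - smoothedTime);
float readPos = (float) writePointer - smoothedTime;
if (readPos < 0.0f)
    readPos += (float) SIZE;   //wrap, then interpolate around readPos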

Also be careful with the line in your init function where you call new float[SIZE] for the delayBuffer. prepareToPlay can be called a lot of times, so it’s safer to just use a vector.

Or rather a juce::AudioBuffer<float>, which is designed for exactly that purpose and has all the JUCE-specific interfaces, which is not the case for the general-purpose vector.
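
A sketch of both suggestions, assuming init can be called repeatedly (as prepareToPlay is):

//std::vector frees any previous allocation automatically on reassignment
std::vector<float> delayBuffer;               //member instead of float*
delayBuffer.assign ((size_t) SIZE, 0.0f);     //allocate and zero in one step

//or with JUCE's own container:
juce::AudioBuffer<float> delayBuffer2;
delayBuffer2.setSize (1, SIZE);               //one channel, SIZE samples
delayBuffer2.clear();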

I just tested this out of pure curiosity, and it works very well. There is a slight detuning of the signal, but it’s a far sight better than the glitchy zipper noise you get without it :wink:

Thanks for the suggestion. I tried it and it worked really well; it also gave me an idea for how I could improve the code. I realized that since the low-pass filter is just smoothing the value like a capacitor, I could implement a short function to smooth out the values myself.

void JaneDelay::process(float *inbuffer, int numSamples)
{
    for (int i = 0; i < numSamples; i++)
    {
        //(Most of the function hasn't changed, I just added this and made reading and writing to the delay its own function)

        //Create a local variable for the target delay time
        float localTargetTime = Time;
        
        //Slew the current time toward the target to smooth delay changes
        if (localTargetTime != currentTime)
        {
            float timeInc = (localTargetTime - currentTime) / SampleRate;
            currentTime += timeInc;
        }
        
        //Read from the delay line
        output = read(currentTime + lfo + 1.0f);
    }
}

This approach has also worked really well, and you don’t need to fiddle with filter parameters to get it working.

The final thing I observed is that when I changed the interpolation method to use the current sample and the previous sample, it worked a lot better. I don’t really know why; I don’t think it should have made a difference in theory. Maybe there was something else I fixed along the way, but the code works now, so we take those lol.
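
For what it’s worth, there may be a reason it helped: the fractional part of the delay represents extra delay, so the interpolation should lean toward the older neighbouring sample, not the newer one. A minimal sketch using the variable names from the first post:

//A delay of (Time + frac) samples means reading frac samples *behind*
//readPointer, so interpolate toward the older sample.
int older = readPointer - 1;
if (older < 0)
    older += SIZE;                      //wrap at the start of the buffer

float a = delayBuffer[readPointer];     //delay of Time samples
float b = delayBuffer[older];           //delay of Time + 1 samples
output = a + (b - a) * frac;            //total delay of Time + frac samples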

As written in the documentation for juce::DelayLine:

Note: If you intend to change the delay in real time, you may want to smooth changes to the delay systematically using either a ramp or a low-pass filter.

But also the ‘jumping’ approach suggested by @daniel should work! :slight_smile:

For @daniel: might I ask what you mean by this technique?

Is this just ramp/smoothing + lowpass on the time/samples delay variable? Or am I not understanding something?

Basically yes. Instead of a low-pass I use a SmoothedValue, but effectively that’s the same.

The above algorithm only works if you adapt the time each sample and do the interpolation by hand.

What I do instead is adapt per block and use the juce::LagrangeInterpolator. But working in smaller chunks probably makes sense.
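
A rough sketch of the per-block variant. Everything here is illustrative: the circular-buffer unwrapping is left out, and tapInput is assumed to be a contiguous copy of the relevant span of the delay buffer.

//Per-block resampling with juce::LagrangeInterpolator: when the delay
//shrinks, the read head covers more input than output samples (pitch up),
//and vice versa.
juce::LagrangeInterpolator interpolator;   //member, call reset() in prepare

float delta = oldDelaySamples - newDelaySamples;            //head movement
double speedRatio = (numSamples + delta) / (double) numSamples;

interpolator.process (speedRatio, tapInput, tapOutput, numSamples);
oldDelaySamples = newDelaySamples;                          //for the next block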

If I’m not mistaken, using DelayLine::pushSample and DelayLine::popSample should already work fine.

Ok, interesting. So whenever the delay time changes, you ‘resize’ the delay buffer with the Lagrange Interpolator and then go on with the processing?

Yes, I think that is the better way to go. It wasn’t available when I wrote my CrazyDelay 4 years ago.

I mean, the pitch drifts are part of the charm of a good delay, aren’t they? :slight_smile:

Interesting, I always used some sort of smoothing for the read index and Lagrange interpolation to read from the buffer without resizing. I guess which one performs better, resizing or resampling on the fly, depends on how often your delay is modulated.

The choice of smoothing has a big impact on how the modulated delay will sound; e.g. I got nice analog-style results using a physics model with velocity and acceleration. If you imagine a tape delay, the choice of smoothing affects how the tape’s read head speeds up / slows down when changing the delay time :nerd_face:
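
Something in the spirit of that physics model might look like this (the constants and names are made up, not the poster’s actual code):

//Damped-spring smoother for the delay time: the "read head" accelerates
//toward the target and friction bleeds off its velocity.
struct InertialSmoother
{
    float position = 0.0f;   //current smoothed delay time in samples
    float velocity = 0.0f;   //samples per sample

    float next (float target)
    {
        const float stiffness = 1.0e-5f;  //pull toward the target
        const float damping   = 0.995f;   //friction on the velocity

        velocity += stiffness * (target - position);
        velocity *= damping;
        position += velocity;
        return position;
    }
};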

Resampling the whole buffer is kind of impossible once you start getting into per-sample delay-length modulation and any decent-sized delay line. Source: my first crappy attempt to make an LFO-modulated delay. :laughing:

For the OP’s original question: don’t roll your own, just use dsp::DelayLine with the pushSample and popSample methods and smooth the delay time parameter. It does exactly what you want.
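
A minimal sketch of that setup. The class layout, the 2-second maximum and the 50 ms ramp are my own choices; DelayLine, SmoothedValue and ProcessSpec are the actual JUCE classes.

//Per-sample use of juce::dsp::DelayLine with a smoothed delay time.
juce::dsp::DelayLine<float, juce::dsp::DelayLineInterpolationTypes::Lagrange3rd> delayLine;
juce::SmoothedValue<float> delaySamples;   //members

void prepare (double sampleRate, int maxBlockSize)
{
    juce::dsp::ProcessSpec spec { sampleRate, (juce::uint32) maxBlockSize, 1 };
    delayLine.setMaximumDelayInSamples ((int) (2.0 * sampleRate));  //2 s max
    delayLine.prepare (spec);
    delaySamples.reset (sampleRate, 0.05);   //50 ms ramp on time changes
}

//call delaySamples.setTargetValue (newDelayInSamples) when the parameter changes
void process (float* buffer, int numSamples, float feedback)
{
    for (int i = 0; i < numSamples; ++i)
    {
        delayLine.setDelay (delaySamples.getNextValue());  //smoothed per sample
        float out = delayLine.popSample (0);
        delayLine.pushSample (0, buffer[i] + out * feedback);
        buffer[i] = out;
    }
}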

Yes, for the same reason I do resampling on the fly. Resizing actually never came to my mind, I just read it here :wink:

I know that there are way easier ways of doing it, but I’m really interested in not only learning DSP but also creating devices using embedded systems. I’m sorta just using JUCE as a way to test these algorithms before putting them onto MCUs.
