Delay Buffer

Hi,

I’m just getting started with using JUCE to develop plugins. I’ve got the demo working and I’m now trying to add a slider for adjusting the delay time. I’m basically resizing the delay buffer according to the slider value. Is this the correct way to do it? The problem I’m having is that when I quickly move the slider from max to min, it crashes. Any help is appreciated.
Thanks…

Here’s my processing code. It’s pretty much the same as the demo with a few adjustments…

[code]
void Plugin1AudioProcessor::processBlock (AudioSampleBuffer& buffer, MidiBuffer& midiMessages)
{
    const int numSamples = buffer.getNumSamples();
    int dp = 0;

    // Calculating the number of delay samples...
    // sr = sampling rate == 44100, delay = newValue from the setParameter function.
    // Therefore the max delay time is 1 second.
    delaySamples = (int) (sr * delay);

    // Dynamically resizing the delay buffer
    delayBuffer.setSize (2, delaySamples, true, false, true);

    // This is the place where you'd normally do the guts of your plugin's
    // audio processing...
    for (int channel = 0; channel < getNumInputChannels(); ++channel)
    {
        // ------------ applying gain to the signal...
        buffer.applyGain (channel, 0, numSamples, gain);

        // ------------ applying delay to the signal...
        float* channelData = buffer.getSampleData (channel);
        float* delayData = delayBuffer.getSampleData (jmin (channel, delayBuffer.getNumChannels() - 1));
        dp = delayPosition;

        for (int i = 0; i < numSamples; i++)
        {
            const float in = channelData[i];
            channelData[i] += delayData[dp];
            delayData[dp] = (delayData[dp] + in) * 0.5f;    // 0.5 for half the amplitude of the original signal

            // wrap the delay position back to the start of the buffer
            if (++dp >= delayBuffer.getNumSamples())
                dp = 0;
        }
    }

    delayPosition = dp;

    // In case we have more outputs than inputs, we'll clear any output
    // channels that didn't contain input data, (because these aren't
    // guaranteed to be empty - they may contain garbage).
    for (int i = getNumInputChannels(); i < getNumOutputChannels(); ++i)
        buffer.clear (i, 0, numSamples);
}[/code]

I think I might have fixed the initial problem I was having with the delay buffer. I just altered the setSize() call so that it no longer keeps the old contents when the buffer is resized.

It now deletes the old data, which seems to stop it from crashing.
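
Roughly like this - the key change is passing false for keepExistingContent so the old contents get thrown away (the other flags may differ slightly in my actual code):

[code]
// the old call kept the existing contents across the resize:
//     delayBuffer.setSize (2, delaySamples, true, false, true);
// changed to something like this (keepExistingContent = false, clear the new space):
delayBuffer.setSize (2, delaySamples, false, true, false);
[/code]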

I am getting a crackling sound in the audio, though, when the delay time is set to a small value. Any suggestions for “smoothing” this out?

Thanks.

If only I had a dollar for every time someone asked this question…

…I think if Jules would just add a “FractionalDelayLine” class this would settle the matter once and for all.

Fractional Delay Lines

OK, I think I’m starting to understand fractional delay lines. Basically, if my delay sample size is not an integer multiple of the sampling rate, I need to deal with the remainder?

For instance, let’s say my sampling rate is 44100 and my current delay is 20000 samples.

(44100 / 20000) = 2.205

Therefore I need to address this 0.205 part by increasing or decreasing my delay sample size to the nearest integer multiple of 44100?

Or have I got it totally wrong?

Thanks…

You address it by using interpolation.

If you are setting your delay time in samples, you shouldn’t have a problem, as the number of samples appears to be an integer (from your example). If you use real-world times to set your delay, you may end up with a fractional number of delay samples. This is where you will need to interpolate, as TheVinn says.
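
Something like this for the interpolated read (untested, just a sketch - the buffer/position names are placeholders for whatever your own delay code uses):

[code]
// Read from a circular buffer at a fractional delay using linear interpolation.
// circularBuffer, bufferLength, writePos and delayInSamples are placeholder names.
float readFractionalDelay (const float* circularBuffer, int bufferLength,
                           int writePos, float delayInSamples)
{
    // the (possibly fractional) position we want to read from
    float readPos = (float) writePos - delayInSamples;
    while (readPos < 0.0f)
        readPos += (float) bufferLength;

    const int   index0   = (int) readPos;                 // sample just before the read point
    const int   index1   = (index0 + 1) % bufferLength;   // sample just after it
    const float fraction = readPos - (float) index0;      // how far between the two we are

    // blend the two neighbouring samples
    return circularBuffer[index0] + fraction * (circularBuffer[index1] - circularBuffer[index0]);
}
[/code]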

To be honest, I would get your delay buffer working without the interpolation first, as that is an easy step to add afterwards. Just truncate the fractional part by casting it to an int.

In your example you can delay by 20000 samples easily; that’s just 0.45… seconds.
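
i.e. something along the lines of (delaySeconds and sampleRate being whatever your parameter and host give you):

[code]
// truncate the fractional part by casting to int
// e.g. 0.45… s * 44100 Hz ≈ 20000 samples
const int delaySamplesTruncated = (int) (delaySeconds * sampleRate);
[/code]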

Yeah, I don’t think your problem has anything to do with fractional delays. It just looks to me like you’re not thinking about what actually happens when you resize your buffer - it’s a circular buffer, so resizing it will completely screw up the data. The only (easy) way to make an adjustable-time delay is to use a fixed-size circular buffer, and change the offset at which you read from it.
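
Something along these lines, roughly (off the top of my head, untested - the names are just placeholders):

[code]
// Rough sketch: the buffer never changes size, only the distance between the
// write position and the read position does. maxDelaySamples, delaySamples
// and writePos are placeholder names, not anything from the demo.
void processDelay (float* channelData, int numSamples,
                   float* delayData, int maxDelaySamples,
                   int& writePos, int delaySamples)
{
    for (int i = 0; i < numSamples; ++i)
    {
        // read from 'delaySamples' behind the write position, wrapping around
        const int readPos = (writePos - delaySamples + maxDelaySamples) % maxDelaySamples;

        const float in = channelData[i];
        channelData[i] += delayData[readPos];
        delayData[writePos] = (delayData[readPos] + in) * 0.5f;   // same feedback idea as your original code

        if (++writePos >= maxDelaySamples)
            writePos = 0;
    }
}
[/code]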


I was beginning to think it was just an amplitude issue, as I had addressed the fractional part but was still getting clipping or “glitches”. I’ll try your suggestion, Jules.
Thanks.

I tried the delay effect using a circular buffer. So the mod operator takes care of the wrapping and gets rid of the fractional part, right? It would be great if someone could tell me how good the following implementation is. An issue I’m facing is that I’m hearing glitches on just one channel (left or right at a time), even when I’m not dynamically changing the buffer size.

    float dbuf[100000];   // delay buffer (circular)
    int dw = 0;           // write pointer
    int dr = 1;           // read pointer

    // main loop inside the getNextAudioBlock function
    // (ds is the delay length in samples, set elsewhere)
    for (int sample = 0; sample < bufferToFill.numSamples; ++sample)
    {
        dbuf[dw++ % ds] = inBuffer[sample];
        outBuffer[sample] = dry * inBuffer[sample] + wet * dbuf[dr++ % ds];
    }