Making a cross delay

I’m trying to make a cross delay effect (the left channel gets delayed into the right channel and vice versa) using juce::dsp::DelayLine.

Here’s my inner audio processing loop:

mixer.pushDrySamples (input);

for (int channel = 0; channel < numChannels; ++channel)
{
    auto* samplesIn  = input.getChannelPointer (channel);
    auto* samplesOut = output.getChannelPointer (channel);

    for (size_t sample = 0; sample < input.getNumSamples(); ++sample)
    {
        auto inputSample = samplesIn[sample];
        auto delayAmount = delayValue[channel];

        // swap l/r: push each channel into the other channel's delay line
        if (channel == 0)
            delayLine.pushSample (1, inputSample);
        else
            delayLine.pushSample (0, inputSample);

        delayLine.setDelay ((float) delayAmount);

        samplesOut[sample] = delayLine.popSample (channel);
    }
}

mixer.mixWetSamples (output);

It seems to work in general (I think), but is it the correct way to do it?
Currently it sounds kinda uh…strange/glitchy for low delay values (say ~30 ms).

edit: Changed the code a bit and did some more tests.
I think it works correctly, but only for delay times above ~15 ms; below that I get distorted sound.
Why is that?

that’s probably because around 20 ms is the threshold below which a delayed signal can appear not to have been delayed at all. the haas effect also uses delays of up to 20 ms for that reason. so maybe more of a perception thing than a bug

Thanks, but I don’t think that’s it.
If anyone wants to check:
I recorded a test signal with my plugin; listen to how the left channel sounds distorted. The distortion starts when I set delay times below ~15 ms.

(as zip in case the player doesn’t work:

Current source is here btw:

true. it is indeed weird that it only happens on one channel

Thanks for confirming.
Could well be that I’m doing something stupid in the code though. :smiley:

code looks reasonable to me, even though i don’t use delay line myself. you could set the delay time once before the sample loop though, since you are not smoothing that parameter anyway. and maybe it’s also a bit confusing that there is an outer channel loop when you deal with both channels inside the sample loop anyway


If anyone comes across this in the future, it’s solved.

@danielrudrich PM’ed me (thanks!) and told me I had to swap the inner and outer loops, so that the two delays (left and right channel) are ‘time synced’ at the sample level, which makes sense now that I think of it.
It now handles delay times >= 1 ms glitch-free.
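For anyone who wants to see the loop order spelled out: here’s a minimal self-contained sketch of the swapped structure, with the sample loop outside and both channels handled inside, so the two delay lines advance in lockstep. It uses plain circular buffers instead of juce::dsp::DelayLine (the `CrossDelay` type and all names in it are illustrative, not JUCE API, and the delay time is fixed per block rather than set per sample):

```cpp
#include <cstddef>
#include <vector>

// Minimal cross delay: the sample loop is OUTER, the channel loop INNER,
// so both channels' delay lines stay time-synced at the sample level.
struct CrossDelay
{
    std::vector<std::vector<float>> buffer; // one circular buffer per channel
    size_t writePos = 0;
    size_t delaySamples;

    CrossDelay (size_t numChannels, size_t delayInSamples, size_t maxDelay)
        : buffer (numChannels, std::vector<float> (maxDelay, 0.0f)),
          delaySamples (delayInSamples) {}

    void process (float* const* channels, size_t numChannels, size_t numSamples)
    {
        const size_t size = buffer[0].size();

        for (size_t sample = 0; sample < numSamples; ++sample) // outer: samples
        {
            // push each channel into the OTHER channel's buffer (the cross)
            for (size_t ch = 0; ch < numChannels; ++ch)
                buffer[(ch + 1) % numChannels][writePos] = channels[ch][sample];

            const size_t readPos = (writePos + size - delaySamples) % size;

            // then pop the delayed samples for every channel    // inner: channels
            for (size_t ch = 0; ch < numChannels; ++ch)
                channels[ch][sample] = buffer[ch][readPos];

            writePos = (writePos + 1) % size;
        }
    }
};
```

With the channel loop on the inside, both delay lines are read and written once per sample, which is what keeps them in sync.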

yeah. that’s what i meant too. most of the time you either use the channel loop or explicit channels in the sample loop. another example is mid/side conversion. stuff that only makes sense in stereo, basically. make sure to check numChannels before doing this, or it will crash on mono tracks
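The mid/side conversion mentioned above follows the same pattern: both channels are needed at each sample, so there is no per-channel outer loop. A minimal sketch (function names are illustrative, and this uses the common 0.5 scaling convention):

```cpp
#include <cstddef>

// Encode L/R to mid/side in a single sample loop; every sample needs
// both channels at once, so the channels can't be processed separately.
void encodeMidSide (float* left, float* right, size_t numSamples)
{
    for (size_t i = 0; i < numSamples; ++i)
    {
        const float mid  = 0.5f * (left[i] + right[i]);
        const float side = 0.5f * (left[i] - right[i]);
        left[i]  = mid;   // left buffer now carries the mid signal
        right[i] = side;  // right buffer now carries the side signal
    }
}

// Decode mid/side back to L/R (exact inverse of the encoder above).
void decodeMidSide (float* mid, float* side, size_t numSamples)
{
    for (size_t i = 0; i < numSamples; ++i)
    {
        const float l = mid[i] + side[i];
        const float r = mid[i] - side[i];
        mid[i]  = l;
        side[i] = r;
    }
}
```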