I have a delay module, but modulating the delay time upwards sounds kind of harsh.
Modulating it downwards gives the characteristic increasing-pitch kind of sound.
The delay is a fractional delay line, which I read from [delay_time_in_samples] samples behind the write position.
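For context, here is a minimal sketch of what I mean by a fractional delay line: a circular buffer where the read position sits `delay_samples` behind the write position, with linear interpolation between the two neighbouring samples (class and method names are just for illustration):

```python
import math

class FractionalDelay:
    """Circular-buffer delay line with a linearly interpolated fractional read."""
    def __init__(self, max_samples):
        self.buf = [0.0] * max_samples
        self.write_pos = 0

    def process(self, x, delay_samples):
        self.buf[self.write_pos] = x
        # read position sits delay_samples behind the write position
        read_pos = self.write_pos - delay_samples
        i = math.floor(read_pos) % len(self.buf)
        j = (i + 1) % len(self.buf)
        frac = read_pos - math.floor(read_pos)
        out = (1.0 - frac) * self.buf[i] + frac * self.buf[j]
        self.write_pos = (self.write_pos + 1) % len(self.buf)
        return out
```

With a fixed delay of 2.0 samples, an impulse comes out exactly 2 samples later; the trouble starts when `delay_samples` itself changes from call to call.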
Ok, this isn't very surprising to me: when modulating up quickly, the read position moves backwards through the buffer faster than the write position advances, thereby "travelling back in time through the sound", which results in absolute garbage.
Now I have had a few ideas, none satisfying:
- Smooth the delay time
- Limit the increase of the delay time to 1 sample per sample (this shifts the sound by exactly an octave, which feels super weird)
- Count the samples the read head jumps over and average them (I haven't implemented this yet).
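The first two ideas could be sketched roughly like this; the one-pole coefficient formula and the 10 ms smoothing time are just hypothetical tuning choices, not anything I've settled on:

```python
import math

def slew_limit(prev, target):
    """Idea 2: clamp the delay-time increase to at most 1 sample per sample."""
    return min(target, prev + 1.0)

class OnePoleSmoother:
    """Idea 1: one-pole lowpass on the delay-time control signal."""
    def __init__(self, time_ms, sample_rate):
        # state reaches ~63% of a step change after time_ms (assumed tuning)
        self.a = math.exp(-1.0 / (time_ms * 0.001 * sample_rate))
        self.state = 0.0

    def process(self, target):
        self.state = self.a * self.state + (1.0 - self.a) * target
        return self.state
```

The smoother turns a step in delay time into a gradual glide, so the read head never leaps, but for fast modulation it still sweeps backwards through the buffer; the slew limiter caps that backwards sweep at playback speed, which is exactly what produces the octave effect I mentioned.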
Is there a common approach to this problem? I tried modulating the delay time on a commercial synth, and the result was still a bit shaky…
Thanks for listening to my TedTalk