Since it seems @TheVinn is no longer active, I thought I’d try my luck posting here, since presumably many of you have a lot of experience with DSPFilters.
I’m having some issues with implementing high pass and low pass filters into my most recent plugin.
The first issue is that the high-pass filters from Dsp::SimpleFilter don’t seem to work consistently across different sample rates. I have checked and double-checked that every time the DAW’s sample rate changes, I call .setup() on the filter with the new sample rate as a parameter. Even so, the slope of the high-pass filter becomes considerably less steep when transitioning from, say, 48000 to 96000, letting far too much sound below the cutoff through at the higher sample rates (either that, or the cutoff frequency is being inexplicably lowered).
My second issue concerns changing the cutoff frequency of any of the main filter types (Butterworth, currently) in real time. At the moment I call .setup() every process block while the cutoff frequency is changing, but this introduces noise (not zipper noise exactly, but some other kind of distortion). Even if I interpolate the changes to the cutoff frequency variable (e.g. with the LinearSmoothedValue class) it still introduces noise. Is there a better way to smoothly change the cutoff frequency?
My last issue is regarding latency. I think these filters can introduce a few samples of delay. Is it possible to get an exact measure of how many samples the filter is delaying the audio by, so I can pass it to setLatencySamples() and notify the host?
Any help on these issues would be greatly appreciated.