Glitches with Logic Pro X automations


I have an issue with Logic Pro X (LPX) only.

I use valueTreeState parameters and automated them. Everything works great in the different DAWs I tried, except for LPX.

As soon as I start to automate one parameter, I get a lot of glitches and noise during playback, and everything goes back to normal at the end of the automation.

This issue appears only in LPX, not in Ableton Live or any other DAW.

Any guess? I’m a bit lost on it.


I’d suspect some kind of feedback loop, where the automation was itself triggering further parameter changes (or perhaps just clogging everything up with lots of messages).

Thanks for your insight, but I’m not sure I really understand.

Do you mean it would be a bug inside LPX or in my code? It only appears in LPX.

If it’s in my code, what could cause this kind of loop?

For this parameter, I only set up the attachment in the editor and set up the parameter state (addParameterListener, createAndAddParameter and so on) in the processor.
Then I have parameterChanged, where I get the value of the parameter like this:

const auto gain = *parameters_.getRawParameterValue(PARAMETERID::kMasterVolumeID);
processorChain_.get<ProcessorIndex::kGain>().setGainLinear(gain);
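As an aside, one common way to avoid clicks from abrupt parameter jumps is to smooth the value before applying it, instead of setting the raw target directly. Here is a JUCE-free sketch of a one-pole smoother (the class name and coefficient are my own, not from the thread); juce::dsp::Gain can do the equivalent via setRampDurationSeconds:

```cpp
#include <cassert>

// One-pole smoother: each call moves the current value a fixed fraction
// of the way toward the target, so gain changes ramp instead of jumping.
class OnePoleSmoother {
public:
    void setTarget(float t) { target_ = t; }

    float next() {
        current_ += coeff_ * (target_ - current_);
        return current_;
    }

private:
    float current_ = 0.0f;
    float target_ = 0.0f;
    float coeff_ = 0.05f;   // smaller = slower, smoother ramp
};
```

In parameterChanged you would call setTarget(gain), then call next() once per sample in the audio callback.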

I can’t see anything too suspicious in what you’ve told me. Is the rate of parameterChanged callbacks in Logic much higher than in other hosts?

Do you see the same behaviour in the much simpler APVTS example code?

I tried this example and it works without any glitches.

So I looked at my code and realized that the glitches disappear when I remove my processors.
But it’s kind of weird, because the glitches are still there even with a simple resampler.
And once again, in other DAWs like Ableton Live there is no issue with the automation, even with my processors enabled…

From what I understand, I cannot use my plugin in LPX with any DSP processors except very simple ones like Gain.

Anyone encountered this issue?

I can report that e.g. in my Frequalizer it works well with the IIR filters from the dsp module.

Is there a chance that you confused the normalised and unnormalised versions of the parameters, so that you get unexpectedly large jumps? Filters can create ugly noises if there are too-drastic changes in values like frequency or Q factor.
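To illustrate that mix-up: here is a hypothetical skewed mapping of my own (similar in spirit to what a NormalisableRange does), converting a normalised [0..1] parameter to Hz. Confusing the two domains means feeding a filter “0.5 Hz” instead of roughly 632 Hz, or 632 where 0..1 is expected — either way, a drastic, noisy jump:

```cpp
#include <cassert>
#include <cmath>

// Map a normalised [0..1] parameter to an unnormalised frequency in
// [20, 20000] Hz with a logarithmic curve, so equal knob travel feels
// like equal pitch travel.
inline double toFrequencyHz(double normalised) {
    const double lo = 20.0;
    const double hi = 20000.0;
    return lo * std::pow(hi / lo, normalised);
}
```

Note that getRawParameterValue already returns the unnormalised value; the mistake is usually pulling the 0..1 value (e.g. from AudioProcessorParameter::getValue) and treating it as Hz.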

Thanks for your insight @daniel
In fact I tried other combinations and found what really causes this issue.
It happens when I copy some data in the process callback (I know it’s not good to do it, but given the structure of my custom DSP I need to).
So I copy the buffer into another one, process it, then copy the result back.
That’s why it also happened when I resampled.
Here are the stages of my process:

if sampleRate != 48kHz
Resampling -> customDSP -> Resampling

if sampleRate == 48kHz
customDSP
In each step I copy data from one buffer to another.

So I guess I have to find another way…

Does anyone have experience with copying data in the DSP process?

Copying data in processBlock is no problem; in fact, it happens in many cases and is very quick. But you need to have the memory for it allocated beforehand. Estimate how many samples you will need in the worst case, and do the AudioBuffer::setSize() in your prepareToPlay.

For copying, use AudioBuffer::copyFrom or AudioBuffer::addFrom.
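As a JUCE-free sketch of that pattern (the struct and names are mine; real code would call AudioBuffer::setSize in prepareToPlay and AudioBuffer::copyFrom in processBlock):

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// Allocate once up front, reuse in the audio callback: prepare() does the
// only allocation, copyIn() just memcpys whatever the host delivered.
struct ScratchBuffer {
    std::vector<float> data;

    void prepare(int maxBlockSize) {                 // from prepareToPlay
        data.assign(static_cast<size_t>(maxBlockSize), 0.0f);
    }

    void copyIn(const float* src, int numSamples) {  // from processBlock
        assert(static_cast<size_t>(numSamples) <= data.size());
        std::memcpy(data.data(), src,
                    sizeof(float) * static_cast<size_t>(numSamples));
    }
};
```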

Hope that helps

Thanks @daniel
Regarding your message, I’m happy to see that I wasn’t totally wrong.
I did what you said; my memory is preallocated in prepareToPlay. But I still have the issue. For example, here is my resampler processor:

The prepare method:

void Resampler::prepare(const dsp::ProcessSpec &spec) noexcept {
  // FYI the lagrangeInterpolator is defined like this:
  // std::vector<std::shared_ptr<LagrangeInterpolator>> lagrangeInterpolator_;
  channelCount_ = spec.numChannels;
  for (auto channel = 0; channel < channelCount_; channel++) {

And the process method:

void Resampler::process(const Context& context) noexcept {
  if (context.isBypassed) { return; }

  const auto& inputBlock = context.getInputBlock();

  auto& outputBlock = context.getOutputBlock();
  const auto numOutSamples = static_cast<int>(outputBlock.getNumSamples());

  jassert(inputBlock.getNumChannels() == outputBlock.getNumChannels());

  for (int channel = 0; channel < channelCount_; ++channel) {
    auto inputData = inputBlock.getChannelPointer(channel);
    auto outputData = outputBlock.getChannelPointer(channel);



In LPX, when the automation is used, the size of the buffer changes randomly (but the value is still less than the maximum block size).
So I get some buffers filled with a lot of zeros at the end, and THAT’S what causes the glitches.

Thanks all for your help. Now I’m gonna cry a little, because fixing it will be a lot of work…

Always set buffer sizes in prepareToPlay() and use buffer.getNumSamples() in processBlock(), which can be the same size or less. Never use the samplesPerBlock from prepare() in processBlock()!
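A tiny JUCE-free sketch of that rule (the struct and names are my own): the callback trusts only the per-call sample count, never the maximum announced in prepare:

```cpp
#include <cassert>
#include <vector>

// Records the size of each callback; the only guarantee is
// numSamples <= maxBlockSize, and it can vary from call to call.
struct BlockLogger {
    int maxBlockSize = 0;
    std::vector<int> seen;

    void prepare(int maxSize) { maxBlockSize = maxSize; }

    void processBlock(int numSamples) {
        assert(numSamples <= maxBlockSize);  // never more than announced...
        seen.push_back(numSamples);          // ...but often less in Logic
    }
};
```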


I’ve often heard the suggestion, but this is the first time I actually see a host doing it:

When the automation parameter is changing, it makes sense to have smaller buffers, so the parameter is updated more often. If there is no change, it makes sense to send bigger buffers to save CPU.


@peter-samplicity Yes, the prepare method (where the size of the buffer is set) is only called in prepareToPlay.

The issue comes from the varying buffer size in LPX.
The explanation of @daniel is really good and I think it’s what LPX actually does.


For anyone who encounters this issue: the solution I used was to put a ring buffer before my processing, so I’m sure to always get a constant block size.
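A minimal sketch of that fix (my own names, not the actual plugin code): buffer incoming samples and only run the DSP once a full fixed-size block is ready. A real plugin would also report the added latency via AudioProcessor::setLatencySamples:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Accumulate whatever block sizes the host delivers; invoke the DSP
// callback exactly once per completed fixed-size block.
class FixedBlockQueue {
public:
    explicit FixedBlockQueue(std::size_t blockSize) : blockSize_(blockSize) {
        fifo_.reserve(blockSize);   // allocate up front, not in the callback
    }

    template <typename Fn>
    void push(const float* samples, std::size_t count, Fn&& dsp) {
        for (std::size_t i = 0; i < count; ++i) {
            fifo_.push_back(samples[i]);
            if (fifo_.size() == blockSize_) {
                dsp(fifo_.data(), blockSize_);
                fifo_.clear();      // keeps capacity, so no reallocation
            }
        }
    }

private:
    std::size_t blockSize_;
    std::vector<float> fifo_;
};
```

Pushing host blocks of 3, 5 and 4 samples through a 4-sample queue yields exactly three fixed-size DSP calls, regardless of how the host sliced them.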