Hi all,
I’m running into an issue with a plug-in I’m developing. The TL;DR is that the plug-in runs some volume detection on the incoming audio, and will enable an effect when the incoming volume reaches a certain threshold. The plug-in involves a looper/delay so the incoming audio will get copied to a buffer.
As part of testing, I checked the behavior of my plug-in when changing the buffer size in my DAW and in standalone mode. I found that it mostly responded fine, but occasionally there would be a burst of digital noise that would trigger the volume detection and get recorded. Worse yet, it seems that occasionally something nasty like a NaN gets recorded, which I believe will cause a DAW like Ableton to shut down output from your plug-in/channel if it isn’t caught.
I’m trying to wrap my head around how to safeguard against this. One idea I had was to set a flag in prepareToPlay which would disable any volume detection in the following processBlock, but I’ve read that some DAWs (like FL) have completely flexible buffer sizes, so I imagine prepareToPlay would be called all over the place and volume detection would ~never run.
The safest/simplest option I can think of is to do more or less nothing beyond making sure nothing nasty like a NaN gets recorded into my audio buffer. I imagine that a DAW with flexible buffer sizes won’t output any noise when the buffer size changes, as opposed to a fixed-buffer-size DAW (e.g. Ableton), whose behavior is generally unpredictable when you change the buffer size in the middle of a project.
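Concretely, something along these lines is what I have in mind (plain C++ rather than my actual JUCE calls, and sanitizeAndCopy is just an illustrative name): replace any non-finite samples with silence before they ever reach the loop buffer.

```cpp
#include <cmath>
#include <cstddef>

// Copy incoming audio into the loop buffer, replacing any NaN/Inf
// samples with silence so they can never get recorded or played back.
// (Illustrative helper, not a JUCE API.)
void sanitizeAndCopy(const float* src, float* dst, std::size_t numSamples)
{
    for (std::size_t i = 0; i < numSamples; ++i)
        dst[i] = std::isfinite(src[i]) ? src[i] : 0.0f;
}
```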
Do you clear that buffer in prepareToPlay? For example, the JUCE AudioBuffer does not zero its contents by default.
I do make sure to clear when necessary and especially on initialization. I basically do the following (hopefully you can verify whether or not this is completely stupid):
- I have a global flag “initialized” that I set once the first prepareToPlay is called. On that first call, I initialize my AudioBuffers to their proper sizes (i.e. the maximum size to expect at the current sample rate) and clear them.
- If prepareToPlay is called but the sample rate and channel config haven’t changed, I skip doing anything. I found that prepareToPlay can get called any number of times during a plug-in’s lifecycle, and in DAWs with flexible buffer sizes it was called unpredictably; clear()'ing my buffers on every call was killing stored audio.
- In processBlock, I check for the current size of the incoming audio buffer and adjust accordingly. For example, I update a samplesToCopy variable to match the number of samples in the incoming buffer.
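In code, the guard looks roughly like this (a plain C++ sketch rather than my actual JUCE code; LooperState and its members are illustrative names). Buffers are only (re)allocated and cleared when the config actually changes, so a spurious prepareToPlay call can’t wipe recorded audio:

```cpp
#include <cstddef>
#include <vector>

// Sketch of the "skip if nothing changed" guard described above.
struct LooperState
{
    double sampleRate = 0.0;
    int numChannels   = 0;
    std::vector<std::vector<float>> loopBuffer; // per-channel storage

    void prepareToPlay(double newSampleRate, int newNumChannels, double maxLoopSeconds)
    {
        if (newSampleRate == sampleRate && newNumChannels == numChannels)
            return; // config unchanged: keep stored audio intact

        sampleRate  = newSampleRate;
        numChannels = newNumChannels;

        // Allocate to the maximum expected size and zero everything.
        const auto maxSamples = static_cast<std::size_t>(maxLoopSeconds * sampleRate);
        loopBuffer.assign(static_cast<std::size_t>(numChannels),
                          std::vector<float>(maxSamples, 0.0f));
    }
};
```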
All DAWs can provide a varying sample count to processBlock. The prepareToPlay method is only called when the sample rate or block size is changed in the DAW settings, on initial loading, etc., and generally while not playing.
In processBlock you need to zero the input AudioBuffer’s channels from the number of samples given to the end of the buffer. And you only need to process the number of samples given.
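Something like this, as a plain C++ sketch (zeroTail is an illustrative name, not a JUCE function): the host only filled numSamples of the channel, so the remainder may contain stale memory and should be zeroed.

```cpp
#include <algorithm>
#include <cstddef>

// Zero the unused tail of a channel buffer: only the first numSamples
// were filled by the host; the rest may hold garbage.
void zeroTail(float* channel, std::size_t bufferCapacity, std::size_t numSamples)
{
    if (numSamples < bufferCapacity)
        std::fill(channel + numSamples, channel + bufferCapacity, 0.0f);
}
```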
In processBlock, I check for the current size of the incoming audio buffer and adjust accordingly.
That doesn’t sound real-time safe. Is it possible to put the buffer-resizing part in prepareToPlay()? AFAIK prepareToPlay should not be called very often.
That doesn’t sound real-time safe. Is it possible to put the buffer-resizing part in prepareToPlay()?
Yeah, this occurred to me after posting, so I just made that change.
It hadn’t caused any issues that I or any of my users had found/complained about, but better safe than sorry.
AFAIK prepareToPlay should not be called very often.
I guess my main question would be: what happens in a DAW where the buffer size isn’t consistent (e.g. FL)? In testing, I found that the buffer size in the iOS “DAW” AUM could be inconsistent, especially on startup, and prepareToPlay was called on each change. This makes me wary of adding any special logic for the block right after prepareToPlay, because I’m worried it’ll get called more or less every block.
In processBlock you need to zero the input AudioBuffer’s channels from the number of samples given to the end of the buffer. And you only need to process the number of samples given.
Could you explain this a little more, please? Are you saying that samplesPerBlock received in prepareToPlay can exceed the number of samples in the incoming buffer in processBlock?
I guess my main question would be: what happens in a DAW where the buffer size isn’t consistent (e.g. FL)?
The buffer size can be flexible, but you can assume that maximumBlockSize is relatively stable (the buffer size should always be <= maximumBlockSize).
because I’m worried it’ll get called more or less every block
AFAIK this will not happen; otherwise, lots of plugins wouldn’t work properly.
The block size in prepareToPlay is given so that you can allocate buffers etc. It’s not called that often.
The number of samples given in processBlock is the number of samples the host has decided to process. It won’t be larger than the block size from prepareToPlay.
Point is, in processBlock the sample count may be smaller than the total size of the given AudioBuffer, so you have to zero the difference. That region may be filled with rubbish, since the DAW manages the memory it’s mapped to.
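Putting the two points together, a rough plain-C++ sketch (illustrative names, not JUCE API): the internal buffer is sized once from prepareToPlay’s block size, and each processBlock call copies only the samples the host actually delivered, zeroing the unused remainder.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

struct Processor
{
    std::vector<float> scratch; // sized to the maximum block size

    void prepareToPlay(int maxBlockSize)
    {
        // Allocate once, zeroed, on the message thread.
        scratch.assign(static_cast<std::size_t>(maxBlockSize), 0.0f);
    }

    void processBlock(const float* audio, int numSamples)
    {
        const auto n = static_cast<std::size_t>(numSamples);
        assert(n <= scratch.size());                  // host promise: never exceeds max
        std::copy(audio, audio + n, scratch.begin()); // process only what was given
        std::fill(scratch.begin() + numSamples,
                  scratch.end(), 0.0f);               // zero the difference
    }
};
```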
Ahhh okay, thank you and thanks @Nitsuj70, that makes sense! I was trying to intuit the behavior from limited experience (that AUM case was the first time I encountered this) and it looks like my understanding was off.
In that case, it sounds like it might be relatively safe to abstain from doing any recording in the block following a prepareToPlay.