Clipping noise when running on iOS


#1

Hi,

I have been programming a tiny application on macOS for the past few weeks. It reads an AAC audio file stored in BinaryData.cpp and runs it through a few customised filters (namely IIRFilter and Reverb) to apply some effects.

The audio now sounds fine on macOS. However, when I run the app on iOS (either on a device or in the simulator), the audio sounds saturated.

At first I thought it might be a CPU usage problem, but when I profiled the app it never used more than 10% CPU.

On the other hand, if I apply a gain of 2 to the files I'm reading, the noise gets much worse, even with all filters disabled. So this strongly suggests a saturation problem.

Did I miss something obvious? I'm wondering whether my floating-point buffers are being converted to 16-bit integers somewhere along the way?
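
To illustrate what I suspect (this is just a made-up conversion function, not code from my app): a float sample outside ±1.0 would hard-clip the moment it gets converted to 16-bit.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Hypothetical float-to-int16 conversion, the kind an output pipeline
// might do internally. Anything outside [-1, 1] saturates.
int16_t floatToInt16 (float sample)
{
    const float clamped = std::clamp (sample, -1.0f, 1.0f);
    return static_cast<int16_t> (clamped * 32767.0f);
}

int main()
{
    std::printf ("%d\n", floatToInt16 (0.5f)); // 16383 -- fine
    std::printf ("%d\n", floatToInt16 (1.7f)); // 32767 -- hard-clipped
}
```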


#2

Are you sure it's a saturation problem? Try applying a strong negative gain (i.e. heavy attenuation) to see if the distortion goes away. If it doesn't, then it's more likely that iOS is giving you varying buffer sizes and you aren't filling the buffers completely (i.e. leaving a run of zeros at the end of every buffer). That also produces similar-sounding distortion.
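
Here's a rough sketch of what I mean, assuming a JUCE AudioSource (the Player class and its render() method are stand-ins for whatever your own processing looks like):

```cpp
#include <JuceHeader.h>

// Hypothetical player class, just to illustrate the point.
struct Player : public juce::AudioSource
{
    void prepareToPlay (int, double) override {}
    void releaseResources() override {}

    void getNextAudioBlock (const juce::AudioSourceChannelInfo& bufferToFill) override
    {
        // Wrong: assuming a fixed block size (e.g. 512) leaves stale or
        // zeroed samples at the tail of any larger callback:
        //     render (*bufferToFill.buffer, bufferToFill.startSample, 512);

        // Right: honour the size of this particular callback -- iOS
        // frequently delivers varying buffer sizes.
        render (*bufferToFill.buffer,
                bufferToFill.startSample,
                bufferToFill.numSamples);
    }

    // Stand-in for whatever processing you actually do.
    void render (juce::AudioBuffer<float>& buffer, int start, int numSamples)
    {
        buffer.clear (start, numSamples); // placeholder
    }
};
```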


#3

I think I solved the problem, but I’m not sure why I had to do this.

As you suggested, I applied an attenuation of 0.5 (roughly -6 dB) to the buffers I obtained from AudioFormatReader. There was far less clipping, but there were still issues apparently coming from my low-pass filter: it doesn't behave well with a cut-off frequency of exactly sampleRate * 0.5 (the Nyquist frequency). I now clamp it to always stay below sampleRate * 0.49, and everything is back in order.
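
For reference, this is roughly the clamping I ended up with (setLowPassCutoff is just my own little helper, not a JUCE function, and 0.49 is simply the margin that happened to work for me):

```cpp
#include <JuceHeader.h>

// Clamp the requested cut-off safely below Nyquist before building
// the low-pass coefficients.
void setLowPassCutoff (juce::IIRFilter& filter, double sampleRate, double requestedHz)
{
    const double safeHz = juce::jmin (requestedHz, sampleRate * 0.49);
    filter.setCoefficients (juce::IIRCoefficients::makeLowPass (sampleRate, safeHz));
}
```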

The low-pass filter issues also happened on macOS but, for some reason, less often. The clipping issue was entirely specific to iOS.

Could this be related to the fact that, in my tests, iOS uses a bit depth of 16 whereas macOS uses a bit depth of 24?


#4

I doubt it. It seems more likely to me that there is an issue with your filter, such as some uninitialised variables.
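
For example, make sure the filter's internal state is cleared whenever playback is (re)prepared, so it never runs off garbage values. Something along these lines (lowPass and cutoffHz are just placeholder members):

```cpp
#include <JuceHeader.h>

struct FilteredSource : public juce::AudioSource
{
    juce::IIRFilter lowPass;   // assumed member
    double cutoffHz = 1000.0;  // assumed member

    void prepareToPlay (int /*samplesPerBlockExpected*/, double sampleRate) override
    {
        lowPass.setCoefficients (juce::IIRCoefficients::makeLowPass (sampleRate, cutoffHz));
        lowPass.reset(); // zero the delay line / previous samples
    }

    void releaseResources() override {}
    void getNextAudioBlock (const juce::AudioSourceChannelInfo&) override {}
};
```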