Real-time audio using processBlock()

I am getting a lot of distortion using AudioProcessor's processBlock() function and was wondering if I could get some pointers.

Currently my processBlock() function is empty, so the AudioSampleBuffer parameter just "passes through" while I test the latency.  When timing the delta between calls to processBlock(), the resulting latency times are greater than the latency times reported by AudioDeviceSelectorComponent's buffer size dropdown.

I am writing a stand-alone application that processes audio in real time (or as close to real time as I can get).  I have derived a class from AudioProcessor, instantiated an AudioProcessorPlayer, and registered an instance of my AudioProcessor subclass using AudioProcessorPlayer's setProcessor().  I am able to successfully process audio data coming into the processBlock() function.

I am using a USB 2.0 audio interface set to a sample rate of 48 kHz and a buffer size of 64 samples.

Using the AudioDeviceSelectorComponent, I can set the application's buffer size no lower than 144 samples.  The dropdown entry for the 144-sample option displays a latency of 3 milliseconds.  At this setting the audio output is heavily distorted and rather bit-smashed.

Calculating the timing delta between calls to my empty processBlock() at the above setting yields a latency of 5.002 milliseconds, which is higher than what AudioDeviceSelectorComponent displays for the 144-sample setting.  All of my timings show that buffer sizes from 144 to 288 (inclusive) have a greater measured latency than what is displayed in the AudioDeviceSelectorComponent buffer size dropdown.
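For reference, the expected interval between processBlock() calls is just bufferSize / sampleRate, so a measured delta can be converted back into an effective block size.  This is the little helper I use for the conversion (the function name is my own, not anything from JUCE):

```cpp
#include <cassert>

// Expected time between audio callbacks, in milliseconds,
// for a given buffer size and sample rate.
double callbackPeriodMs(int bufferSize, double sampleRate)
{
    return 1000.0 * bufferSize / sampleRate;
}
```

At 48 kHz, 144 samples works out to exactly 3 ms, while my measured 5.002 ms is almost exactly the period of a 240-sample block (240 / 48000 = 5 ms), as if the device were actually delivering larger blocks than the size I requested.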

It isn't until I select a buffer setting of 320 or greater within AudioDeviceSelectorComponent that the audio output sounds clear.  At these settings (320 or higher), the timing delta between calls to processBlock() also yields a latency calculation that matches what AudioDeviceSelectorComponent reports.

Is it safe to assume the distortion and bit-smashing are occurring due to the timing discrepancies? If so, what could be causing the timing discrepancies at these lower buffer settings (144 - 288)?

Thank you for any insight you can provide.

You can't just leave the processBlock method empty - you need to at least fill the output channels with zero - read the comment for that method!
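Roughly what the comment is asking for, shown here as a plain-pointer sketch rather than the real JUCE API (in your actual processBlock you'd call AudioSampleBuffer::clear() on each extra channel instead):

```cpp
#include <cstring>

// Zero any output channels that have no corresponding input channel:
// they contain whatever garbage was last left in the buffer, and
// playing that back sounds like heavy distortion.
void clearExtraOutputChannels(float** channels, int numInputChannels,
                              int numOutputChannels, int numSamples)
{
    for (int ch = numInputChannels; ch < numOutputChannels; ++ch)
        std::memset(channels[ch], 0, sizeof(float) * (size_t) numSamples);
}
```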

Hi, jules.  Thank you for the quick response.  

I should note, my original code was processing the incoming audio stream and applying a low-pass filter that used Intel IPP functions.  That is when I noticed the distortion / bit-smashing in the output at the lower buffer settings.  I then removed my filter code to see whether the distortion was still present.

The comment for that method states that the user should clear (fill with zero) any output channels beyond the number of input channels.  In my processBlock(), AudioSampleBuffer::getNumChannels() returns 2, which matches the number of input and output channels I have open.  I would assume that in this case an empty processBlock() method simply passes the audio through as if no filter were enabled.

Do you know of any reason why the audio would be distorted at these lower buffer settings?

Thanks again for any help you can provide.


Do you know of any reason why the audio would be distorted at these lower buffer settings?

Is it just a cpu problem caused by using a buffer size that's too small for the hardware to keep up?