Which one should be used: Samples or milliseconds?

Hi all,

In my application I do some of the computations using samples, but in other places I use millisecond values. My question is: which unit should I standardise on, samples or milliseconds? Which one is better?

Thanks for your time!

Well, it is usual to specify things using time, as this is independent of the sample rate. The number of samples required for some processing depends on the current sample rate, so you either need to pass the sample rate to your function or keep an internal copy of it up to date.
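To make the conversion concrete, here is a minimal sketch of a milliseconds-to-samples helper (the function name is just illustrative, not from any particular library):

```cpp
#include <cassert>

// Convert a time in milliseconds to a length in samples at a given sample rate.
static int millisecondsToSamples (double milliseconds, double sampleRate)
{
    return static_cast<int> (milliseconds * sampleRate / 1000.0);
}

// e.g. millisecondsToSamples (1000.0, 44100.0) gives 44100 samples,
// while the same 1000 ms at 96 kHz gives 96000 samples.
```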

Which method you should use depends on your specific application.

Thanks for your reply!

So does that mean I can use time wherever the user interacts with the application, and samples for internal computation, i.e. wherever I am dealing with the track and its sample rate?

I want to standardise on a single unit to avoid confusion.

Please forgive me if I am making any mistakes.


Well, it is easier to use time if you can, certainly from a UI point of view. For example, if you are creating a delay class you would want to specify the delay time in ms, something like delay.setDelayTimeMs(1000). However, a delay time of 1000 ms is meaningless without a sample rate: it equates to a delay of 44100 samples at a sample rate of 44100, and a delay of 96000 samples at a sample rate of 96000.

You can't assume a specific sample rate, as this will change depending on the user's hardware. It is probably most straightforward to pass the sample rate when you set the time, something like delay.setDelayTime(getSampleRate(), 1000). This saves you from forgetting to update the sample rate when it changes (usually in a prepareToPlay method).

I think the most straightforward approach is to append the unit you are setting to the method name, e.g. setDelayTimeMs(…, …) or setDelaySamples(…). This means you shouldn't call the wrong method by accident.
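A minimal sketch of that convention might look like this (the class and member names are made up for illustration, not from a real library):

```cpp
// Minimal delay-line sketch illustrating unit-suffixed setters.
class Delay
{
public:
    // Set the delay length in samples directly.
    void setDelaySamples (int numSamples)
    {
        delayInSamples = numSamples;
    }

    // Set the delay as a time in ms. Taking the sample rate as a
    // parameter means the caller can never forget to supply it.
    void setDelayTimeMs (double sampleRate, double milliseconds)
    {
        delayInSamples = static_cast<int> (milliseconds * sampleRate / 1000.0);
    }

    int getDelaySamples() const { return delayInSamples; }

private:
    int delayInSamples = 0;
};
```

You would then call delay.setDelayTimeMs (getSampleRate(), 1000.0) from wherever the current sample rate is known, e.g. a prepareToPlay method.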

By the way I am assuming you are dealing with some audio processing here (please correct me if I am wrong). If you can give a more detailed description of your application we could probably tell you more.

Thanks for your reply.
Yes, my application is doing some audio processing.
I am now including the unit in the method names, i.e. samples or milliseconds.
I am trying to record over a previously loaded track, and I wanted to add a delay compensation value to the recording so that it will be in sync with the original track. But I get a different value in every test.
Can you help me find out the sample rate of the current device I am using?

Thanks for your time!

It's AudioIODevice::getCurrentSampleRate(), so you may need to get the current audio device from the manager first using AudioDeviceManager::getCurrentAudioDevice().

It is my understanding that when the sample rate changes, the audioDeviceAboutToStart(AudioIODevice *device) callback is called, so you can use the device passed to that method to get the sample rate about to be used and update your delay object if necessary. This is probably the most efficient method.
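The pattern looks roughly like this. Note these are simplified stand-ins for JUCE's AudioIODevice and AudioIODeviceCallback classes, written out here so the snippet is self-contained; in a real project you would override the actual JUCE callback instead:

```cpp
// Simplified stand-in for JUCE's AudioIODevice (illustration only).
struct AudioIODevice
{
    double sampleRate = 0.0;
    double getCurrentSampleRate() const { return sampleRate; }
};

// Sketch of a callback object that picks up sample-rate changes.
class MyAudioCallback
{
public:
    // Called whenever the device is (re)started, so this is the place
    // to read the new sample rate and update any delay objects.
    void audioDeviceAboutToStart (AudioIODevice* device)
    {
        currentSampleRate = device->getCurrentSampleRate();
        // e.g. delay.setDelayTimeMs (currentSampleRate, delayMs);
    }

    double currentSampleRate = 0.0;
};
```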