Cannot get the right sample rate in Final Cut Pro X


I’m developing a real-time audio plugin.
Because my process is optimized for 48 kHz, I need to use resamplers and to know the current sample rate.

Everything is working fine except with Final Cut Pro X (FCPX).

Let’s say I use a project at 44.1 kHz with an audio file having the same sample rate.
This doesn’t work. It seems that my resamplers don’t get the right sample rate. I tried different sample-rate values and noticed that Final Cut has an option called FrameRate, which is set to 24.
I tried a sample rate of “24000” and it magically seemed to work. The thing is, whenever I change the frame rate in FCPX, I always get fps24 from the following code in the processor:

auto* playhead = getPlayHead();
AudioPlayHead::CurrentPositionInfo result;
if (playhead != nullptr && playhead->getCurrentPosition(result))
    std::cout << "FrameRate: " << result.frameRate << std::endl;

To be honest I’m a bit lost because I don’t have any experience in video editing software and I don’t know how to get the right sample rate in this situation.

Am I missing something?


Frame rate is the number of video frames per second, so it should be unrelated to the audio sample rate. Most audio plug-ins don’t care about the video frame rate. What you want is the sampleRate argument passed to prepareToPlay.

Maybe FCPX adjusts the audio buffer size to some multiple or submultiple of (in this case) 1/24 second?

Sorry @DEADBEEF, I missed your message. I’m working at a different company now, so I don’t have FCPX on my current machine, but I can check at home.

First, @ujam is absolutely spot on: the frame rate refers to video frames. (I know, audio specifications sometimes use “frames” for data packets as well, but that’s not the case here.)

You get your sample rate as usual from the prepareToPlay call, or later via AudioProcessor::getSampleRate(). As in all video-editing software, it can be misleading: FCPX renders internally at the project’s sample rate and only resamples for the output device. So changing the hardware sample rate might have no effect (I don’t remember whether you set the project sample rate in the settings or only when exporting).

Also worth mentioning: the playhead starts at zero at the start of your clip, which is very helpful, but it can get in your way if you are not aware of it. If you left-trim your clip, the playhead will start with the corresponding offset.

Thanks for your help guys!

@ujam you may be right that it’s something with the audio buffer. I get the right sample rate, 48 kHz, but an audio buffer size (samplesPerBlock) of 1156! Is that normal? Usually I get something like a sub-/multiple of 512.

@daniel thanks for the details. The project sample rate is set both in the settings and at export time. Indeed, during export everything works fine. It’s only an issue during playback…

Is there a way to change the audio buffer size? I think the issue may come from this parameter.

You have to be ready to deal with pretty much anything: the buffer could be one sample long, a prime-number length, the length may change between processBlock calls, etc. If your audio processing depends on being fed buffers of some particular constant length, you will need to add some additional buffering yourself. Plugins cannot determine what buffer lengths are going to be used.

You should not rely on that parameter at all. It is a number that lets you estimate how many resources to allocate before processing starts, but you should always use buffer.getNumSamples() in each processBlock() call, since it can vary between 0 and the estimated buffer size received in prepareToPlay.

EDIT: …and what @Xenakios said… :wink:

Yes @Xenakios and @daniel, I totally agree with you; sorry, my fault, I misspoke. I rely on the samplesPerBlock value only to set the maximum size of the vector buffer I use in my own processor.
What I meant was a way to change the audio buffer size in FCPX, as you can in any DAW, so that I could run some tests and find where the bug in my code comes from. I may be doing something wrong with large audio buffers; I’d never seen this before, since I’d only tested with sub-/multiples of 512.

Sure, we all did that during prototyping, working on assumptions :wink:

I’ve found 1156 quite common with video files; I don’t know why. It is not a multiple of 48000 or 44100, I checked. I was caught out when moving to Windows, which often defaults to 480 instead of 512. I think I even saw 441 once…

When DAWs allow that, there are still two different things:

  • Changing the buffer size of the driver (to reduce latency) or
  • Changing the internal buffer size (e.g. to get more precise automations)

Your pipeline doesn’t necessarily run with the same buffer size as the audio driver.
I wouldn’t bet that changing it in the Audio MIDI Setup app changes anything…

TL;DR I am not aware of any way to enforce anything; my hunch is you can’t.

I found a solution.

Here is what I found if someone would have the same issue in FCPX.

In most DAWs (I did my tests on the biggest one on the market), samplesPerBlock is always a sub-/multiple of 512, and this value doesn’t change between processBlock calls.

The thing was, I used a custom buffer for my own processing. I allocated it with 512 samples and passed it to the process method. Everything worked great.

In FCPX, I got a block size of 1156, which, as you guys mentioned, is the maximum block size. But in processBlock the buffer has a variable number of samples: still less than 1156, but variable.
What happened was that my custom buffer was allocated with 1156 samples but only held 1156 minus something valid samples; the rest were zeros. So I processed the valid samples and the zeros…

Moral of the story: watch your number of samples in processBlock, kids!

Thank you guys @ujam @daniel and @Xenakios you really helped me

Yes, that is what I meant here, though I probably phrased it a bit complicatedly :wink: