In Final Cut, prepareToPlay always reports 44100 as the sample rate, no matter what I set in the project or in my system’s audio settings.
Also, I only get one call, so it may simply never have notified me of a new value.
Additionally, I now check with getSampleRate() on various occasions, but that value never changes either.
Is there a chance that the wrapper doesn’t propagate the value, or is it an Apple issue?
Thanks for your help!
Daniel
BTW, isNonRealtime is also never correct in Final Cut. It is always set to realtime, even for the instances that render the waveform in the background (I can see the buffer underruns from my BufferingAudioSource, which is why I run in non-realtime mode all the way). I hoped the fix @fabian did for Studio One would fix that here as well, but it didn’t. Maybe that goes back to the same cause…?
Wow, so in processBlock you can’t detect realtime vs. export… Sounds pretty bad. And if you can never know what the sample rate is… I can’t think of many useful things a plugin could do where you wouldn’t need those two parameters working and detectable, at least at the processBlock level.
I currently rely on prepareToPlay only for getting the correct sample rate, and I only detect the switch between realtime and export within processBlock. But I had to change strategy completely: I no longer have two modes of rendering. My processing engine expects that every processBlock can be either realtime or export, so it blocks for rendering to finish (or doesn’t) accordingly.
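Since the host never tells us, the per-block detection boils down to a wall-clock heuristic: if blocks are requested much faster than real time, we must be exporting. Here is a minimal standalone sketch of that idea — `BlockModeDetector` and its threshold are my own invention, not JUCE API, and the 0.25 factor is an assumed tolerance you would need to tune:

```cpp
#include <cassert>
#include <chrono>

// Hypothetical helper: guesses, per block, whether the host is rendering
// faster than real time (export) or roughly in real time (playback).
class BlockModeDetector
{
public:
    explicit BlockModeDetector (double sampleRate) : sr (sampleRate) {}

    // Call once per processBlock with the block length in samples.
    // Returns true if the recent call rate looks like an offline render.
    bool blockLooksOffline (int numSamples)
    {
        using clock = std::chrono::steady_clock;
        const auto now = clock::now();

        bool offline = false;
        if (havePrev)
        {
            const double elapsed = std::chrono::duration<double> (now - prev).count();
            const double blockDuration = numSamples / sr;

            // If calls arrive far quicker than the audio they ask for,
            // the host cannot be playing back in real time.
            offline = elapsed < 0.25 * blockDuration;
        }

        prev = now;
        havePrev = true;
        return offline;
    }

private:
    double sr;
    std::chrono::steady_clock::time_point prev {};
    bool havePrev = false;
};
```

In a real plugin you would call this from processBlock and switch your engine between blocking and non-blocking behaviour based on the result, smoothing over a few blocks to avoid flapping.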
Ohhh FCPX… I wasn’t aware of these issues with FCPX yet. Audacity has (or had) the same issue: it always reports a sample rate of 44100 Hz. Sorry Daniel, I can’t help any further.
Hmmm, I just tested this but I can’t seem to reproduce the issue. I’m getting a prepareToPlay call with the correct sample rate. However, FCPX does not seem to be sending any offline-render notifications.
Thanks for looking into that.
I think I understand now. It doesn’t report the sample rate of the host, but rather the sample rate of the clip the effect is applied to. That makes sense, as the result is resampled afterwards to match the project.
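For anyone puzzled by the consequence: the plugin processes at the clip’s rate, and the host converts the result to the project rate afterwards. A naive linear-interpolation resampler is enough to illustrate that data flow (illustration only, nothing FCPX-specific, and a real host would use a proper polyphase converter):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Convert a buffer rendered at inRate (the clip's rate) to outRate
// (the project's rate) using plain linear interpolation.
std::vector<float> resampleLinear (const std::vector<float>& in,
                                   double inRate, double outRate)
{
    if (in.empty() || inRate <= 0.0 || outRate <= 0.0)
        return {};

    const double ratio = inRate / outRate;                  // input samples per output sample
    const auto outLen = static_cast<std::size_t> (std::floor ((in.size() - 1) / ratio)) + 1;

    std::vector<float> out (outLen);
    for (std::size_t i = 0; i < outLen; ++i)
    {
        const double pos  = i * ratio;                      // fractional read position
        const auto   idx  = static_cast<std::size_t> (pos);
        const double frac = pos - idx;
        const float  next = (idx + 1 < in.size()) ? in[idx + 1] : in[idx];
        out[i] = static_cast<float> (in[idx] * (1.0 - frac) + next * frac);
    }
    return out;
}
```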
Now I only need to figure out why changes to my state are not propagated to the background rendering tasks… (or is that another PEBCAK here?)
I see lots of saveState calls before loadState, and the results are not the expected ones… this probably needs more investigation…
So this also triggers a call to AudioProcessor::getStateInformation (MemoryBlock& destData). Other hosts like Logic will read everything when the user triggers a save.
However, Final Cut doesn’t save on the user’s command, but constantly in the background. It also has to synchronise the state across several parallel background threads/instances, therefore it only updates the parts that are marked as dirty.
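That behaviour matches a classic dirty-flag sync pattern. A conceptual standalone sketch of what such a host seems to be doing (my own illustration, the names are invented and this is not Apple API): parts that are never marked dirty are simply never re-saved, which would explain why a state change that isn’t flagged never reaches the background instances.

```cpp
#include <cassert>
#include <map>
#include <string>

// Illustrative sketch of incremental state sync with dirty flags:
// the host persists only the parts a writer has marked dirty, so a
// property that is never invalidated is never re-read from the plugin.
struct IncrementalState
{
    std::map<std::string, std::string> parts;   // property -> serialized value
    std::map<std::string, bool> dirty;          // property -> needs re-save?

    void set (const std::string& key, const std::string& value)
    {
        parts[key] = value;
        dirty[key] = true;                      // writers must flag what they changed
    }

    // What the background save would actually persist this round.
    std::map<std::string, std::string> collectDirty()
    {
        std::map<std::string, std::string> out;
        for (auto& [key, isDirty] : dirty)
            if (isDirty)
            {
                out[key] = parts[key];
                isDirty = false;                // clean again after saving
            }
        return out;
    }
};
```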
So I assume I have to mark the effectState property as dirty as well, but there is a long enum of properties I could send as invalid. Does anybody (especially @fabian) have an idea which one that could be?
Or maybe there is another way to trigger an update of the effectState?