I need to check whether my plugin is being rendered offline for export. The check happens at the sampler-voice level, where I don't think I should hold a reference to the PluginProcessor.
Of course I can just set a field on a custom static state object (via SharedResourcePointer) inside PluginProcessor::prepareToPlay() and read it from my sampler's voices, but I wonder: why isn't this a static member of AudioProcessor? Or, put another way: is there a situation where one instance of PluginProcessorA is set to non-realtime mode while another instance of PluginProcessorA isn't?
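For reference, here's a minimal sketch of that workaround, assuming a JUCE plug-in (MySharedState and the member names are just illustrative):

```cpp
#include <atomic>

// Every plug-in instance in this process that declares a
// juce::SharedResourcePointer<MySharedState> ends up pointing at the same object.
struct MySharedState
{
    std::atomic<bool> nonRealtime { false };
};

// In the processor class:
//     juce::SharedResourcePointer<MySharedState> sharedState;
//
// In PluginProcessor::prepareToPlay():
//     sharedState->nonRealtime = isNonRealtime();
//
// In the sampler voice (which holds its own SharedResourcePointer<MySharedState>):
//     const bool offline = sharedState->nonRealtime.load();
```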
I guess JUCErs would be more knowledgeable, but from my experience:
At least with Obj-C I know that once the plug-in's binary (dylib) is loaded, it stays in memory until the host exits (according to Apple's documentation there is no way to unload a dylib that contains Obj-C, so Cocoa code stays resident until the app shuts down).
Anything static is initialized once and pretty much stays there until you restart your DAW.
So a plug-in could have both non-realtime and realtime instances, but the static would be shared by both, living in the loaded dylib's memory and holding whatever value it was last set to.
I don't know if any host actually works this way, but hypothetically a host could run the processors in its monitoring/cue paths as realtime, while pre-processing as non-realtime those tracks that play material from disk with automation set to read.
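To make the static-sharing point concrete, here's a toy sketch in plain C++ (no JUCE) of what happens when two instances share one static flag inside the same loaded module:

```cpp
#include <atomic>
#include <cstdio>

// The static lives once in the loaded dylib/DLL, so every instance of the
// plug-in class reads and writes the very same flag.
struct FakeProcessor
{
    static std::atomic<bool> nonRealtime;               // one copy for ALL instances
    void setNonRealtime (bool b) { nonRealtime = b; }   // last writer wins
};

std::atomic<bool> FakeProcessor::nonRealtime { false };

int main()
{
    FakeProcessor bounceInstance, monitorInstance;
    bounceInstance.setNonRealtime (true);    // offline-render instance sets the flag...
    monitorInstance.setNonRealtime (false);  // ...and the realtime instance immediately stomps it

    std::printf ("shared flag = %d\n", (int) FakeProcessor::nonRealtime.load());
}
```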
It's a property that's specific to each plugin instance. Many hosts will do things like having a background thread render one session while another one is playing, so there will definitely be times when one of your plugin instances is realtime and another isn't.
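So the usual pattern is to keep the flag per instance and hand each voice a pointer to the flag of the processor that owns it. A rough sketch in plain C++ (the names mirror the JUCE calls but these aren't the real JUCE classes; in an actual plug-in the flag would be set from isNonRealtime() inside prepareToPlay()):

```cpp
#include <atomic>
#include <cstdio>
#include <vector>

// Each processor instance owns its own flag; each voice keeps a pointer to the
// flag of the instance that created it, so two instances can disagree safely.
struct Voice
{
    const std::atomic<bool>* nonRealtime = nullptr;

    void render()
    {
        const bool offline = nonRealtime != nullptr
                              && nonRealtime->load (std::memory_order_relaxed);
        std::printf ("rendering voice, offline = %d\n", (int) offline);
    }
};

struct Processor
{
    std::atomic<bool> nonRealtime { false };   // one flag per instance
    std::vector<Voice> voices;

    void prepareToPlay (bool hostSaysNonRealtime)   // would read isNonRealtime() in JUCE
    {
        nonRealtime = hostSaysNonRealtime;
        voices.resize (4);
        for (auto& v : voices)
            v.nonRealtime = &nonRealtime;           // point each voice at *this* instance's flag
    }

    void process()
    {
        for (auto& v : voices)
            v.render();
    }
};

int main()
{
    Processor liveInstance, bounceInstance;   // two instances in the same process
    liveInstance.prepareToPlay (false);       // realtime playback/monitoring
    bounceInstance.prepareToPlay (true);      // offline render running concurrently
    liveInstance.process();
    bounceInstance.process();
}
```

In a real plug-in you'd set the member from isNonRealtime() at the top of prepareToPlay() and hand the pointer to each voice when you create it, so the voices never need a reference to the whole processor.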