I have a newbie question about using background or feeder threads in a Juce Audio Plugin Framework-based plugin.
I currently have some virtual instrument-like code in a standalone application. The implementation includes threaded streaming of audio data from disk and some other threaded computation. I would like to port the implementation to work as an audio plugin. Juce and the JAP Framework look amazing, so I am looking at it for this purpose.
I am not a programming newbie, but I am new to audio plugin programming. I’ve looked at Apple’s developer documentation to get acquainted with Audio Units, and I’ve looked at Juce and the JAP Framework and think I understand how to use them.
However, it’s hard for me to tell whether it’s appropriate for an audio plugin to manage its own threads (for disk streaming and/or computation), and if it is, where and when they should be started and stopped (AudioFilterBase::prepareToPlay and AudioFilterBase::releaseResources look like likely candidates).
Can any Juce and JAP Framework experts provide guidance on this? It would be greatly appreciated!