Assuming your code is using the JUCE AudioAppComponent, the setAudioChannels() call starts playback. Do you have that call in the constructor before you have created your player objects?
It appears that something in the boilerplate generated by Projucer causes prepareToPlay() to run. I placed debug messages before and after the boilerplate. The one before runs. Then prepareToPlay() runs. So, yes, the constructor is running before prepareToPlay(), as expected.
It appears that I have to initialize my audio components before this boilerplate. It’d be interesting to understand which part of it causes this issue. Here it is:
// Some platforms require permissions to open input channels so request that here
if (juce::RuntimePermissions::isRequired (juce::RuntimePermissions::recordAudio)
    && ! juce::RuntimePermissions::isGranted (juce::RuntimePermissions::recordAudio))
{
    juce::RuntimePermissions::request (juce::RuntimePermissions::recordAudio,
                                       [&] (bool granted) { setAudioChannels (granted ? 2 : 0, 2); });
}
else
{
    setAudioChannels (0, 2);
}
I’ll continue to play with this and see if moving everything above that code makes the application work.
I am not familiar enough with JUCE yet to understand just how much of this boilerplate I actually need.
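For what it's worth, the likely explanation is that setAudioChannels() opens and starts the audio device, and once the device is running the device manager calls prepareToPlay() on the component. A minimal sketch of a constructor that sets up its playback members before that boilerplate runs might look like this (the AudioTransportSource / AudioFormatManager members are purely illustrative, not your actual player objects):

#include <JuceHeader.h>

// Sketch only: the point is that every member the audio callback touches is
// fully constructed before setAudioChannels() starts the device.
class MainComponent : public juce::AudioAppComponent
{
public:
    MainComponent()
    {
        // 1) Set up playback objects first (illustrative members, not real project code).
        formatManager.registerBasicFormats();
        transportSource.setGain (0.8f);

        // 2) Only then run the Projucer boilerplate that starts the audio device,
        //    which in turn triggers prepareToPlay().
        if (juce::RuntimePermissions::isRequired (juce::RuntimePermissions::recordAudio)
             && ! juce::RuntimePermissions::isGranted (juce::RuntimePermissions::recordAudio))
        {
            juce::RuntimePermissions::request (juce::RuntimePermissions::recordAudio,
                                               [&] (bool granted) { setAudioChannels (granted ? 2 : 0, 2); });
        }
        else
        {
            setAudioChannels (0, 2);
        }
    }

    ~MainComponent() override { shutdownAudio(); }

    void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
    {
        transportSource.prepareToPlay (samplesPerBlockExpected, sampleRate);
    }

    void getNextAudioBlock (const juce::AudioSourceChannelInfo& bufferToFill) override
    {
        transportSource.getNextAudioBlock (bufferToFill);
    }

    void releaseResources() override
    {
        transportSource.releaseResources();
    }

private:
    juce::AudioFormatManager formatManager;
    juce::AudioTransportSource transportSource;
};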
prepareToPlay() can even run again in the middle of an instance's lifetime, for example if the sample rate changes, so you have to design your GUI to react to those changes.
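A sketch of what that means in practice, assuming a hypothetical rate-dependent filter (juce_dsp module) and a cached sample-rate member, neither of which is in the original code:

// Assumed members (hypothetical):
//   juce::dsp::IIR::Filter<float> lowPass;
//   double currentSampleRate = 44100.0;

void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
{
    // This can run again later (device change, new sample rate), so all
    // rate-dependent state is rebuilt here rather than in the constructor.
    currentSampleRate = sampleRate;

    lowPass.coefficients = juce::dsp::IIR::Coefficients<float>::makeLowPass (sampleRate, 1000.0f);

    juce::dsp::ProcessSpec spec { sampleRate, (juce::uint32) samplesPerBlockExpected, 2 };
    lowPass.prepare (spec);
}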
Yes, I’ve been moving code around trying to understand. I think I am starting to get the idea. That bit of Projucer code that somehow triggers prepareToPlay() confused me. I am just accepting it for what it is and doing all my initialization before it.
The general point is that in a DAW host, your MainComponent may never be created. If the user loads a project containing your plugin, but the user never clicks to open the UI, the host will probably just not call createEditor().
So you need to write your plugin in such a way that it runs the same without an editor as it does with one.
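A rough sketch of that separation, with all state owned by the processor via an AudioProcessorValueTreeState so that processBlock() behaves identically whether or not an editor exists (class and parameter names here are illustrative, not a real project):

#include <JuceHeader.h>

class SketchProcessor : public juce::AudioProcessor
{
public:
    SketchProcessor()
        : AudioProcessor (BusesProperties().withOutput ("Out", juce::AudioChannelSet::stereo())),
          parameters (*this, nullptr, "PARAMS",
                      { std::make_unique<juce::AudioParameterFloat> ("gain", "Gain", 0.0f, 1.0f, 0.5f) })
    {}

    void prepareToPlay (double, int) override {}
    void releaseResources() override {}

    void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&) override
    {
        // Reads the parameter directly -- no reference to any editor/GUI object.
        buffer.applyGain (parameters.getRawParameterValue ("gain")->load());
    }

    // The host may never call this; processing must not depend on it.
    juce::AudioProcessorEditor* createEditor() override { return new juce::GenericAudioProcessorEditor (*this); }
    bool hasEditor() const override                     { return true; }

    const juce::String getName() const override         { return "Sketch"; }
    bool acceptsMidi() const override                   { return false; }
    bool producesMidi() const override                  { return false; }
    double getTailLengthSeconds() const override        { return 0.0; }
    int getNumPrograms() override                       { return 1; }
    int getCurrentProgram() override                    { return 0; }
    void setCurrentProgram (int) override               {}
    const juce::String getProgramName (int) override    { return {}; }
    void changeProgramName (int, const juce::String&) override {}

    // State save/restore also lives in the processor, so projects reload
    // correctly even if the editor is never opened.
    void getStateInformation (juce::MemoryBlock& dest) override
    {
        if (auto state = parameters.copyState().createXml())
            copyXmlToBinary (*state, dest);
    }
    void setStateInformation (const void* data, int size) override
    {
        if (auto xml = getXmlFromBinary (data, size))
            parameters.replaceState (juce::ValueTree::fromXml (*xml));
    }

private:
    juce::AudioProcessorValueTreeState parameters;
};

The GenericAudioProcessorEditor is just a convenience here; the important part is that processBlock() and the state callbacks never touch anything the editor owns.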