I am relatively new to JUCE and have been spending time with the API, writing sample code and experimenting, which has been going well. One thing I'd like to do as a personal project to explore the API further is to write a standalone backing track player (not a plug-in). Essentially this would be a multi-track audio player that could play up to 32 tracks of audio on individual channels. Additionally, I'd like to send MIDI Time Code (not MIDI clock) so that the audio can be synced to external sources such as lighting desks.
All is fine so far, and I have even written some code and gotten audio out. However, you almost immediately hit a few snags with this type of thing, and I have some questions related to sync.
As near as I can tell so far, it is relatively easy to wire up some AudioSources and blast out the audio. However, these sources are out of sync, as they know nothing about each other or their positions in time relative to each other. Does the API support syncing multiple audio tracks together, and if so, what should I be looking at? I have not been able to find anything in the API that would appear to do this.
The next question is around MIDI Time Code. I have read the MTC spec, and it's fairly straightforward in regard to the messages that need to be sent, but once again we are back to timing.
The audio thread is locked to the sample rate of the audio card (e.g., 44100 Hz). If I've done my math right, that works out to 44100 / 30 fps / 4 quarter frames per frame = 367.5 samples. Which means every 367.5 samples (how do you deal with a half sample? that's a separate question) I need to be sending a MIDI quarter-frame message for sync.
OK fine, but what if the user is using a large buffer size such as 8192 samples (which is very likely with an app like this, for audio stability)? Every getNextAudioBlock call is requesting 8192 samples, which means I can't get the 367.5-sample granularity. In other words, by the time I get the next request from the audio card, 22.29 MIDI quarter frames (8192 / 367.5) have passed! How can I reconcile these two clock rates to get a constant stream of MIDI messages?
Sorry for the long post, but I have been trying to think this through, and either I don't understand a key principle or I'm missing something. Looking for some feedback.