On another thread I asked about MidiOutput::sendBlockOfMessages and the value to use for millisecondCounterToStartAt (Question: MidiOutput::sendBlockOfMessages and time value). But this thread has confused me even more.
But it does affect the timing of the outgoing MIDI messages. In that sense it's relevant to the question of whether one should set millisecondCounterToStartAt using Time::getMillisecondCounter() or Time::getMillisecondCounterHiRes().
It seems that on Windows things are quite complicated, but on both Android and Mac the values for Time::getMillisecondCounter() and Time::getMillisecondCounterHiRes() are obtained internally in similar ways (via mach_absolute_time() on Mac, and clock_gettime (CLOCK_MONOTONIC, &t) on Android). So, in terms of precision, it shouldn't really make a difference which one is used, at least on those platforms. Am I right?
The problem I see, though, is that Time::getMillisecondCounter() returns a 32-bit value and will wrap back to 0 after 2^32 milliseconds of uptime, which is about 49 days. Since MidiOutput internally uses this low-resolution timer, shouldn't we set millisecondCounterToStartAt from the same timer to avoid a possible discrepancy? 49 days of uptime doesn't seem very improbable!
The documentation of MidiOutput::sendBlockOfMessages seems to suggest using the low-res timer. But then again, there is an example within the JUCE code itself where the hi-res timer is used, in juce_AudioProcessorPlayer, line 332:
midiOutput->sendBlockOfMessages (incomingMidi, Time::getMillisecondCounterHiRes(), sampleRate);
It would be great if someone could clear up these questions!
