True I/O latency reported incorrectly on OSX

In the CoreAudio code, you're (just as was the case for ASIO) using some obscure empirical formulas to compute the true I/O latencies, and they get reported incorrectly.
Here's a reference on how to do it correctly, apparently: http://lists.apple.com/archives/coreaudio-api/2010/Jan/msg00046.html . I haven't tested it, but I certainly will.
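If I read that thread right, the fix is basically to add up the latency components the OS reports (device latency, safety offset, stream latency) plus the buffer size, instead of using an empirical formula. A rough, untested sketch of what I mean (the function name and parameters are only for illustration, not proposed code):

#include <vector>
#include <CoreAudio/CoreAudio.h>

// Untested sketch: sum the latency components for an output device.
// 'bufferSizeSamples' is whatever block size the device was opened with.
static UInt32 getTrueOutputLatencySamples (AudioDeviceID deviceID, UInt32 bufferSizeSamples)
{
    auto getUInt32 = [] (AudioObjectID objectID,
                         AudioObjectPropertySelector selector,
                         AudioObjectPropertyScope scope) -> UInt32
    {
        AudioObjectPropertyAddress addr = { selector, scope, kAudioObjectPropertyElementMaster };
        UInt32 value = 0, size = sizeof (value);
        return AudioObjectGetPropertyData (objectID, &addr, 0, nullptr, &size, &value) == noErr ? value : 0;
    };

    UInt32 total = bufferSizeSamples
                 + getUInt32 (deviceID, kAudioDevicePropertyLatency,      kAudioObjectPropertyScopeOutput)
                 + getUInt32 (deviceID, kAudioDevicePropertySafetyOffset, kAudioObjectPropertyScopeOutput);

    // The per-stream latency has to be read from the device's first output stream.
    AudioObjectPropertyAddress streamsAddr = { kAudioDevicePropertyStreams,
                                               kAudioObjectPropertyScopeOutput,
                                               kAudioObjectPropertyElementMaster };
    UInt32 size = 0;
    if (AudioObjectGetPropertyDataSize (deviceID, &streamsAddr, 0, nullptr, &size) == noErr
         && size >= sizeof (AudioStreamID))
    {
        std::vector<AudioStreamID> streams (size / sizeof (AudioStreamID));
        if (AudioObjectGetPropertyData (deviceID, &streamsAddr, 0, nullptr, &size, streams.data()) == noErr)
            total += getUInt32 (streams[0], kAudioStreamPropertyLatency, kAudioObjectPropertyScopeGlobal);
    }

    return total;
}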

Furthermore, in the CoreMIDI code, I'd certainly use p->timeStamp = 0 instead of p->timeStamp = AudioGetCurrentHostTime() as the timestamp for outgoing MIDI packets, because what you want to achieve is to send the MIDI out ASAP, and that's exactly what timestamp = 0 stands for, according to the Apple docs.
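In other words, something along these lines (just a sketch; the port/endpoint and buffer handling stand in for whatever the real code uses):

#include <CoreMIDI/CoreMIDI.h>

// Sketch: send a MIDI message "as soon as possible" by using a zero timestamp.
// 'port' and 'endpoint' stand in for whatever MIDIPortRef / MIDIEndpointRef the real code holds.
static void sendMidiNow (MIDIPortRef port, MIDIEndpointRef endpoint, const Byte* data, ByteCount numBytes)
{
    Byte buffer[256];
    MIDIPacketList* packetList = (MIDIPacketList*) buffer;
    MIDIPacket* packet = MIDIPacketListInit (packetList);

    // timeStamp = 0 means "now" per the CoreMIDI docs, so the driver doesn't try to schedule the packet.
    packet = MIDIPacketListAdd (packetList, sizeof (buffer), packet, 0 /* timeStamp */, numBytes, data);

    if (packet != nullptr)
        MIDISend (port, endpoint, packetList);
}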

Ok, thanks, I'll take a look at the latency stuff when I get a chance.

No… If you just send each event “as soon as possible”, then if the device hardware works in terms of blocks, a group of events which should be slightly separated in time may end up being clumped together and sent out at the start of the next available block. Adding a timestamp gives the device enough timing info that it can hopefully avoid that problem, and minimise jitter.

I don't agree. It would be true under different circumstances, but because this function is called by MidiOutput::sendMessageNow(), it should indeed output ASAP. It's not like it is meant to be called on the audio thread either (plus, in that case you wouldn't just timestamp with the current time, but with some offset based on the sample position in the block)… It's called via the MidiOutput::run() thread.

And while we're talking about timestamping: since OSX does offer the possibility of timestamping the MIDI packets to be sent (and I've found it very inaccurate to try to send the MIDI out at "the right time" oneself via a thread, as JUCE does), why not use that approach? With JUCE, I get strong jitter on the MIDI output, whereas Logic Pro 9 seems to send MIDI out with accuracy that is absolutely brilliant. I suppose Logic uses the CoreAudio/CoreMIDI capabilities as much as possible, since it's an Apple product and the OS offers everything you need.

I've just added some code to JUCE so I can output directly to CoreMIDI. Here's how it works:

At the start of my audio callback, I initialise these two variables:

torigin = AudioGetCurrentHostTime();              // ticks
mul = AudioGetHostClockFrequency() / samplerate;  // ticks per sample

For each MIDI event to be sent out, the MIDI packet passed to CoreMIDI gets a timestamp of torigin + (the event's sample position within the block) * mul.

I'm passing the data over to CoreMIDI in the audio callback itself, which seems to work without any problems so far. It seems to be more accurate than using a Thread + MidiOutput::sendMessageNow(). Of course, this only works on OSX.
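To make it concrete, the whole thing boils down to something like this (a sketch only; the port/endpoint and the way events are collected are placeholders, not my actual code):

#include <CoreAudio/HostTime.h>
#include <CoreMIDI/CoreMIDI.h>

// Sketch of the approach: called from the audio callback with the MIDI events
// belonging to the current block, each one carrying its sample offset.
struct PendingMidiEvent { int sampleOffsetInBlock; Byte data[3]; ByteCount numBytes; };

static void sendBlockMidi (MIDIPortRef port, MIDIEndpointRef endpoint,
                           const PendingMidiEvent* events, int numEvents, double sampleRate)
{
    const MIDITimeStamp torigin = AudioGetCurrentHostTime();         // ticks at block start
    const double mul = AudioGetHostClockFrequency() / sampleRate;    // ticks per sample

    Byte buffer[1024];
    MIDIPacketList* packetList = (MIDIPacketList*) buffer;
    MIDIPacket* packet = MIDIPacketListInit (packetList);

    for (int i = 0; i < numEvents; ++i)
    {
        // Timestamp = block start time + (sample position within the block) * ticks-per-sample,
        // so CoreMIDI can deliver each event at its intended time rather than clumping them.
        const MIDITimeStamp when = torigin + (MIDITimeStamp) (events[i].sampleOffsetInBlock * mul);
        packet = MIDIPacketListAdd (packetList, sizeof (buffer), packet, when,
                                    events[i].numBytes, events[i].data);
        if (packet == nullptr)
            break;  // packet list full
    }

    MIDISend (port, endpoint, packetList);
}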