Getting the "Age" of a midiMessage

Hello everyone,

I’m trying to figure out how I can calculate how long it’s been since I received a MidiMessage. Since the docs say the timestamp is application-specific, I’ve tried a couple of things. On a Mac, the timestamp unit seems to be seconds; on WinXP it seems to be milliseconds. Calculating a simple interval (on XP) like Time::getMillisecondCounterHiRes() - midiTimeStamp seems to give way too large results…

Can anyone tell me what timers to use in this calculation?

Also: I propose something like MidiMessage::getAgeInMilliSeconds(), if that wouldn’t create too much overhead…

It’s application-specific because messages are obviously used for all sorts of purposes. The timestamp isn’t necessarily a real-time stamp; it could be the position in a MIDI file, etc.

But if you’re getting the message from a MidiInput, then have a look at the help for MidiInputCallback::handleIncomingMidiMessage for details on how the input creates the timestamps.

hmm yes, that’s very clear, thank you!
I was forgetting that a MidiMessage doesn’t have to originate from a MIDI input…

okay, so I can calculate how “old” a message is by doing:

double delta = (Time::getMillisecondCounter() * 0.001) - timeStamp;

This works very well, but sometimes I get ridiculously large delta times (on the order of a few hundred seconds). Could this be caused by the MIDI input thread running on a different core than the thread I’m calling getMillisecondCounter from?

At the moment I’m running this on an i7 CPU (8 cores) with Windows XP.

Interesting… different cores could certainly skew it a bit, but if the difference is hundreds of seconds, it sounds more like an overflow or something.

I doubt it’s due to an overflow; I get this behaviour with consecutive MIDI messages… I’m syncing audio to an external MIDI clock. The time between the received MIDI start and the actual first audio callback tells me how many samples I have to offset the audio in order to stay in sync. Normally these offsets would be in the < 10 ms range, depending on buffer size and sample rate of course… This works most of the time, but once it starts returning these large numbers it keeps doing so. I’ll test it on a Mac tomorrow; I doubt I’ll see any strange behaviour there.

Weird. Can’t think of any suggestions, but keep me posted if you find out any more clues.