MIDI sequencing: relative/absolute timing thoughts

Hello again.

I’m thinking aloud so you can correct my way of thinking.

This week’s thingy at the back of my head revolves around MIDI sequencing.

I’m still building a basic little standalone drum machine, but after looking through some of the MIDI-related classes, I thought I should set up a little MidiMessageSequence for each channel. This way, if I wish, I can make it drive something else, like a VSTi. I have managed to set up a basic Synthesiser and stuff the MidiBuffer it gets in renderNextBlock() with events from a MidiMessageSequence. The math wasn’t that hard to figure out (go mum!).
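In case it helps to see what I mean by “stuffing”, here is roughly the shape of it (a minimal sketch, assuming the sequence timestamps are in seconds; the helper name fillBufferFromSequence is just made up for the example):

```cpp
#include <juce_audio_basics/juce_audio_basics.h>

// Hypothetical helper: copy the events of a MidiMessageSequence (timestamps in
// seconds) that fall inside one audio block into a MidiBuffer, converting each
// timestamp to a sample offset from the start of the block.
static void fillBufferFromSequence (const juce::MidiMessageSequence& sequence,
                                    juce::MidiBuffer& midiOut,
                                    double blockStartSeconds,
                                    int numSamples,
                                    double sampleRate)
{
    const double blockEndSeconds = blockStartSeconds + numSamples / sampleRate;

    for (int i = 0; i < sequence.getNumEvents(); ++i)
    {
        const auto& message = sequence.getEventPointer (i)->message;
        const double t = message.getTimeStamp();

        if (t >= blockStartSeconds && t < blockEndSeconds)
            midiOut.addEvent (message, (int) ((t - blockStartSeconds) * sampleRate));
    }
}
```

The resulting MidiBuffer is then handed to Synthesiser::renderNextBlock() together with the audio buffer.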

It appears that with MIDI sequencing, you have the absolute option (SMPTE/seconds) or the relative one (bars/beats… PPQ, I suppose?). I can have either in a MidiFile, but when using MidiMessageSequence, I need to use absolute time. And when sticking things in a MidiBuffer, I need to use a timestamp in samples from the start of the block. Lots to think about :shock: . I understand the need for the sample offset; it’s less… floaty… than a second-based timestamp.

For my application, a relative event time would seem to make more sense (e.g. 4/4 time signature, kicks on whole beats, hats on halves, repeat ad infinitum), but that option seems to be out in JUCE. If I use absolute time, I would have to rewrite the timestamp of every MidiMessage in the MidiMessageSequence each time I change tempo. Maybe that’s the correct way to go. Seems a bit weird.

My current idea is to abuse absolute timing: pretend the loop runs at 60 bpm (kicks on whole seconds, hats on half seconds), and then, when filling the MidiBuffer, scale the timing (e.g. by PlayHead.tempo / 60 bpm). That way I just shift the time of each event as it comes out of the MidiMessageSequence instead of rewriting the timestamps every time I change tempo, and for song position I could just count beats as seconds, too!
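Something like this is what I have in mind for the scaling step (just a sketch; the “one beat equals one second” convention is my own, and tempoBpm would come from the host playhead or my own transport):

```cpp
// Sketch: convert a timestamp that was written as if the tempo were 60 bpm
// (one beat == one second, kicks at 0.0, 1.0, 2.0..., hats at 0.5, 1.5...)
// into a sample offset within the current block at the real tempo.
static int scaledSampleOffset (double timestampAtSixtyBpm,
                               double blockStartSeconds,
                               double tempoBpm,
                               double sampleRate)
{
    const double realSeconds = timestampAtSixtyBpm * (60.0 / tempoBpm); // halved at 120 bpm
    return (int) ((realSeconds - blockStartSeconds) * sampleRate);
}
```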

Is my thinking right? wrong? am I setting myself up for future misery? Suitable for basic applications but nonsensical for larger DAWish projects?

DISCUSS!

For your sequencing you might want to use ticks as a base for your time calculations, for example 96 ticks per quarter (beat).

Whenever you do a tempo change, you recalculate the milliseconds or frames needed for a tick. When you fill the audio buffer, you search your sequence for all events with a tick position between the buffer start and end.
To set the MidiBuffer timestamp you convert this tick position to frames.
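In code that could look roughly like this (only a sketch; the TickEvent struct and the variable names are made up for the example):

```cpp
#include <vector>
#include <juce_audio_basics/juce_audio_basics.h>

struct TickEvent { int tick; juce::MidiMessage message; };

// Gather all events whose tick position falls inside one audio block and
// add them to the MidiBuffer with a frame offset relative to the block start.
static void fillBlock (const std::vector<TickEvent>& events,
                       juce::MidiBuffer& midiOut,
                       double bpm, double sampleRate, double ticksPerQuarter,
                       double blockStartFrame, int numSamples)
{
    // Recalculate this whenever the tempo changes.
    const double framesPerTick = (60.0 / bpm) * sampleRate / ticksPerQuarter;

    for (const auto& e : events)
    {
        const double eventFrame = e.tick * framesPerTick;

        if (eventFrame >= blockStartFrame && eventFrame < blockStartFrame + numSamples)
            midiOut.addEvent (e.message, (int) (eventFrame - blockStartFrame));
    }
}
```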

I found some old notes and would like to paste them here so they don’t get lost.
They might help to get a feeling for the relation between ticks, milliseconds and frames.

Example
SampleRate = 44100
Tempo = 120 bpm = 2 beats per second = 500 ms per quarter note
A sample buffer of 512 frames has a duration of 11.61 ms.

96 PPQN clock (simple sequencer)

500 ms / 96 ticks = 5.20833333 ms per tick
at 44.1 samples per ms = 229.6875 frames per tick
per 512-frame sample buffer: ca. 2 ticks (2.22911565)

24 PPQN clock (MIDI)

500 ms / 24 ticks = 20.8333333 ms per tick
at 44.1 samples per ms = 918.75 frames per tick
per 512-frame sample buffer: ca. 0.5 ticks (0.557278912)
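If you want to double-check the arithmetic yourself, a quick throwaway snippet (plain C++, nothing JUCE-specific) prints the same figures:

```cpp
#include <cstdio>
#include <initializer_list>

int main()
{
    const double sampleRate = 44100.0;
    const double bpm        = 120.0;
    const int    blockSize  = 512;

    const double msPerQuarter = 60000.0 / bpm;                    // 500 ms
    const double msPerBlock   = blockSize * 1000.0 / sampleRate;  // ~11.61 ms

    for (double ppqn : { 96.0, 24.0 })
    {
        const double msPerTick     = msPerQuarter / ppqn;
        const double framesPerTick = msPerTick * sampleRate / 1000.0;
        const double ticksPerBlock = blockSize / framesPerTick;

        std::printf ("%g PPQN: %.8f ms/tick, %.4f frames/tick, %.9f ticks per %d-frame block\n",
                     ppqn, msPerTick, framesPerTick, ticksPerBlock, blockSize);
    }

    std::printf ("block duration: %.2f ms\n", msPerBlock);
    return 0;
}
```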

Hope this is useful and correct. :wink:

Thanks, Frank!

The maths you have popped in seems to make sense, so that’s very helpful, but I’m a little confused by the “tick” thing. When I look through the API documentation for MidiMessageSequence, there is no reference to ticks; they only come up in the MidiFile class. So I think I have misunderstood what I am supposed to do with these classes.

Would you help me clarify my understanding?

When I posted, I assumed that MidiMessageSequence was the catch-all for sequencing events, and that for programming something on the fly I would use one both for internal use and to throw at VSTs.

Now, with a little more reading, and your post, I feel that a MidiMessageSequence is meant for pulling apart / assembling MidiFile instances, and should be kept away from MidiBuffers.

At the moment, I have no interest in reading/writing *.midi, so am I correct in saying that I should leave MidiFile and MidiMessageSequence alone (for the time being), and come up with MyOwnFunkyMidiEventStoringClass, which would store a bunch of MidiMessages in an array of some sort, and then, when the time is right, send them out to the appropriate MidiBuffers?

To be honest, I am doing a dodgy version of this now: I have an array of 16 booleans for each channel stating whether it should trigger or not (PPQN == 4 :wink: ).
I just saw all these important looking classes that Jules had implemented and thought I should be using those instead (so many times I have coded things up only to have a colleague point out a class/function that does exactly the same thing, but 100x better).
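(For reference, the dodgy version is roughly this sort of thing, sketched with made-up names rather than my actual code:)

```cpp
#include <juce_audio_basics/juce_audio_basics.h>

// One 16-step pattern per channel; each step is a 16th note, i.e. 4 steps per beat.
static const bool kickSteps[16] = { true, false, false, false,  true, false, false, false,
                                    true, false, false, false,  true, false, false, false };

// Emit a kick note-on for the given step, if its flag is set.
static void addStepToBuffer (juce::MidiBuffer& midiOut, int step, int offsetWithinBlock)
{
    if (kickSteps[step % 16])
        midiOut.addEvent (juce::MidiMessage::noteOn (1, 36, (juce::uint8) 100),
                          offsetWithinBlock);
}
```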

Oh, and thanks again for the Mac port of my drum machine!

One of the main problems in building a sequencer is the conversion of relative time units into absolute units. My approach is to store events in a class of my own, using ticks (PPQ) as a relative, tempo- and sample-rate-independent resolution. But there seems to be no obstacle to just using the juce MidiMessage, because you are free to fill its double-based timestamp with whatever unit you like.
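For example, something like this (just a sketch; the 96-ticks-per-quarter figure and the note numbers are arbitrary):

```cpp
#include <juce_audio_basics/juce_audio_basics.h>

// Build a one-beat pattern whose timestamps are ticks (96 per quarter note),
// not seconds: a kick on the beat, a closed hat on the off-beat.
static juce::MidiMessageSequence makePattern()
{
    juce::MidiMessageSequence pattern;

    pattern.addEvent (juce::MidiMessage::noteOn  (1, 36, (juce::uint8) 100).withTimeStamp (0.0));
    pattern.addEvent (juce::MidiMessage::noteOff (1, 36).withTimeStamp (48.0));
    pattern.addEvent (juce::MidiMessage::noteOn  (1, 42, (juce::uint8) 100).withTimeStamp (48.0));
    pattern.addEvent (juce::MidiMessage::noteOff (1, 42).withTimeStamp (96.0));

    return pattern;
}
```

The tick values stay valid whatever the tempo or sample rate; they are only turned into frames when a block is rendered.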

When you get the audio buffer from the streaming system, you have to convert the relative ticks into frames for further processing. You did that already in your drum machine, but used a fixed delta between notes. Obviously, in a more capable sequencer you would not only have 16th notes. And to make it more complicated, tempo (BPM) changes could occur at any time.

Happy programming, I’m curious to see your next code!

Frank, thanks again, I feel like I have a better idea of what to do next. A custom class, with custom events bearing a tick-based timestamp. Then I’ll turn them into MidiMessages when it’s the right time (oooh, baby, toniiiiight).

This “fixed delta” bit might take a little more to get my head around, but I will jump off that bridge when I get to it.

I’m interested to see how it turns out too! I will keep you posted.