I’m thinking aloud so you can correct my way of thinking.
This week’s thingy at the back of my head revolves around MIDI sequencing.
I’m still building a basic standalone little drum machine, but after looking through some of the MIDI-related classes, I thought I should set up a little MidiMessageSequence for each channel. That way, if I wish, I can make it drive something else, like a VSTi. I have managed to set up a basic Synthesiser and stuff the MidiBuffer of its renderNextBlock() call with events from a MidiMessageSequence. The maths wasn’t that hard to figure out (go mum!).
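For anyone following along, here’s roughly the shape of what I mean, sketched as plain C++ with tiny stand-in types (the real code would use juce::MidiMessageSequence, juce::MidiBuffer and Synthesiser::renderNextBlock(); the names below are just placeholders):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Stand-in for a timestamped MIDI event (MidiMessageSequence stores
// timestamps as doubles; here they're taken to be absolute seconds).
struct Event { double timeSeconds; int noteNumber; };

// Stand-in for MidiBuffer: events tagged with a sample offset into the block.
struct BufferedEvent { int sampleOffset; int noteNumber; };

// Copy every event falling inside the block
// [blockStartSeconds, blockStartSeconds + numSamples / sampleRate)
// into a block-local buffer, converting its absolute time to a
// sample offset from the start of the block.
std::vector<BufferedEvent> fillBlock (const std::vector<Event>& seq,
                                      double blockStartSeconds,
                                      int numSamples,
                                      double sampleRate)
{
    std::vector<BufferedEvent> buffer;
    const double blockEndSeconds = blockStartSeconds + numSamples / sampleRate;

    for (const auto& e : seq)
        if (e.timeSeconds >= blockStartSeconds && e.timeSeconds < blockEndSeconds)
            buffer.push_back ({ (int) std::floor ((e.timeSeconds - blockStartSeconds) * sampleRate),
                                e.noteNumber });

    return buffer;
}
```

In the real thing the inner loop would be a MidiBuffer::addEvent (message, sampleOffset), and that buffer goes straight into Synthesiser::renderNextBlock().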
It appears that with MIDI sequencing you have the choice of absolute time (SMPTE/seconds) or relative time (bars/beats… PPQ, I suppose?). A MidiFile can hold either, but when using MidiMessageSequence I need to use absolute time, and when sticking things in a MidiBuffer I need a timestamp in samples from the start of the block. Lots to think about :shock: . I understand the need for the sample offset; it’s less… floaty… than a second-based timestamp.
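To keep the three time domains straight in my head, the conversions boil down to this (plain C++; the PPQ and tempo values in the comments are just made-up examples, not anything JUCE mandates):

```cpp
#include <cassert>

// Relative (musical) time -> absolute seconds: a tick is 1/ppq of a
// quarter note, and a quarter note lasts 60/bpm seconds.
// e.g. 960 ticks at PPQ 960 and 120 bpm = one beat = 0.5 s.
double ticksToSeconds (double ticks, double ppq, double bpm)
{
    return (ticks / ppq) * (60.0 / bpm);
}

// Absolute seconds -> sample offset within the current block
// (what a MidiBuffer timestamp wants).
int secondsToSampleOffset (double eventSeconds, double blockStartSeconds,
                           double sampleRate)
{
    return (int) ((eventSeconds - blockStartSeconds) * sampleRate);
}
```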
For my application, a relative event time would seem to make more sense (e.g. a 4/4 time signature, kicks on whole beats, hats on halves, repeat ad infinitum), but that option is out (or seemingly so) in JUCE. If I use absolute time, I would have to rewrite the timestamp of every MidiMessage in the MidiMessageSequence each time I change tempo. Maybe that’s the correct way to go, but it seems a bit weird.
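To be fair, the rewrite-everything approach isn’t much code — something like this sketch (in JUCE terms it would loop over MidiMessageSequence::getEventPointer() and call MidiMessage::setTimeStamp(); here I just use a vector of timestamps to show the maths):

```cpp
#include <cassert>
#include <vector>

// If the timestamps were written assuming oldBpm, an event at beat b sits
// at b * 60 / oldBpm seconds; at newBpm it belongs at b * 60 / newBpm,
// i.e. the old time scaled by oldBpm / newBpm.
void retimeForTempoChange (std::vector<double>& timestampsSeconds,
                           double oldBpm, double newBpm)
{
    const double scale = oldBpm / newBpm;

    for (auto& t : timestampsSeconds)
        t *= scale;
}
```

So halving the tempo from 120 to 60 bpm doubles every timestamp. It works, but it means touching the whole sequence on every tempo change, which is what feels weird to me.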
My current idea is to abuse absolute timing: pretend the loop runs at 60 bpm (kicks on whole seconds, hats on half seconds) and then, when filling the MidiBuffer, distort the timing (e.g. divide each timestamp by PlayHead.tempo / 60 bpm). That way I just shift the time of the event as it comes out of the MidiMessageSequence instead of rewriting the timestamps every time I change tempo. And for song position, I could just count beats as seconds, too!
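Sketching the 60 bpm trick (timestamps stored as “seconds at 60 bpm”, which are really just beats; in practice the tempo would come from the host’s AudioPlayHead — the function names here are my own, not JUCE’s):

```cpp
#include <cassert>

// The sequence pretends the tempo is 60 bpm, so one stored "second" is
// exactly one beat. Rescale on the way out instead of rewriting the
// sequence: realSeconds = storedSeconds * 60 / actualBpm
// (equivalently, storedSeconds divided by bpm / 60).
double storedTimeToRealSeconds (double storedSeconds, double bpm)
{
    return storedSeconds * (60.0 / bpm);
}

// And from there straight to a sample position for the MidiBuffer.
int storedTimeToSamples (double storedSeconds, double bpm, double sampleRate)
{
    return (int) (storedTimeToRealSeconds (storedSeconds, bpm) * sampleRate);
}
```

So a hat stored at 0.5 “seconds” (the half-beat) lands at 0.25 real seconds at 120 bpm, and the sequence itself never gets touched when the tempo changes.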
Is my thinking right? Wrong? Am I setting myself up for future misery? Is this suitable for basic applications but nonsensical for larger DAW-ish projects?