Confused about time with MidiBuffer


#1

I was wondering if someone could clarify how time is represented in the MidiBuffer class.

int getFirstEventTime () const throw ()
Returns the sample number of the first event in the buffer.

If I am making a MIDI-only application that does not use an audio device, what do I relate the “sample number” to? Also, in my app I am trying to record streams from separate MIDI devices, but they all relate to a specific performer (two people are hooked up to the same computer, and I need to save their MIDI streams as Performer 1 [device A, device B] and Performer 2 [device C, device F]). The MIDI data is predominantly CC information, which makes me think the convenience functions in MidiMessageSequence wouldn’t be right for this, and that I should stick with plain old MidiBuffer.

Up until now I’ve been packing the MIDI data into an XML file. Each child element contains Device_Name, CC_Number or Note_Number, Value, and TimeStamp. Since those timestamps seem to deal in milliseconds, I’m unsure how to handle time in a packed MidiBuffer, where it’s expressed as sample numbers.

I am doing some machine learning on the MIDI data, so I need ways of evaluating time and values, and also of grouping relevant streams together.

Thanks for any clarifications or thoughts


#2

I think the point is that the only accurate (i.e. accurate enough to make music with) source of timing you have on a computer is the AudioDevice.
With your audio device, you know the sample rate and the buffer size.

The buffer size lets you increment a running total of samples elapsed since you started playback. The sample rate lets you convert between a sample number and an actual time in seconds.
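To sketch that bookkeeping in code (a minimal, non-JUCE example; the 44100 Hz rate and 512-sample buffer mentioned in the comments are just assumed values, not anything JUCE mandates):

```cpp
#include <cassert>
#include <cstdint>

// The sample rate converts between a sample number and a time in seconds.
double samplesToSeconds (std::int64_t samples, double sampleRate)
{
    return samples / sampleRate;
}

std::int64_t secondsToSamples (double seconds, double sampleRate)
{
    return (std::int64_t) (seconds * sampleRate + 0.5); // round to nearest
}

// The buffer size advances a running sample counter once per audio callback,
// e.g. by 512 samples each time at a typical buffer setting.
std::int64_t advanceSampleCounter (std::int64_t samplesElapsed, int bufferSize)
{
    return samplesElapsed + bufferSize;
}
```

So at 44100 Hz with 512-sample buffers, 100 callbacks put you at sample 51200, which is about 1.16 seconds into playback.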

That’s - I think - the way it’s supposed to work.

I have absolutely no idea how (or whether) you can access MIDI functionality without an audio device, but I think if you want to do that, as your post suggests, you’re on your own.

HTH


#3

Your post is a bit misleading there… there are plenty of ways of sending and receiving midi without using an audio device.

But it’s true that the MidiBuffer’s main purpose is when you do have an audio device (or are in a plugin), so you have a buffer of audio samples, and need to create a set of midi messages that are synced up with some of those samples. So the times of events in a MidiBuffer are stored as integer sample numbers.
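To make the sample-number idea concrete, here is a sketch of the offset arithmetic (the function name and parameters are illustrative, not actual JUCE API): a message timestamp in seconds is converted to an absolute sample number, and the event is then stored at an offset relative to the start of the current audio block.

```cpp
#include <cassert>
#include <cstdint>

// Given a message timestamp in seconds and the stream time (in samples) at
// which the current audio block starts, compute the integer sample offset
// at which the event would sit inside that block. Illustrative only.
int sampleOffsetInBlock (double messageTimeSeconds,
                         std::int64_t blockStartSample,
                         double sampleRate,
                         int blockSize)
{
    const std::int64_t messageSample =
        (std::int64_t) (messageTimeSeconds * sampleRate + 0.5);

    std::int64_t offset = messageSample - blockStartSample;

    // Clamp into the current block, since per-block offsets
    // must lie in [0, blockSize).
    if (offset < 0)          offset = 0;
    if (offset >= blockSize) offset = blockSize - 1;
    return (int) offset;
}
```

For example, a message stamped at exactly 1.0 s, in a block starting at sample 43900 at 44100 Hz, lands at offset 200.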


#4

[quote=“jules”]Your post is a bit misleading there… there are plenty of ways of sending and receiving midi without using an audio device.

But it’s true that the MidiBuffer’s main purpose is when you do have an audio device (or are in a plugin), so you have a buffer of audio samples, and need to create a set of midi messages that are synced up with some of those samples. So the times of events in a MidiBuffer are stored as integer sample numbers.[/quote]

I just saw that MidiInput and MidiOutput don’t need an audio device. So yes, OP, please forget that part of my post.


#5

Thanks for clearing that up a bit. It would make sense for MidiBuffer to parallel behavior found in AudioBuffer, but like I mentioned in the earlier post, this app only needs MIDI, so I haven’t enabled any audio devices as of yet.

I’m sitting here contemplating a major rewrite this weekend of how I’ve approached the problem so far. Basically I want to make a MIDI sequencer that can record from different MidiInput devices, sync to an external sequencer, and then play back my machine-learning-generated MIDI data, sending it to various MIDI outputs. My main trouble has been how to record/save the incoming CC messages so I can create analysis files to train the ML algorithms. I got confused because MidiBuffer seemed like a good container for the values, but the time information became samples instead of milliseconds, and, as has been pointed out, without an audio device I was unsure what to reference that sample number against to turn it into a time value.
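One workaround sketch for the no-audio-device case (the function names and the nominal rate here are assumptions, not JUCE API): timestamp incoming messages in milliseconds yourself, pick a nominal sample rate, and derive pseudo sample numbers from those timestamps, so MidiBuffer can still act as the container.

```cpp
#include <cassert>
#include <cstdint>

// With no audio device there is no real sample clock, but you can still
// fill a MidiBuffer by choosing a nominal rate (44100.0 is just a choice)
// and deriving pseudo sample numbers from millisecond timestamps.
std::int64_t millisecondsToSamples (double milliseconds, double nominalRate)
{
    return (std::int64_t) (milliseconds * 0.001 * nominalRate + 0.5);
}

// And converting back, to recover millisecond times for the analysis files.
double samplesToMilliseconds (std::int64_t samples, double nominalRate)
{
    return (samples / nominalRate) * 1000.0;
}
```

As long as every stream uses the same nominal rate, the relative timing between the recorded devices is preserved, which is what matters for grouping streams per performer.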

At this point I’m packing and unpacking various XML files to tag and hold the data. I had wanted to use collections of classes, but I wasn’t clear on the easiest way to serialize the data and write it to disk. XML seems to work, but it also seems cumbersome, and I feel like MidiMessage already contains pretty much everything I need. I’m looking into packing the MIDI data into MidiMessageSequence objects and then writing them to disk as a MidiFile… although I’m having trouble with that at the moment, I’m sure I’ll work it out in a bit.

Basically, lots to learn and figure out. Does anyone know of any open-source JUCE MIDI sequencers I might be able to glean some info from?