In the applyToBuffer() method of a tracktion_engine::Plugin, I’m trying to identify the source of an incoming MidiMessage. I’m doing this by using the mpeSourceID member of the MidiMessageWithSource struct, like this:
```cpp
void MidiTimerPlugin::applyToBuffer (const tracktion_engine::AudioRenderContext& a)
{
    for (auto& m : *a.bufferForMidiMessages)
    {
        auto source = m.mpeSourceID; // <-----

        if (source == thisSource)       { /* ... */ }
        else if (source == thatSource)  { /* ... */ }
    }
}
```
As far as I can tell, the mpeSourceID member originates from the MidiClip. My question is: is it possible to find this ID from the MidiClip itself? I’ve been digging through the MidiClip methods, but I can’t seem to find it. What I want to do is this:

```cpp
auto clipID = midiClip.getMpeSourceID();
```

Is it possible?
Can I ask why? There might be a better way of doing what you’re asking.
mpeSourceID doesn’t really relate to the clip; it’s created dynamically from the MidiAudioNode that is used to play it back:

```cpp
MidiMessageArray::MPESourceID midiSourceID = MidiMessageArray::createUniqueMPESourceID();
```
It’s really only supposed to be used to merge MPE MIDI streams so the channels can be allocated correctly.
It relates back to the conversation we were having a few weeks ago. Basically, I’m trying to cue internal events on the timeline next to MIDI notes (“internal events” being events which will be interpreted on the message thread). To achieve this, I’ve got two MidiClips inside one AudioTrack: one for internal events and one for external. On the audio thread, I receive a MidiMessageWithSource and then determine which clip it came from using the mpeSourceID member, as described above. It all works, but right now I’m having to use magic numbers to match the source IDs, and I’d be much happier if I could extract them from the MidiClips themselves.
If you can think of a better option then please let me know, but it seems to me that mpeSourceID is the right tool for the job. It’s exposed in the AudioRenderContext already, so why not use it?
mpeSourceID is really an implementation detail, and I can’t guarantee it won’t change in either availability or behaviour, so I don’t really want to expose it.
Can’t you use something like MIDI channel to determine the source?
I had thought about using MIDI channels or something like that, but everything I come up with would restrict users’ access to MIDI, so I’d prefer not to. I’m still open to suggestions, though.
“mpeSourceID is already exposed inside the AudioRenderContext”
But what you’re suggesting isn’t really anything to do with MPE. Like I said, if we change the MPE implementation it would break all the work you’ve done.
It’s also not the right tool for the job anyway. It’s not static and can change at any time, it really can only be relied upon to be different for each different source, not to uniquely identify the source.
It’s difficult to make recommendations without fully understanding the requirements and use cases of the app you’re designing. Are you outputting audio or MIDI, though? It sounds like you should just be using the track to differentiate events.
For example, if you want to trigger an event in the future, you’d just insert that in the track, in a MIDI clip for the track you want to respond to. Then it’s up to the plugins on that track to figure out what to do with that message. Presumably you’d use CC messages or program changes if you need custom behaviour?
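To make that suggestion concrete, here is a minimal sketch of the “CC messages for custom behaviour” idea. It deliberately does not use the real tracktion_engine API: CCMessage, CommandDispatcher, and the CC numbers used are all invented for illustration. The point is that a plugin on a given track only ever sees that track’s MIDI, so a CC number alone can select the behaviour with no source ID involved.

```cpp
// Sketch only: a simplified model of mapping CC numbers to app-level
// commands, NOT the tracktion_engine API. All names here are hypothetical.
#include <cassert>
#include <functional>
#include <unordered_map>

// Minimal stand-in for a MIDI controller (CC) event.
struct CCMessage
{
    int controller; // CC number, 0..127
    int value;      // CC value, 0..127
};

// A dispatcher a track's plugin could own. CC numbers chosen by convention
// (e.g. from the undefined 102..119 range) trigger internal commands;
// everything else is left for ordinary MIDI handling.
struct CommandDispatcher
{
    std::unordered_map<int, std::function<void (int)>> handlers;

    // Returns true if the message was consumed as an internal command.
    bool tryDispatch (const CCMessage& m)
    {
        auto it = handlers.find (m.controller);

        if (it == handlers.end())
            return false; // not ours: pass through as ordinary MIDI

        it->second (m.value);
        return true;
    }
};
```

Because the routing decision is just a CC-number lookup, adding a new internal event later means registering one more handler rather than inventing another magic source ID.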
I think I understand what you’re saying: instead of trying to pick apart events from different clips, I should be using different tracks for different types of events. Is that correct? If so, it makes a lot of sense!
I would have thought so…
It really depends on what you’re doing in response to these events. Usually they’d be processed by a plugin or sent to a MIDI output device. These are all track-specific things, so I’d process them there.
Thanks for the feedback! I’m learning all this as I go, so I’m still a bit vague on the differences between tracks, clips, nodes, etc. Thanks for bearing with me through it.
To clarify the design: right now, I’m just trying to build a sequencer that can cue messages and send them to one of two places: either “out” as MIDI messages to other programs, or “in” as commands to the program itself. Later, once this stage is complete, there will be a few more options: I’ll have samples that can be triggered, and hopefully VST units as well. Although I’m not building the sampling or VST components right now, I want to make sure that my first-stage prototype isn’t incompatible with them, which is why I’m using Tracktion Engine rather than just simple timers.
For now, it seems that having separate tracks for MIDI and internal messages is the way to go. Later I can hopefully just add another track for sample playback. VST… well, I’ll cross that bridge when I come to it.
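As a closing sketch of that track-per-purpose design (again a toy model, not tracktion_engine code): each track pairs its own message buffer with its own handler, so the routing *is* the track, and nothing downstream ever needs a source ID.

```cpp
// Sketch only: a toy model of one-track-per-purpose routing. "Track",
// "Message", and "handler" are hypothetical names, not the real API.
#include <cassert>
#include <functional>
#include <vector>

struct Message { int data; };

// A track pairs a clip's message buffer with the handler that consumes it
// (e.g. a MIDI-out plugin on one track, an internal-command plugin on another).
struct Track
{
    std::vector<Message> buffer;
    std::function<void (const Message&)> handler;

    void process()
    {
        for (const auto& m : buffer)
            handler (m); // each handler only ever sees its own track's events

        buffer.clear();
    }
};
```

Adding sample playback later would then just be a third Track whose handler triggers the sampler, with no change to the existing two.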