Microtonality

Since a MidiMessage is always at least 4 bytes (the size of a pointer on 32-bit platforms), I guess that fourth byte could be used to store additional information…

Well that would have to be an implementation optimisation for juce/choc etc.
I don’t think it solves the problem, though, as your SysEx is larger than 4 bytes, and as far as I’m aware there’s no standard for 4-byte MIDI 1.0 messages?

Yeah, I realise that it wouldn’t solve the general problem.

(It would however solve the problem for me personally because this way I could sneak in 8 bits of pitch deviation data into noteOn messages without anyone else even noticing and my plugins could check for that)

Also eagerly awaiting MIDI 2.0 adoption for microtonal support!


Hi again,

Sorry for being a bit obstinate about all this, but I really do like the Tracktion Engine so far, while at the same time microtonality is a must-have for our use case… and I have a feeling that it must be possible to solve somehow without any major rewrites!

In light of this thread and after some hours of digging through the code and testing various approaches, it seems to me like the issue is twofold:

  1. Microtonal pitches must have a representation in the Tracktion Engine model
  2. The pitches must then be communicated to synths and other plugins

Starting with the representation, the solution seems pretty straightforward: adding an additional field to MidiNote containing the “pitch deviation”. To me it makes sense to store this information in the note object, as that is where it semantically belongs. Keeping an integer noteNumber and adding a float pitchDeviation also seems like a sensible compromise between flexibility and backward compatibility: all pitches could be represented, while the extra pitch information could simply be ignored by code that is not interested in it.

The second issue seems to be harder to solve within the current limits of the implementation. At first glance it seems like a trivial thing: I have a few bytes of information that I would like to pass on from a MidiNote to the plugin playing that note. The problem is that the “transport layer” consists of MIDI (1.0) messages, so only information supported by MIDI 1.0 can be passed on. The existing built-in way of extending MIDI (SysEx) also cannot be used because of memory issues (allocation on the audio thread).

So a few questions as an attempt to move forward:

  • Can we agree on this description of the issue?
  • Do you agree that it makes sense to think about the two parts of the issue separately?
  • Do you think that my proposed solution for 1) is reasonable?
  • What options are there for 2)?

… and addressing the last question (what options do we have), here are my attempts so far…

  • SysEx tuning messages
    Pros: standardised, pretty straightforward, well suited for the task
    Cons: SysEx messages allocate when copied, making them unsuitable for the audio thread. (This could be worked around, e.g. by always reserving at least 14 bytes in juce::MidiMessage, which wouldn’t be much of a memory/performance hit by today’s standards.)

  • NRPN messages
    Pros: fits into the current implementation
    Cons: awkward to use (multiple messages per instruction), limited (7-bit) precision, not standardised, so specific to each application.

  • Waiting for MIDI 2.0 adoption
    Pros: standardised, specifically made for purposes like this, not-a-hack
    Cons: still very limited support from both platforms and plugins, may take a long time before concrete results

  • Storing extra information in existing MidiMessage
    Pros: simple, backward-compatible
    Cons: it’s a hack abusing the MidiMessage class, limited precision (there is only one extra byte available on 32-bit architectures)

  • Extending or subclassing MidiMessage
    Pros: flexible, simple
    Cons: requires changes to JUCE, sort of using the Wrong Tool for the job
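To make the first option concrete, here is a sketch of building an MTS “single note tuning change (real-time)” SysEx message as raw bytes, following the layout of the MIDI Tuning Standard (`F0 7F <device> 08 02 <program> <count> [<key> <xx> <yy> <zz>] F7`). The device ID 0x7F (“all devices”) and tuning program 0 are assumptions for illustration:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Build an MTS real-time single-note tuning change as raw bytes.
// xx is the semitone below the target and yy/zz encode the fraction above
// it in units of 1/16384 semitone (14 bits, about 0.006 cents resolution).
std::vector<std::uint8_t> makeSingleNoteTuning (std::uint8_t key, double targetPitch)
{
    auto semitone = (std::uint8_t) std::floor (targetPitch);
    auto fraction = (std::uint16_t) std::lround ((targetPitch - semitone) * 16384.0);
    fraction = std::min<std::uint16_t> (fraction, 16383); // keep within 14 bits

    return { 0xF0, 0x7F, 0x7F,                        // start, universal real-time, all devices
             0x08, 0x02,                              // sub-IDs: MIDI tuning, note change
             0x00, 0x01,                              // tuning program 0, one change follows
             key, semitone,                           // key to retune, semitone below target
             (std::uint8_t) ((fraction >> 7) & 0x7F), // fraction MSB (7 bits)
             (std::uint8_t) (fraction & 0x7F),        // fraction LSB (7 bits)
             0xF7 };                                  // end of SysEx
}
```

The whole message is 12 bytes, which is consistent with the “reserve at least 14 bytes” idea above. If I remember correctly, juce::MidiMessage::createSysExMessage adds the 0xF0/0xF7 framing itself, so only the bytes in between would be passed to it.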


Curious why you are intent on putting microtonality into the engine? While it would be great, it seems the state of play right now is (1) each plugin supports microtonality using either scl/kbm or tun (with various degrees of accuracy) or (2) the two ODDSound patterns (inter-plugin comms, or mapping MIDI to MPE streams). I may not understand your use case well, but doing the MIDI modification in the engine seems like quite a lift, so I would love to know why you drew that conclusion! (Alas, very few plugins support MTS as SysEx. Some hardware does.)


It seems to me like most plugins supporting microtonality are directed towards scales or temperaments in one way or another, mapping some input notes to frequencies using some kind of tuning system.

In our use case we have symbolic input (notes) where each individual note may have a specific intonation, different from other notes with the same “nominal pitch” (note number). The source of the notes may be transcriptions from audio, automatic or manual, where intonation fluctuates. Or notes may have been edited by the user to alter the pitch slightly up or down. Etc.

So for our software, intonation is not just a “presentational layer” but an intrinsic quality of every note. Therefore it would make sense (to me at least) to carry that information over to the audio engine rather than “emulating” it using pitch bend or other hacks. (Of course in the end these hacks may be just how the plugins are eventually controlled, but IMO that should preferably be abstracted away and handled by the engine.)

Hope this answers your question!


It seems sensible to me to increase the size allocated to MIDI events, such that short SYSEX messages like real-time-tuning can be supported without the need to allocate on the real-time thread.
My experience is that, in comparison to the CPU and memory overhead of audio processing, MIDI tends to be essentially ‘free’.


Oh, interesting. So when you say C#5, that pitch is also scored. Kind of like you have a fully specified 127-note scale which is time-dependent. Yes, that is a unique case.

You could do that if you wrote your own ODDSound master and were very careful with your spec, and used synths which did ODDSound continuous retuning. But it wouldn’t be a cakewalk.

Thanks for answering! Sorry I can’t be any help.


This is very important! And something more people should be paying attention to (Surge obviously works just fine, but many don’t).


Sorry for not being quite so on top of this but I’m fully immersed in removing the old engine at the moment and it’s a lot more involved than I originally thought.

For now, I would probably just keep your method as a fork.
I’m really worried about baking in one specific method here and then everyone adopting something else. There have already been several approaches discussed here, with none of them a clear leader in the field.

I also don’t want to add or change anything specifically related to juce::MidiMessage. I’ll almost certainly be replacing the MIDI pipeline in Tracktion Graph with choc::MidiMessage (i.e. the real-time processing part) at some point so any adoption shouldn’t be tied to juce.


The other thing I wondered is if you actually need changes to Tracktion Engine at all? You have the ability to set custom properties on the note’s state, why can’t you use this to also add sysex messages to the MidiList? That way they’ll just get automatically created in the current method without you having to modify the Engine at all?

If it were me, I’d probably hide these from any UI, but when a MIDI note is added/removed/moved, refresh the sysex messages in the MidiList so they’re sent at the right time in the audio graph.

Would this work for you?

Thanks Dave for your answer!

You have the ability to set custom properties on the note’s state, why can’t you use this to also add sysex messages to the MidiList?

Ok, that should be an acceptable solution for the moment.

One thing though: when I tried this, the events were re-ordered so that the SysEx events preceding noteOns ended up immediately after noteOn instead. Is this expected behavior? (I noticed in the TE source that noteOff events are actually moved slightly to avoid colliding with the next noteOn; this implies to me that there is some non-stable sorting going on somewhere, otherwise that wouldn’t be needed)

I don’t think the sorting is non-stable, but it is mixed with other MIDI sources, so nudging might help there. To be honest though, it’s probably more a workaround for synths which don’t like note-off/on events happening at the same sample position.

I think the reason they’ll appear after the note-ons is that they’re added after the note-on/off messages when creating the MIDI sequence. If these are programmatically generated sysex messages, can you try generating them slightly ahead of the note-on? Around 0.0001s would usually do it.
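Why the small nudge works can be sketched without any JUCE or Tracktion code: two events at exactly the same timestamp have no guaranteed relative order after a sort by time, while an event nudged slightly earlier is ordered deterministically (the 0.0001s figure is just the suggestion above):

```cpp
#include <algorithm>
#include <string>
#include <utility>
#include <vector>

using Event = std::pair<double, std::string>; // (time in seconds, event kind)

// Sort purely by timestamp, as a merged MIDI sequence might. Events that
// share a timestamp keep no guaranteed relative order in general.
std::vector<Event> sortByTime (std::vector<Event> events)
{
    std::sort (events.begin(), events.end(),
               [] (const Event& a, const Event& b) { return a.first < b.first; });
    return events;
}

// Nudging the tuning SysEx 0.0001s ahead of its note-on makes the ordering
// unambiguous regardless of insertion order or sort stability.
std::vector<Event> tuningThenNote (double noteOnTime)
{
    const double nudge = 0.0001;
    return sortByTime ({ { noteOnTime,         "noteOn"      },
                         { noteOnTime - nudge, "sysexTuning" } });
}
```

With the nudge in place, the tuning message always sorts first.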

Oh, now I see – MidiList stores SysEx and Controller events separately from the notes. That of course means that any internal ordering of events at the same beat position is inevitably lost.

If these are programatically generated sysex messages can you try generating them slightly ahead of the note-on?

Yes, it looks like I will have to do it that way.

I completely understand and respect your hesitation about adding specialised features like microintonation into TE! At the same time, IMO this sorting issue stresses the need for a generalised built-in way of doing these kinds of things, to avoid resorting to various hacks or needing to patch the TE source…

A thought: if it was possible to hook into the conversion process from MidiList to MidiMessageSequence, that would probably solve most of these issues! Imagine addToSequence having an optional callback (or calling a virtual method that could be overridden) that could be used to customize its behavior – something like

juce::Array<juce::MidiMessage> noteToNoteOn(const MidiNote& note)
{
  // As default, return only a noteOn message.
  // Customized version could return CC + noteOn or
  // SysEx + noteOn + SysEx for instance.
}

This way MidiNote wouldn’t have to be patched either; the callback/virtual method could check for custom values in the note’s ValueTree.
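To make the shape of that idea concrete, here is a self-contained sketch of such a customisation point modelled as a replaceable std::function; all the type names are toy stand-ins, not Tracktion Engine API:

```cpp
#include <functional>
#include <vector>

// Toy stand-ins for the real types, just to show the shape of the hook.
struct Note     { int noteNumber = 60; float pitchDeviation = 0.0f; };
enum class Kind { noteOn, sysex };

// A replaceable conversion hook; the default emits a single note-on per
// note, mirroring the current behaviour.
std::function<std::vector<Kind> (const Note&)> noteToMessages =
    [] (const Note&) { return std::vector<Kind> { Kind::noteOn }; };

// A host wanting microtonality could swap in its own version, e.g. one
// that precedes the note-on with a tuning SysEx when a deviation is set.
void installMicrotonalHook()
{
    noteToMessages = [] (const Note& n)
    {
        std::vector<Kind> out;
        if (n.pitchDeviation != 0.0f)
            out.push_back (Kind::sysex);   // tuning message first…
        out.push_back (Kind::noteOn);      // …then the note-on itself
        return out;
    };
}
```

The default behaviour stays untouched for everyone who doesn’t install a hook, which is the backward-compatibility point of the proposal.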

The problem with any library, and especially a framework like Tracktion Engine, is that the more customisation points you add, the more difficult it becomes to maintain and change the internals over time. It also takes a lot of thought as to what implications these changes have and whether there is a better way to do something. Just because there is an “option A” on the table, that doesn’t mean it’s necessarily the best; it might close off other doors in the future.

One thing I am going to add soon is an audio graph customisation point. Probably createNodeForEdit will become a std::function you can replace. This means you can make an audio graph in whatever way you want from the Edit data model.

If these are programmatically generated sysex messages can you try generating them slightly ahead of the note-on? Around 0.0001s would usually do it.

I just ran into a problem with this approach: if a note starts at position 0, it is not possible to place a SysEx event before it, because that gives it a negative time (hitting an assertion in getEventsChecked, called from MidiList::getSysexEvents).

Is there a way to work around that (apart from the obvious solution to offset all notes)?

If I just made MidiList::exportToPlaybackMidiSequence (juce::MidiMessageSequence&, MidiClip&, bool generateMPE) const; customisable, would that sort all your problems?

Can you check if the default implementation doesn’t rely on any private member variables?
If so, I’ll think of the best place to make that customisable (probably in the EngineBehaviour).

If I just made MidiList::exportToPlaybackMidiSequence (juce::MidiMessageSequence&, MidiClip&, bool generateMPE) const; customisable, would that sort all your problems?

I think so! That would also seem to be the most obvious entry point from my perspective.

Can you check if the default implementation doesn’t rely on any private member variables?

I made a quick check now and, unless I missed something, it doesn’t refer to any private members. (It does call the overloaded static addToSequence functions, though.)

If so, I’ll think of the best place to make that customisable (probably in the EngineBehaviour).

Wow, that’s great news! :smiley:

Ok, can you see if this works for you: GitHub - Tracktion/tracktion_engine at feature/midi_list
If so, I’ll merge it to develop.