MIDI sequencer design: data structures and synchronization

Hello forum. I am building a MIDI sequencer/editor with JUCE, with the goal of learning and improving my C++ skills, and I hope somebody can help me clarify some questions about the basic design.

I have got the basics up and running, including a playhead for a standalone mode and a method for dispatching MIDI note messages based on sample counting, a visual representation of the playhead and basic controls for transport and loop points, a GUI component for creating and editing notes in a grid, a timeline, and methods to scroll the editor (with viewports) and zoom in and out.

I am building this as a standalone app as well as a VST3 plugin, using CMake. The program is running surprisingly well with only minor bugs, as far as I can tell. But that is probably down to luck, because I don't have tests yet and haven't pushed for edge cases. Since I don't think the basic design is a solid foundation, I would like to address some questions before proceeding. Basically they all revolve around the following:
1. How to store the MIDI messages (data structures) on the editor side,
2. How to store them on the processor side, and
3. How to update the processor side in a thread-safe and realtime-safe way.

My basic setup is currently as follows:

1. The MIDI note events are represented by a class that inherits from juce::Component so that they can be drawn and manipulated in the note-grid parent component. The note components are stored in a vector that is owned by the editor.

2. The processor side owns a double buffer/array of juce::MidiMessageSequence objects. The class that manages the buffer has an atomic<int> that points to the sequence the audio callback may read from.

3. For synchronization I have a FIFO that takes commands created in the editor. The commands are executed at the beginning of every audio callback.
User changes on the editor side trigger an update to the processor side along the following lines (a simplified sketch follows after the list):
1) iterate over the vector of note components and create corresponding note-on and note-off MIDI messages with timestamps,
2) store the messages in a MidiMessageSequence (local to the function),
3) get a pointer to the sequence of the double buffer that the processor is currently not reading from,
4) copy the newly created sequence into it (assign the dereferenced buffer pointer to the sequence),
5) send a command to the FIFO to 'swap' buffers/indices so that the next audio callback reads the updated sequence.
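
Here is that sketch; the class and member names are placeholders rather than my real code, and the audio-thread side is only indicated in comments:

```cpp
#include <juce_audio_basics/juce_audio_basics.h>
#include <atomic>
#include <vector>

enum Command : int { swapBuffers = 1 };

// Two MidiMessageSequences plus an atomic index saying which one the audio callback reads.
struct SequenceDoubleBuffer
{
    juce::MidiMessageSequence sequences[2];
    std::atomic<int> readIndex { 0 };

    juce::MidiMessageSequence& getWriteSequence() { return sequences[1 - readIndex.load()]; }
    void swap()                                   { readIndex.store (1 - readIndex.load()); }
};

// Stand-in for the data each note component exposes (times in seconds, for simplicity).
struct NoteData { int noteNumber = 60; int velocity = 100; double startTime = 0; double endTime = 1; };

// Editor-side update following steps 1-5 above.
void updateProcessorSequence (SequenceDoubleBuffer& buffer,
                              juce::AbstractFifo& commandFifo,
                              std::vector<int>& commandQueue,
                              const std::vector<NoteData>& notes)
{
    juce::MidiMessageSequence newSequence;                          // (2) local sequence to collect messages

    for (const auto& n : notes)                                     // (1) notes -> timestamped on/off messages
    {
        newSequence.addEvent (juce::MidiMessage::noteOn  (1, n.noteNumber, (juce::uint8) n.velocity), n.startTime);
        newSequence.addEvent (juce::MidiMessage::noteOff (1, n.noteNumber), n.endTime);
    }
    newSequence.updateMatchedPairs();

    buffer.getWriteSequence() = newSequence;                        // (3) + (4) copy into the inactive sequence

    int start1, size1, start2, size2;                               // (5) queue a swap command for the audio thread
    commandFifo.prepareToWrite (1, start1, size1, start2, size2);
    if (size1 > 0)
        commandQueue[(size_t) start1] = swapBuffers;
    commandFifo.finishedWrite (size1 + size2);
}

// At the start of the audio callback the processor drains the FIFO with prepareToRead()/
// finishedRead() and calls buffer.swap() when it finds a swapBuffers command.
```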

It's hard for a beginner to see through all the template and inheritance machinery in JUCE, but if I understand correctly, juce::MidiMessageSequence objects are heap-allocated with dynamic size? So I guess they reallocate at some point when events are added, which makes them effectively unsuitable for the realtime thread?
Would it be OK to use a std::vector instead, reserve a capacity (reserve(x)) at the beginning, and take care that this upper limit on the allowed number of MIDI messages is never exceeded?
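
To make the question concrete, something along these lines is what I have in mind (just a sketch; as far as I understand, note-on/off messages fit in MidiMessage's small internal storage, so copying them should not allocate):

```cpp
#include <juce_audio_basics/juce_audio_basics.h>
#include <cstddef>
#include <vector>

// Sketch of a fixed-capacity event store: reserve() happens once on the message thread,
// and as long as size() never exceeds the capacity, push_back() never reallocates.
class PreallocatedEventStore
{
public:
    explicit PreallocatedEventStore (size_t maxEvents) : capacity (maxEvents)
    {
        events.reserve (capacity);                    // the only allocation
    }

    bool add (const juce::MidiMessage& m)             // returns false if the limit is reached
    {
        if (events.size() >= capacity)
            return false;

        events.push_back (m);
        return true;
    }

private:
    size_t capacity;
    std::vector<juce::MidiMessage> events;
};
```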

On the GUI side I am experimenting with juce::ValueTree because of the obvious advantages (saving state, undo manager, …). Is it a good idea to additionally represent the MIDI note-on events as value trees (and add them as children of a common parent value tree)? In effect I would end up with two collections representing the same data on the GUI side: the vector with the components and the value tree. That would also make it necessary to connect them somehow. Maybe by adding a pointer to every component, pointing to the corresponding value tree?
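
For example, something like this (all identifiers made up); and as far as I understand, a ValueTree is itself a lightweight reference to shared data, so each note component could hold its own copy of "its" tree rather than a raw pointer:

```cpp
#include <juce_data_structures/juce_data_structures.h>

// One child tree per note under a common "NOTES" parent.
juce::ValueTree createNoteTree (int noteNumber, int velocity, double startBeat, double lengthBeats)
{
    juce::ValueTree note ("NOTE");
    note.setProperty ("noteNumber", noteNumber,  nullptr);   // nullptr = no UndoManager for now
    note.setProperty ("velocity",   velocity,    nullptr);
    note.setProperty ("start",      startBeat,   nullptr);
    note.setProperty ("length",     lengthBeats, nullptr);
    return note;
}

// Usage:
//   juce::ValueTree notes ("NOTES");
//   notes.appendChild (createNoteTree (60, 100, 0.0, 1.0), nullptr);
```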

Last but not least: I have watched the 2019 ADC talk "Real-time 101" but honestly couldn't see which pattern suits my problem of synchronizing the message and realtime threads best.
So I ended up with the pattern described above, which is loosely inspired by the Stochas sequencer (GitHub - surge-synthesizer/stochas: The Stochas Sequencer). Because it looks kind of wild to me, and there is a comment in its source code pointing out why it is not 100% thread-safe, I am still on the hunt for a good solution. Is there a proven, well-known solution to this problem that anyone is willing to share with me?

Thanks to everybody who took the time to read all this. I am grateful for any piece of advice.

A few notes here.

  1. Coupling a MIDI note to a juce::Component is a really bad idea for many reasons. Memory/performance is one of them (and you would take quite a hit there).

I would suggest using a much simpler data structure that can have a better memory layout, and avoiding inheritance so you can store the notes in a regular contiguous vector. Then you can create Components based on the elements of that vector.
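
For example (just a sketch, names made up):

```cpp
#include <vector>

// Plain data, no inheritance: lays out contiguously in a std::vector.
struct Note
{
    int    noteNumber  = 60;
    int    velocity    = 100;
    double startBeats  = 0.0;   // position in whatever musical time unit you prefer
    double lengthBeats = 1.0;
};

std::vector<Note> notes;        // the model the editor works from

// The GUI then creates or updates lightweight Components from this vector (one per
// visible note, positioned from startBeats/lengthBeats) instead of the note itself
// being a Component.
```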

  2. Having written quite a few sequencers, I think a sequencer is better off not storing raw MIDI events like note on, note off, etc.

Instead you should store things in a more logical structure, for example the start/end time range of a single note. That is useful for several reasons: one is that you can always move notes around and switch sequences without 'losing' any note-offs.

The other is that, further down the line, you may want to store additional attributes in the note beyond what MIDI data can express, and trying to condense that information into MIDI events would be difficult.

So my suggestion is to store the high-level concept of the note, and then generate MIDI events on the fly, only for the current block.
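
Roughly like this (a sketch only, assuming a plain note struct like the one above and sample-based timing; looping and buffer limits are ignored):

```cpp
#include <juce_audio_basics/juce_audio_basics.h>
#include <vector>

struct Note { int noteNumber; int velocity; int startSample; int lengthSamples; };

// Turn high-level notes into MIDI events for the current block only.
// blockStart is the playhead position (in samples) at the start of this callback.
void renderBlock (const std::vector<Note>& notes, juce::MidiBuffer& midiOut,
                  int blockStart, int numSamples)
{
    const int blockEnd = blockStart + numSamples;

    for (const auto& n : notes)
    {
        const int noteEnd = n.startSample + n.lengthSamples;

        if (n.startSample >= blockStart && n.startSample < blockEnd)   // note starts in this block
            midiOut.addEvent (juce::MidiMessage::noteOn (1, n.noteNumber, (juce::uint8) n.velocity),
                              n.startSample - blockStart);

        if (noteEnd >= blockStart && noteEnd < blockEnd)               // note ends in this block
            midiOut.addEvent (juce::MidiMessage::noteOff (1, n.noteNumber),
                              noteEnd - blockStart);
    }
}
```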

I have recently been doing live streams on my channel, coding a rhythm game, and one of its elements is a sequencer. It may bring you some inspiration:


Thanks a lot. I will take my time to consider what you said, watch the video and come back later.