Can someone explain the Tracktion Engine threading model? I have some questions:
I know that MidiClip::createAudioNode exports a MIDI message sequence based on the clip’s current state (its start position). What I can’t understand is why, when I drag a clip to a new position in the Waveform application (updating its start), the MIDI message sequence that’s playing also follows the change.
I’m not quite sure I follow the question… this doesn’t seem to have anything to do with thread models? Perhaps you could be more specific on that front?
Regarding the MIDI clip, when you move the MIDI clip’s start position, wouldn’t you expect the playback to also follow?
I’ve tried the Waveform app: while it’s playing, if I drag a MIDI clip, the audio that plays follows the clip’s new position after the drag. I’d like to know how this is done, because I can see that when the MIDI audio node is created it copies the MIDI events into the node, so it isn’t obvious how that relates to the clip position changing afterwards.
I’m learning the Tracktion Engine and I think it’s equivalent to Waveform: when I drag a clip inside the Waveform application, it automatically handles the MIDI events with the new configuration without my having to pause and restart playback.
You mentioned above that the graph nodes are rebuilt when the clip position changes, but I can’t see from the code how the clip->setStart() method triggers that rebuild. Is that perhaps logic on the Waveform application side?
There’s a lot of complex logic that deals with all of this, and it happens asynchronously, so it can be difficult to trace.
Essentially, what happens is that when a property which affects the audio graph changes, the Edit::TreeWatcher picks that up and asynchronously calls TransportControl::restartPlayback, which rebuilds the audio graph.
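To make the pattern concrete, here’s a minimal sketch using plain JUCE classes. This is not the actual engine code: GraphRebuildWatcher and its rebuild callback are made-up names purely for illustration; in the engine itself that role is played by Edit::TreeWatcher, which ends up calling TransportControl::restartPlayback.

```cpp
// Minimal sketch of the change-watching pattern, using plain JUCE classes.
// NOT the actual Tracktion Engine code: GraphRebuildWatcher and the rebuild
// callback are illustrative names only.
#include <functional>
#include <utility>
#include <juce_events/juce_events.h>
#include <juce_data_structures/juce_data_structures.h>

class GraphRebuildWatcher : private juce::ValueTree::Listener,
                            private juce::AsyncUpdater
{
public:
    GraphRebuildWatcher (juce::ValueTree stateToWatch,
                         std::function<void()> rebuildGraph)
        : state (std::move (stateToWatch)),
          rebuild (std::move (rebuildGraph))
    {
        state.addListener (this);
    }

    ~GraphRebuildWatcher() override
    {
        state.removeListener (this);
        cancelPendingUpdate();
    }

private:
    // Any change under the watched tree (e.g. a clip's start property changing
    // while you drag it) just schedules an asynchronous rebuild.
    void valueTreePropertyChanged (juce::ValueTree&, const juce::Identifier&) override  { triggerAsyncUpdate(); }
    void valueTreeChildAdded (juce::ValueTree&, juce::ValueTree&) override              { triggerAsyncUpdate(); }
    void valueTreeChildRemoved (juce::ValueTree&, juce::ValueTree&, int) override       { triggerAsyncUpdate(); }

    // Runs later on the message thread, once, after a burst of changes.
    void handleAsyncUpdate() override
    {
        rebuild();  // in the engine, this is where playback gets restarted and
                    // the audio graph rebuilt from the Edit's current state
    }

    juce::ValueTree state;
    std::function<void()> rebuild;
};
```

The key point is that triggerAsyncUpdate() coalesces a burst of state changes (e.g. a clip being dragged) into a single rebuild on the message thread, which is why you won’t find clip->setStart() calling the rebuild directly.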
If you put a breakpoint in EditNodeBuilder::createNode you should be able to see that call stack.
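For example, assuming you’re debugging with lldb (a regex breakpoint avoids having to spell out the exact namespace, which varies between engine versions):

```
(lldb) breakpoint set --func-regex "EditNodeBuilder::createNode"
(lldb) continue
       ... drag a MIDI clip in the app while it's playing ...
(lldb) bt
```

The backtrace should show the rebuild being driven from the asynchronous update rather than directly from clip->setStart().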