Delay of Recorded Midi

I don’t think this is what you should need to adjust. In Waveform, if you select a MIDI input, that field is exposed as “Time Adjust”, but it’s usually 0 or only a few ms. It doesn’t sound like the setting you should be reaching for.

Thanks for pointing that out. I’ll need to keep digging to find the correct method to adjust the midi.

When I have found the correct method I will need a way to set the value of the midi adjustment. What would be the right place to derive that value?

I have looked at a lot of code to track down this issue, including:

etc., ...

I cannot find anything that affects only the position of recorded midi, which makes me wonder if the missing thing in my code may be in JUCE itself?

It’s very odd because in all other respects midi is working perfectly!

I’ve gone through all the code that I can find. But nothing I try makes any difference.

For example, I came across this code:

    bool handleIncomingMidiMessage (const MidiMessage& message)
    {
        if (recording)
            recorded.addEvent (MidiMessage (message, context.globalStreamTimeToEditTimeUnlooped (message.getTimeStamp())));

        ScopedLock sl (consumerLock);

        for (auto c : consumers)
            c->handleIncomingMidiMessage (message);

        return recording || consumers.size() > 0;
    }

But I cannot discern how the global stream time connects with the midi. It would seem that an offset should be calculated and applied somewhere. Obviously I am not understanding how this works.

And, just to restate the issue:
Midi functions perfectly in all other respects. It displays correctly, whether placed manually in the edit or imported from existing midi files. And it plays back as expected, with the exception of newly recorded midi.

When midi is recorded, it records without issue. The notes are played, as expected, as the recording proceeds. But, upon hitting stop, the notes are placed incorrectly on the timeline. The notes are all ok relative to each other. But, they are “offset” on the timeline, by a varying amount of +/- a beat or so.

You can select all the newly recorded notes and drag them to the correct spot on the timeline, and everything is fine. But, I would hate to have to do this every time midi is recorded!

Any guidance or ideas will be most welcome!

Thank you!


For now, I would concentrate on the midi recording demo. Make sure that recording midi works with it and that there is no delay. This is the only way to be sure that it is not the tracktion engine code. If you can’t see the offset without a drawn grid, then I would implement one. Nobody will really be able to help you if you cannot first exclude this source of error. Only when you know that the error is really in your part of the code can you search for it specifically. And look for a difference between the midi recording demo and your DAW.

There is nothing in my code that does anything except call the TracktionEngine code that records midi, exactly as it is done in the Midi Recording Demo. In fact, you might view my DAW, as far as midi is concerned, as an embellished version of the Demo. So, I am at a loss as to what to check on my side.

Please note: From the beginning, I do not think there is a bug in the TracktionEngine code. The problem, I believe, is that there is a setting that I do not know, and have been unable to find, that either provides an offset for recorded midi, or locks it to the edit timeline. That is what I need help with. And there is no way for me to know what that is without help from someone very familiar with the TracktionEngine internals.

I have spent days on this and have been unable to find anything. I am just hoping that someone more knowledgeable than me, might have a suggestion or perhaps can point me in the right direction.

I’m not convinced there is a setting you’re missing unless you’ve set any of the time adjust settings to something other than the defaults. By default MIDI should be pretty in sync.

I really need to know if the problem also exists with MidiRecordingDemo as that’s what I’ll need to debug with. You shouldn’t need to load your own Edits, just do a test like so:

  • Turn on the metronome
  • Arm an input with a physical MIDI device
  • Start recording
  • Starting with a specific key, play a note on each beat, ascending each time
  • Stop recording
  • Play back. Depending on which note and beat you started on, it should be fairly easy to determine how out of sync it is

But there are so many variables. What sample rate and buffer size are you using? If you’re using a very large buffer size and a not-great sound card, it might be introducing large amounts of latency which is just throwing your recordings off. We have a recording loop-back test in Waveform (very similar to the one in the JUCE demo) to calibrate for this.

I will do as you suggest, and report back. It is very similar to what I have been doing in my DAW.

The issue occurs on two completely different systems, with very different hardware, buffer sizes, etc., although both are at 48K. In both cases, the midi seems to not be locked to the timeline even though the notes are all correct relative to each other, and so can be easily selected as a group and moved to the correct starting position. So, maybe when midi recording starts it is not getting the correct starting time?

I will perform the testing you suggested on both systems, just to be thorough.

Thank you!

For completeness, I wanted to return to this topic and relate my experience.

MidiRecordingDemo always records midi correctly in relation to the timeline. No problem there.

But, my DAW was showing recorded midi as “floating” in relation to the timeline. The individual notes are correct relative to each other. However, the start of the recorded notes was shifted either early or late relative to the timeline. And, I could simply select all the notes as a group and drag them to the correct spot to fix it, suggesting that somehow they were not locked to the starting point.

So, my first attempt was to see if I could create the same behavior in the MidiRecordingDemo. I did so by pulling out all of the classes into individual header-only classes, which is how my DAW is constructed.

There was no difference until I changed the structure of the MainComponent.

In my DAW, the te::Engine is instantiated in MainComponent, the edit is created, edit->getTransport().ensureContextAllocated(); is called, and then the EditComponent is opened.

Doing the same in MidiRecordingDemo results in the same “floating” midi recording behavior.

So, then I refactored my code to instantiate the te::Engine in the EditComponent instead. And that works correctly! No more “floating” midi.

So, obviously, I was creating some sort of disconnect with the recorded midi by the way I was originally instantiating the te::Engine.

I would add the observation that it is odd that, in all of this, only recorded midi was affected. Playback/recording of audio, and playback of midi were always correct regardless of when the te::Engine was instantiated. This does seem to suggest that something is different about that part of the te code.

The bottom line is that moving the instantiation of te::Engine into the EditComponent has resolved the issue for my use case.

Thank you!

I’m glad that you’ve managed to work around this but your explanation sounds fishy to me. In Waveform, we create the Engine before any UI elements, Engine really should be the first thing to be instantiated (after the settings objects).

I’m struggling to think why this would result in timing differences for both you and the MidiRecordingDemo.

Perhaps, I should ask what the ideal chain of events would be?

It is currently working to instantiate the te::Engine on the stack in the EditComponent.

More specifically, in my MainComponent, I choose the edit file to open (or create a new edit file), and pass the editFile to EditComponent.

In EditComponent, te::Engine and te::SelectionManager are created on the stack. The edit uses either te::loadEditFromFile(engine, editFile) or te::createEmptyEdit(engine, editFile) to create the std::unique_ptr.

Then, I use engine.getDeviceManager() to initialize the midi devices the same as in the demo. Followed by edit->getTransport().ensureContextAllocated(), the same as in the demo.

I then also initialize the audio tracks, and do all the addAndMakeVisible for the GUI elements. Then it’s just buildTracks, and the edit is open.

This is all on Windows 10 (all updates applied), plus I download both Juce and TracktionEngine from develop branch daily.

The “fix” for me was moving te::Engine into the editComponent. That seems to allow midi recording to “anchor” itself to the timeline.

If there is a better way to do it, I would appreciate learning what it might be.

I honestly don’t know what’s going on here. Engine should really be the first thing to be created. Imagine you have multiple windows or components looking at multiple (or even the same) Edits (like we do in Waveform). That wouldn’t be possible if the Engine was owned by a single Component.

I’d have to know what the problem is before saying how to fix it though and as far as I can tell we don’t have a reproducible test case. If you modify the MIDI recording demo to create the Engine before the window, does that show the problem then?

My use case is much simpler. I will never have more than one edit open. So, instantiating te::Engine in the EditComponent works fine.

In the meantime, I am trying to get a modified version of the MidiRecording demo to exhibit the issue. So far, I have tried moving the te::Engine to Main.cpp, and then passing a reference to MainComponent. But that works as expected.

If I find code that “breaks” the MidiRecording demo, I will report back.

Thank you for your help with this!

You say that now… That’s how Tracktion started out and we had to re-engineer loads of stuff to add multiple Edits in T5.

But there are more reasons to do it outside of your UI. Firstly, it’s just better to decouple it like that. You will have access to your app from most places, but most functionality shouldn’t know about a main “component”. Secondly, what if you want to add a CLI to your app for rendering or running tests, etc.? You don’t want to have to open a UI to do that. It’s just good practice to separate these concerns, in my experience.

Save yourself future pain by doing it properly now.

I hear you. I will refactor with this in mind.

Your knowledge and experience is always appreciated.

Thank you!

I finally have a reproducible demonstration of this issue. I can recreate the problem at will on two completely different systems, one with Windows 10, and the other with Windows 11.

This is done on juce/develop and TracktionEngine/develop branches, downloaded daily, and compiled with Visual Studio 2022.

Start with the MidiRecording demo. We are going to move instantiation of the te::Engine to main.cpp [Edit] in the Application class.

    te::Engine engine{ ProjectInfo::projectName, std::make_unique<ExtendedUIBehaviour>(), nullptr };

We need to add the following class at the top of the main.cpp file.

class MainComponent : public Component, public MenuBarModel
{
public:
    MainComponent (te::Engine& e) : engine (e), menuBar (this)
    {
        addAndMakeVisible (menuBar);
        setSize (1000, 800);
    }

    void resized() override
    {
        auto layoutBounds { getLocalBounds() };
        menuBar.setBounds (layoutBounds.removeFromTop (24));

        if (midiRecordingDemo)
            midiRecordingDemo->setBounds (layoutBounds);
    }

private:
    te::Engine& engine;
    std::unique_ptr<MidiRecordingDemo> midiRecordingDemo;
    MenuBarComponent menuBar;

    // Below is the menu system and menu support functions
    PopupMenu menu;

    enum MenuIDs
    {
        Open = 1000,
        DeviceSetup = 3000
    };

    StringArray getMenuBarNames() override
    {
        return { "File", "Settings" };
    }

    PopupMenu getMenuForIndex (int index, const String& /*name*/) override
    {
        menu.clear(); // the menu is a member, so clear it before re-populating

        switch (index)
        {
            case 0:  menu.addItem (Open, "Open");                break;
            case 1:  menu.addItem (DeviceSetup, "Device Setup"); break;
            default: break;
        }

        return menu;
    }

    void menuItemSelected (int menuID, int /*index*/) override
    {
        switch (menuID)
        {
            case Open:
                midiRecordingDemo = std::make_unique<MidiRecordingDemo> (engine);
                addAndMakeVisible (*midiRecordingDemo);
                resized();
                break;

            case DeviceSetup:
            {
                DialogWindow::LaunchOptions o;
                o.dialogTitle = "Audio Device Settings";
                o.dialogBackgroundColour = Colours::darkgrey;
                o.content.setOwned (new AudioDeviceSelectorComponent (engine.getDeviceManager().deviceManager,
                                                                      0, 256, 0, 256, true, true, true, false));
                o.content->setSize (400, 600);
                o.launchAsync();
                break;
            }

            default:
                break;
        }
    }
};
Now, in the Application class’s initialise() function, change the line to read:

mainWindow.reset (new MainWindow ("MainComponent", new MainComponent(engine), *this));

In MidiRecordingDemo.h, make the te::Engine member variable a reference, and modify the constructor to receive the reference and initialize the variable:

MidiRecordingDemo(te::Engine& e) : engine(e)

Finally, in the MidiRecordingDemo class function createOrLoadEdit, add these lines after the edit is created, so we have clicks to follow:

edit->clickTrackEmphasiseBars = true;
edit->clickTrackEnabled = true;
edit->clickTrackRecordingOnly = true;

To see the “floating” midi, run the program and “File/Open” the demo from the MenuBar. Record midi to the click. It should match with the click. Close the program.

Now, run the program again and choose “Settings” from the MenuBar. Select a device other than the one you want. Then, change it back to the desired device. After this you can select File/Open from the MenuBar.

Add a new track, and record midi along with the click. The newly recorded midi will be off by plus or minus a beat or two. It will not match the first midi you recorded and will not match the click.

My workaround is to simply not change the device settings until the edit is open. There is probably a better way.

And this definitely doesn’t happen if the Engine is a member of the MidiRecordingDemo?

Apart from the question above, I’d swear this was a problem with the sound card just not reporting its latency correctly. Without running a loopback test to measure this and correctly apply it, it sounds like the recording will just be off.
What buffer size are you using? Is the difference proportional to the buffer size? I.e. does choosing a really small buffer size give a bigger drift?

I will do the test you suggest.

The unmodified MidiRecordingDemo always records midi correctly.

But, please note that I am not changing the buffer size in either instance. It is simply the default buffer size (whatever that happens to be).

The only difference is that, using the modified MidiRecordingDemo, if you choose the device before the edit is opened, you get floating recorded midi. If you choose the device after the edit is open, the recorded midi is correctly locked to the timeline.

And, yes, it is the same selected device in both instances (with default settings). And the results are the same on two very different systems.

So, the question is, why is the midi locked to the timeline when the device is selected after the edit is open, but the midi floats when the device is selected before the edit is open?

I have done the test by selecting the device before the edit is open and with buffer sizes of 144 and 480. Both are about the same in terms of the midi float in relation to the timeline.

If I select the device after the edit is open, the buffer size makes no difference, the midi is correct in relation to the timeline.

Do I understand it right that in your failing case you’re creating the Engine object as a global variable? It’s a really bad idea to let the compiler statically construct such a complex object, as the order of construction of your program’s static variables is totally random, so the Engine class could be indirectly using all kinds of other statics from the rest of the program (and inside many JUCE classes) before they’re ready. Anything could happen!

If that is what you’re doing, it’d be much safer to create it either in your main() function (if it’s a command-line app) or in your juce::JUCEApplication::initialise() method if it’s a juce app. You could have a global std::unique_ptr<te::Engine> if you need it to be a global.

@dave96 Thinking about how we could prevent people doing silly stuff like that, I wonder if the Engine should do almost nothing in its constructor, and have an initialise() method that must be called before using it?