High MIDI Latency with External Hardware in Tracktion Engine (Mac, Low Buffer Size)

Hi all,

I’m experiencing unexpectedly high latency (~100ms) when using external MIDI hardware with Tracktion Engine, even at low buffer sizes (64 or 128).

I’ve created a simplified test app that:
• Loads a 4OSC instrument on a track
• Enables MIDI monitor mode on all MIDI input devices
• Routes MIDI input to that track

When playing notes from an external keyboard, the response feels pretty squishy; the exact same hardware setup in GarageBand or Reaper is very tight.

Does anyone have any advice on how to reduce this latency?

I’ve created a repo for the test project: GitHub - simonadcock/tracktion-midi-latency-test

In summary:

MainComponent::MainComponent() : engine("Latency Test")
{
    // Audio setup
    if (juce::RuntimePermissions::isRequired(juce::RuntimePermissions::recordAudio)
        && ! juce::RuntimePermissions::isGranted(juce::RuntimePermissions::recordAudio))
    {
        juce::RuntimePermissions::request(juce::RuntimePermissions::recordAudio,
                                           [&] (bool granted) { setAudioChannels(granted ? 2 : 0, 2); });
    }
    else
    {
        setAudioChannels(2, 2);
    }

    // Create an edit and track
    edit = tracktion::engine::createEmptyEdit (engine, {});
    edit->ensureNumberOfAudioTracks(1);
    auto track = tracktion::engine::getAudioTracks(*edit)[0];

    // Add 4OSC synth
    if (auto plugin = edit->getPluginCache().createNewPlugin(tracktion::engine::FourOscPlugin::xmlTypeName, {}))
        track->pluginList.insertPlugin(plugin, 0, nullptr);

    // Enable MIDI input and routing
    for (auto& midiIn : engine.getDeviceManager().getMidiInDevices())
    {
        midiIn->setMonitorMode (tracktion::engine::InputDevice::MonitorMode::on);
        midiIn->setEnabled(true);
    }

    edit->getTransport().ensureContextAllocated();

    for (auto instance : edit->getAllInputDevices())
    {
        if (instance->getInputDevice().getDeviceType() == tracktion::engine::InputDevice::physicalMidiDevice)
        {
            instance->setTarget(track->itemID, true, &edit->getUndoManager(), 0);
            instance->setRecordingEnabled(track->itemID, true);
        }
    }

    edit->getTransport().play(false);
}

System Info

  • MacBook Pro 2021 (Apple Silicon)
  • macOS + CoreAudio
  • JUCE 8.0.6
  • Tracktion Engine 3.1.0
  • Xcode 16.2

Thanks in advance!
Simon

My guess is that this could be the fault of the JUCE CoreAudio handling - if you open both an input and an output device, it adds some extra latency to sync the two devices up. A quick way to check is to run it with only an output device and no input channels open - if that’s better, then this is your problem.

Thanks for the quick reply, Jules.

I tried disabling the input channels using setAudioChannels (0, 2), but it didn’t make a noticeable difference.

I also tested with a Focusrite Scarlett interface to rule out the built-in audio device, but this also had the issue.

I’ve just built a similar test app using only JUCE (routing MIDI input to a manually generated square wave), and that produced tight playback - so the issue seems isolated to the Tracktion Engine. I’ve tested this with 4OSC, the Tracktion Sampler plugin, and several VST instruments, and all of them have a similar level of latency (~100ms with a buffer size of 64).
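Roughly what that JUCE-only test does (a simplified sketch, not the exact repo code - the class and member names here are just for illustration):

```cpp
// Simplified sketch of the JUCE-only latency test: enable all MIDI inputs,
// and render a naive square wave in the audio callback while a note is held.
#include <JuceHeader.h>

class SquareWaveTest  : public juce::AudioAppComponent,
                        private juce::MidiInputCallback
{
public:
    SquareWaveTest()
    {
        setAudioChannels (0, 2); // output only, no input channels

        for (auto& device : juce::MidiInput::getAvailableDevices())
            deviceManager.setMidiInputDeviceEnabled (device.identifier, true);

        deviceManager.addMidiInputDeviceCallback ({}, this); // empty identifier = all enabled inputs
    }

    ~SquareWaveTest() override   { shutdownAudio(); }

    void prepareToPlay (int, double newSampleRate) override   { sampleRate = newSampleRate; }
    void releaseResources() override {}

    void getNextAudioBlock (const juce::AudioSourceChannelInfo& info) override
    {
        info.clearActiveBufferRegion();

        auto note = currentNote.load();

        if (note < 0)
            return;

        auto samplesPerCycle = sampleRate / juce::MidiMessage::getMidiNoteInHertz (note);

        for (int i = 0; i < info.numSamples; ++i)
        {
            auto value = std::fmod (phase, samplesPerCycle) < samplesPerCycle * 0.5 ? 0.2f : -0.2f;
            phase += 1.0;

            for (int ch = 0; ch < info.buffer->getNumChannels(); ++ch)
                info.buffer->setSample (ch, info.startSample + i, value);
        }
    }

private:
    void handleIncomingMidiMessage (juce::MidiInput*, const juce::MidiMessage& m) override
    {
        if (m.isNoteOn())
            currentNote = m.getNoteNumber();
        else if (m.isNoteOff() && m.getNoteNumber() == currentNote.load())
            currentNote = -1;
    }

    std::atomic<int> currentNote { -1 };
    double sampleRate = 44100.0, phase = 0.0;
};
```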

Well, I can say for certain that the engine will run with optimal latency, because I used it in this:

This is our live MIDI playback app, so obviously we double-checked everything latency-related for it.

The reason I suggested the CoreAudio thing above is that when doing StageBox, we did initially have mysterious latency issues, and tracked it down to the JUCE CoreAudio implementation. Luckily StageBox doesn’t need audio in, and with the input devices disabled, everything works nicely.

TBH, looking at your code, you’re using the juce::AudioAppComponent, and I don’t trust that to not interfere with Tracktion’s device manager control. Before deciding that the problem isn’t what I suggested above, I would recommend putting a breakpoint in

and having a look at what CoreAudioIODevice is doing, because that’s the culprit that adds a buffer.

The correct way to avoid opening an input is to create a custom tracktion::engine::EngineBehaviour and implement the shouldOpenAudioInputByDefault() method. That’s what I do in StageBox, and it makes sure the whole engine is aware of what’s going on.
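Something like this (a quick sketch from memory - double-check the names against the EngineBehaviour header):

```cpp
// Sketch: an EngineBehaviour that tells the engine not to open any audio inputs.
struct NoInputEngineBehaviour  : public tracktion::engine::EngineBehaviour
{
    bool shouldOpenAudioInputByDefault() override   { return false; }
};

// ...then pass it in when constructing the Engine, rather than using the
// name-only constructor:
tracktion::engine::Engine engine { "Latency Test",
                                   nullptr, // use the default UIBehaviour
                                   std::make_unique<NoInputEngineBehaviour>() };
```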

..and actually, while I’m here in the forum, and some of the JUCE devs might be reading, this CoreAudio thing is a bit of an issue for us, because it means that in e.g. Waveform, the live latency might be crap, depending on which audio devices the user has open.

An option for us in Tracktion is to abstract out the audio device layer, so we can replace the JUCE device i/o with my choc RtAudio stuff. But it would be easier if the JUCE code could be improved to solve the issue. (And to be fair, it was me that originally wrote the class that’s at fault.. I think I always planned to one day go back and tighten it up..)

That’s .. interesting .. did choc grow out of a fork of JUCE, or was it more just work you did on the side after JUCE, which grew into a more functional framework?

It’s completely unrelated to JUCE, it’s just some free stuff I’ve released over the last few years.

Ok thanks Jules. I’ll do some more digging.