Ableton Link Tutorial - How to build and some tips

Hey guys,

After using Link in my projects for a while, I decided to update the example I posted here while I was learning it. It has been improved somewhat in both accuracy and simplicity. I have also included a full guide with some tips on how to use Link in your own projects. Here it is:


Cheers,
Ian.


Hey Ian, thank you for this great resource. I’m playing around with your code and while it works great, I noticed that if I set up the device with audio inputs (i.e. setAudioChannels (2, 2); instead of setAudioChannels (0, 2);), then the timing is slightly out of sync with other peers in the session. There’s a small but noticeable lag that gets added. I wonder if you have any insight on how to compensate for that in your code?

Thank you!
Julien

First, measure the lag. If it is constant (i.e. not jittery), then the audio device is probably introducing some additional latency, which means you will need to add those extra samples to the value used to calculate the final output latency.

I vaguely remember that the resource I made only estimates the output latency; since then, JUCE has improved its latency reporting, which I now use in my current projects. You will have to look at how to get these latency values from Tracktion instead of simply using the current audio block size, which does not account for input latency.

E.g., in my current project, when an audio device is about to start, I calculate the latency and store it in a custom “Link” object’s “out_latency_us” member:

    class Link : public ableton::Link
    {
        friend class DeviceManager;
        
    public:
        //======================================================================

        using us = std::chrono::microseconds;
        using SessionState = ableton::Link::SessionState;
        
        //======================================================================

        static constexpr const double
            quantum = 4.;

        //======================================================================

        Link() noexcept : ableton::Link{ 120. } {}
        auto out_time_us() const noexcept -> us { return out_time_us_; }
        
    private:
        //======================================================================
        
        ableton::link::HostTimeFilter<ableton::link::platform::Clock>
            host_time_filter;
        
        us
            out_time_us_{},
            out_latency_us{};
        
        JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (Link)
    };
    

When the audio device is about to start, I simply ask the JUCE API for the device's output latency, as opposed to the approach in the resource you were using, where I used the number of samples in the current audio block. That is probably what's causing your problem, since that value may not account for input latency. This is where you have to find out how to get these values out of the Tracktion API, which accounts for both input and output latency:

    void DeviceManager::audioDeviceAboutToStart (AudioIODevice* device)
    {
        const auto output_latency_samples = device->getOutputLatencyInSamples();
        const auto sample_rate = device->getCurrentSampleRate();
        const auto out_latency_sec = output_latency_samples / std::max (1., sample_rate);
        link.out_latency_us = Link::us{ std::llround (1.0e6 * out_latency_sec) };
    ...

and then in the audio callback you simply have:

    void DeviceManager::audioDeviceIOCallback (const float** in_data, int num_ins,
                                               float** out_data, int num_outs, int num_samples)
    {   // Audio thread

        // Your internal PlayHead sample position, which needs to be offset
        // as it is currently out of sync:
        const auto time_sn = processor.refer_engine_play_head().time_sn;

        // The conversion from samples to microseconds, calculated by the
        // host time filter:
        const auto host_time_us = link.host_time_filter.sampleTimeToHostTime (time_sn);

        link.out_time_us_ = host_time_us + link.out_latency_us;
    ...

Ok, now we have out_time_us stored. You will use this value all over Link's API to query the current beat, bpm, phase, etc., or to request changes to tempo or playback state, wherever you need it in your audio chain.

Example calls

    // Capture the session.
    // (If you must capture the session from a non-audio thread, use
    // captureAppSessionState() instead, while also ensuring that the
    // out_time_us value is retrieved in a thread-safe and realtime-safe manner.)
    auto session = link.captureAudioSessionState();

    // Get the beat:
    const auto beat = session.beatAtTime (link.out_time_us(), link.quantum);

    // Get the phase:
    const auto link_phase = session.phaseAtTime (link.out_time_us(), link.quantum);

    // Make an initial sync request call. `ph` is your internal playhead object
    // that stores the current bpm, position, beat, etc.
    const auto ph = get_internal_play_head();
    session.setTempo (ph.bpm, link.out_time_us());
    session.setIsPlaying (ph.is_playing, link.out_time_us());
    session.requestBeatAtTime (ph.beat, link.out_time_us(), link.quantum);

    // Don't forget to commit the modified state back to the Link session,
    // otherwise the requests above have no effect:
    link.commitAudioSessionState (session);

P.S. Any resulting latency measured between connected peers using this method should be below 3 ms; anything more should be considered an error.


Ian, this is great stuff; I just wanted to say thanks. Implementing a Link-enabled transport had been on my to-do list for quite a while, and your examples and sharing your experience have saved me tons of headaches.
