How to access the global time in Tracktion

Well, I’d put a breakpoint in the two PlayHead::play methods, which both contain the line speed = 1;, to see if they’re called. Maybe also break in stop (where it’s set back to speed = 0) in case some of your code is stopping the playhead.

I can’t see how you can get any playback if speed == 0.

Hi Dave, I’ve been away for a few days but returning to this now.

I’ve managed to fix the problem with the playhead being turned off, but getPosition() is still returning 0.0.

The playhead problem: PlayHead was being started, but then stopped again, from within my prepareToPlay() function, which did this:

void prepareToPlay(double sampleRate, int expectedBlockSize)
{
    hostedAudioDeviceInterface.prepareToPlay(sampleRate, expectedBlockSize);
}

I looked at the documentation for HostedAudioDeviceInterface, and it said to not create that object directly. So instead, I now call engine.getDeviceManager().getHostedAudioDeviceInterface().prepareToPlay(sampleRate, expectedBlockSize);, and the PlayHead doesn’t get stopped.
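For anyone following along, the corrected call looks roughly like this (a sketch; it assumes `engine` is a tracktion_engine::Engine member of the processor, as in the thread's setup):

```cpp
// Sketch of the fix: go through the DeviceManager's shared
// HostedAudioDeviceInterface rather than constructing one directly.
// Assumes `engine` is a tracktion_engine::Engine member of this class.
void prepareToPlay (double sampleRate, int expectedBlockSize)
{
    engine.getDeviceManager()
          .getHostedAudioDeviceInterface()
          .prepareToPlay (sampleRate, expectedBlockSize);
}
```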

The new problem: now when PlayHead::getPosition() gets called, the referenceSampleRange atomic seems to be always set at 0. I can only see one place where referenceSampleRange is assigned to, which is inside setReferenceSampleRange(). This function is called when I call transport.play(true) from the AudioProcessor constructor, but indeed the range argument is 0. I assume that there is something else I am missing in the initialization. Can you advise?

Ok, so just to clarify: you’re saying this is a plugin using the Engine?
In a plugin, you would normally sync to the host time. Do you not want this behaviour?

Have you looked at the EngineInPlugin demo to see how this is set up? It might help you figure out where to prepare etc. at least?

Yes, it’s a plugin, but I’m running it as a standalone right now. The EngineInPlugin demo is where I got most of this setup from, but I’d forgotten about it, so thanks for reminding me.

Looking in there, I see that I was missing an ExternalPlayheadSynchroniser. I’ve added that to my class, and put the synchronize() function inside processBlock(). It’s still not working though. Looking inside the synchronize() function, I see that getPlayHead() is returning null. I guess that this means that the host hasn’t supplied it? Right now, it’s running with the JUCE_USE_CUSTOM_PLUGIN_STANDALONE_APP=1 wrapper thing. Does this not have that functionality?
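For reference, the shape of the setup being described is roughly this (a sketch based on my reading of the EngineInPlugin demo; the member names `edit` and `synchroniser` are my assumptions, not taken verbatim from it):

```cpp
// Assumed member, constructed from the processor's Edit:
// tracktion_engine::ExternalPlayheadSynchroniser synchroniser { edit };

void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midi)
{
    // Pull position/tempo from the host's juce::AudioPlayHead each block.
    // In the standalone wrapper, getPlayHead() returns nullptr, so this
    // has nothing to synchronise with -- which is the problem hit here.
    synchroniser.synchronise (*this);

    // ... normal engine processing follows ...
}
```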

Long term I’d like to be able to run it both as a plugin or as a standalone, which presumably means being able to switch between using Tracktion’s own timeline and syncing to the host. Short term I’m just looking to get it running again, so that I can continue debugging. So if you can tell me how to enable Tracktion’s own timeline, that will be very helpful.

The problem you are having is that the stand-alone wrapper does not supply a playhead.

What I resorted to when facing this same issue was to simply use the engine transport functions when running standalone, detecting which wrapper is being used at runtime. This required me to build an entire transport UI for my app, but it was pretty simple.

This sounds like exactly what I need. Do you have any examples I could follow?

The demos in the repository have great examples of manipulating the transport and tempo with a UI. Maybe take a look at the Playback demo. But if you have specific questions or you’re stuck, just ask.

The UI stuff isn’t an issue for me, but I still can’t work out how to get the playhead working, even after going through a lot of tutorials and documentation. I’m also getting a bit confused between juce::AudioPlayHead and tracktion_engine::PlayHead. Should I be trying to supply a juce::AudioPlayHead to the juce::AudioProcessor, then synchronizing it using tracktion_engine::ExternalPlayheadSynchroniser? Or should I be somehow notifying the Edit that it should be using its own timeline, not the external one?

I think the problem you’re running into is that you’re trying to use an ExternalPlayheadSynchroniser to synchronise with an empty juce::AudioPlayHead.

If you’re running standalone with no juce::AudioPlayHead being provided, you should just ignore the ExternalPlayheadSynchroniser and use the te::TransportControl to start/stop playback etc.

The tracktion_engine::PlayHead is really an internal class that you shouldn’t need to access directly.
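One way to branch between the two modes (a sketch, not a definitive implementation: it uses juce::AudioProcessor’s public `wrapperType` member, and assumes `transport` is the te::TransportControl from the Edit, as used elsewhere in this thread):

```cpp
// Sketch: drive the Engine's own transport when no host playhead exists.
// Runs inside the juce::AudioProcessor; `transport` is assumed to be a
// te::TransportControl& obtained from the Edit.
if (wrapperType == juce::AudioProcessor::wrapperType_Standalone
     || getPlayHead() == nullptr)
{
    // No host timeline: control playback ourselves.
    transport.position = 0.0;
    transport.play (false);
}
else
{
    // The host supplies a juce::AudioPlayHead: skip the calls above
    // and let an ExternalPlayheadSynchroniser follow it instead.
}
```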

Thanks for your help Dave. I’m edging closer to a solution, but I’m still not quite there yet. I’m fairly sure that the problem is with how I’m initializing the engine and the plugins, rather than with the synchronization stuff.

I found that I was missing two important lines of code in the AudioProcessor constructor:

deviceInterface.initialise({});
transport.ensureContextAllocated(true);

This has fixed the problem where referenceSampleRange inside PlayHead was 0, but now I’m back to speed being 0 again!

Debugging the play and stop functions as before, I find that the stop function is being called from within my prepareToPlay() function:

void prepareToPlay(double sampleRate, int expectedBlockSize)
{
    setLatencySamples(expectedBlockSize);
    
    deviceInterface.prepareToPlay(sampleRate, expectedBlockSize);
}

If I comment out the last line there, the PlayHead doesn’t get stopped and I can finally access the getPosition() function, as desired. However, getting rid of this line obviously isn’t the right thing to do, and I immediately hit some assertions elsewhere.

So at this point, my question is: why is deviceInterface.prepareToPlay(sampleRate, expectedBlockSize); stopping the PlayHead, and what can I do to restart it?

For reference, my AudioProcessor constructor now looks something like this:


    addPlugins();
    deviceInterface.initialise({});
    transport.ensureContextAllocated(true);
    transport.setLoopRange(tracktion_engine::Edit::getMaximumEditTimeRange());
    transport.looping = true;
    transport.position = 0.0;
    transport.play(false);

Well, if you change the sample rate, the whole EditPlaybackContext (which includes the PlayHead) will be destroyed and re-created.

Presumably if you’re running standalone you have some stop/start/position controls in your UI?
Either way, after deviceInterface.prepareToPlay, you probably want to set the position and call transport.play again (you might also need to call transport.ensureContextAllocated(true); first).

Does that make sense?
I think you’re assuming that the playhead is always running, but if you think about a DAW (which is what Tracktion Engine is), that’s not usually the case.

You’re right that in my app the timeline is always on, and that has probably clouded my thinking. Anyway, I have finally got it working, with the following code:


void prepareToPlay(double sampleRate, int expectedBlockSize)
{
    setLatencySamples(expectedBlockSize);
    
    deviceInterface.prepareToPlay(sampleRate, expectedBlockSize);

    transport.setLoopRange(tracktion_engine::Edit::getMaximumEditTimeRange());
    transport.looping = true;
    transport.position = 0.0;
    transport.stop(false, false);
    transport.play(false);
}

It seems kind of disturbing having prepareToPlay() calling play(), but it works.

In tracking down the problem, I think I found a bug that might be of interest to you. While I was initializing it from the constructor, the transport was being stopped by some callback, probably coming from the deviceInterface.initialise({}); line. However, this reset seemed to happen internally, bypassing the ValueTree “playing” property. This meant that subsequent calls to transport.play() were dropped, because the ValueTree would filter them out (i.e. it thought the property hadn’t changed, so it didn’t bother sending a message to the listeners). That’s why I’m having to call stop() and then play() inside prepareToPlay(), so that it definitely gets the message.

Anyway, thanks to all your help, I’m back in business!

Ok, thanks for the bug report. I think I’m going to be refactoring that transport state stuff in the not too distant future, so I wouldn’t worry about it for now. Cheers though.