Graph node and player confusion

I’m really stumped on how to use graph nodes. I see that I need to create a node player to play back my custom node. But how do I add my node player to the graph? In other words, how do I tell the graph about my node player?

I think you might have it backwards: you give your NodePlayer a root Node (which is the root of a graph of Nodes if there is more than one) and then use the NodePlayer to play it back.

Have a look at the tests in the module as most of them use a NodePlayer (admittedly wrapped in a lot of test boilerplate) to process the audio and test the output.

Maybe I’m asking the question wrong. I see in the tests you are subclassing Node, but I don’t see where any are actually being used by anything in the module. For example, ForwardingNode, SinkNode, MidiNode, etc. How do you actually implement a node in the Edit? I can’t get away from thinking you add them to a track like a plugin.

In the graph docs (part 2) you indicate:
You build a graph of nodes and then initialise and play back the graph.
Where are you initializing and playing back the graph?

Are you creating and manipulating an Edit and want to play it back? If so, you don’t need to know anything about the Node and NodePlayer classes or the tracktion_graph module really, it’s all handled internally.

I thought you were building your own app with the tracktion_graph Node classes but that doesn’t sound like the case now?

Perhaps I can ask what your high level task is?

Well I would just like to abuse it :slight_smile:
I had great success with the old AudioNode class but after updating to the latest tip a lot has changed. It seems that using the new graph is no longer optional (though still experimental I guess).
I was doing non-linear playback with a combination of my own plugin on each track that played a custom AudioNode. Now the Plugin class has its own PluginRenderContext and no longer uses AudioRenderContext, so some of that functionality has also changed.
So I’m doing a bit of refactoring, and before I dive in I wanted to fully understand the new graph engine, hopefully with some examples of how it’s used outside of the Engine.

Well, you can still use the old AudioNode class, you’d just have to create an AudioRenderContext to process it. The problem was it was limited in two main ways: 1. it didn’t utilise multiple cores that well, 2. it wasn’t possible to get perfect PDC (plugin delay compensation) in all routing situations.

The new graph::Node framework solves both of these problems, but it does come at the price that it’s a bit harder to set up and render. It’s not considered experimental anymore. I don’t think there are any references to “experimental” left in the Engine, are there?

No, I think I may have seen “experimental” in the docs. But those may be slightly out of date.

I get an “incomplete type” error when trying to create or use AudioRenderContext unless I include tracktion_AudioNode.h. But more than that, I was also abusing the MidiAudioNode class as well. It no longer exists, which brought me down the graph rabbit hole.

Ah ok, yes I need to update the docs but I’m in the middle of refactoring a bit so will do it after that.

TBH if you’re already using MidiNode and it’s working for you I would probably just take a copy of AudioNode and MidiNode and put it in your own namespace. Then you know it won’t change underneath you.

Absolutely, I hadn’t considered that.
Thanks