InputDeviceInstance as input for multiple tracks

Can I use an InputDeviceInstance as the input source for more than one track?

No. Currently an input can only be on one track. For MIDI, you can create a virtual MIDI input and put it on a second track.
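For reference, a minimal sketch of creating such a virtual MIDI input. This assumes DeviceManager::createVirtualMidiDevice is the right call in your engine version; check DeviceManager.h before relying on it.

```cpp
// Sketch: add a virtual MIDI input so the same MIDI source can also feed a second track.
// Assumes an existing tracktion_engine::Engine; verify createVirtualMidiDevice() against
// the DeviceManager in your engine version.
namespace te = tracktion_engine;

void addVirtualMidiInput (te::Engine& engine, const juce::String& name)
{
    auto& deviceManager = engine.getDeviceManager();
    deviceManager.createVirtualMidiDevice (name); // the new device then appears as a normal MIDI input
}
```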

What a pity! Anyway, thank you!

But, for example, can I route an audio track to others, using that one as the input instance?

Or maybe I could do some hack to have an array of input device instances?

Are you trying to record the same thing on multiple tracks at once? There is currently no way to do that. If you just want the same audio on each track to run through plugins, maybe use a send?

Thank you very much for your replies, and sorry for my many posts! You're right, sends are a good solution (in fact, how they work is another thing I'd like to understand; maybe it could be the subject of another topic). But my purpose is to create an app for live performance, a sort of MainStage where many tracks might share the same audio input but have a different plugin configuration. So I could, for example, have:
GuitarRiffTrack
GuitarRhythmTrack
GuitarSoloTrack
and switch between those patches or play two of them simultaneously (take the case of a keyboard player who has PIANOTRACK, ORGANTRACK and STRINGSTRACK and could switch between them, or play PIANOTRACK and STRINGSTRACK simultaneously).

There is no one good solution to your issue.

For audio, a send would probably work best. Put the input on track 1 with an aux send, and lower the track's level to 0. Then put a return on each track that has effects.
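As a rough sketch, that routing could be set up something like this (untested; the plugin-cache calls follow the engine tutorials, but treat the AuxSendPlugin/AuxReturnPlugin member names as assumptions and check your engine version):

```cpp
// Sketch: duplicate one input track's signal onto several effect tracks using an
// aux send/return pair on a shared bus. Assumes an existing te::Edit.
namespace te = tracktion_engine;

void routeInputToEffectTracks (te::Edit& edit, te::AudioTrack& inputTrack,
                               const juce::Array<te::AudioTrack*>& effectTracks,
                               int busNum)
{
    // Aux send at the end of the track that owns the physical input
    auto sendPlugin = edit.getPluginCache().createNewPlugin (te::AuxSendPlugin::xmlTypeName, {});

    if (auto* send = dynamic_cast<te::AuxSendPlugin*> (sendPlugin.get()))
        send->busNumber = busNum;                       // member name assumed; check AuxSendPlugin.h

    inputTrack.pluginList.insertPlugin (sendPlugin, inputTrack.pluginList.getPlugins().size(), nullptr);

    // Pull the input track's own fader right down so only the returns are heard
    if (auto* vol = inputTrack.getVolumePlugin())
        vol->setVolumeDb (-100.0f);

    // Aux return at the start of every track that hosts an effect chain
    for (auto* t : effectTracks)
    {
        auto returnPlugin = edit.getPluginCache().createNewPlugin (te::AuxReturnPlugin::xmlTypeName, {});

        if (auto* ret = dynamic_cast<te::AuxReturnPlugin*> (returnPlugin.get()))
            ret->busNumber = busNum;                    // member name assumed; check AuxReturnPlugin.h

        t->pluginList.insertPlugin (returnPlugin, 0, nullptr);
    }
}
```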

For MIDI, create a virtual input for each track.

Neither is an ideal solution. We have plans to refactor the input device classes, but haven't scheduled yet when we will get to that.

The other thing you may want to look at is HostedAudioDeviceInterface. It’s a class that makes a fake audio device to pass to the engine and then you can feed it audio manually. You could make one of these and then duplicate the audio channels before passing it to the engine. @dave96 is back from holiday on Monday, he may have a better solution for you.
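To make that a bit more concrete, here is a very loose sketch of the idea. The initialise/prepareToPlay/processBlock calls follow how the class is used when hosting the engine inside a plugin, but the Parameters fields should be checked against HostedAudioDeviceInterface.h in your engine version.

```cpp
// Sketch: present a "fake" audio device to the engine via HostedAudioDeviceInterface
// and push audio into it manually, having already copied the physical input to as
// many channels as you need.
namespace te = tracktion_engine;

struct ManualAudioFeed
{
    ManualAudioFeed (te::Engine& engine, double sampleRate, int blockSize)
        : audioInterface (engine.getDeviceManager().getHostedAudioDeviceInterface())
    {
        te::HostedAudioDeviceInterface::Parameters params;
        params.sampleRate = sampleRate;
        params.blockSize  = blockSize;
        // Parameters also lets you choose how many input/output channels the engine
        // should see; the exact member names may differ between versions.

        audioInterface.initialise (params);
        audioInterface.prepareToPlay (sampleRate, blockSize);
    }

    // Call this from your audio callback with a buffer that already contains the
    // duplicated input channels; the engine sees each channel as a separate input.
    void process (juce::AudioBuffer<float>& duplicatedInput, juce::MidiBuffer& midi)
    {
        audioInterface.processBlock (duplicatedInput, midi);
    }

    te::HostedAudioDeviceInterface& audioInterface;
};
```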

As I understand it, an AudioDeviceInstance is a bridge between the track and the physical audio source, and the tracks must each own one of those AudioDeviceInstances, of which there are as many as there are physical audio sources. Is that right?

If that's so, maybe the AudioDeviceInstance should be a property of AudioTrack (since it is responsible for live monitoring of the track), and it could share its "InputDevice& owner;" property with other tracks via a method like "setInputDevice(InputDevice& inputDevice) { owner = inputDevice; }", so the user could iterate through the InputDevices and assign one of them to the AudioTrack's AudioDeviceInstance.

What do you think about it?

Sorry, I've looked around HostedAudioDeviceInterface but I don't understand how to put your idea into practice :disappointed_relieved:

The audio device instances are owned by the edit; there is one in each edit for each physical input. An instance is then assigned to a track. This is the oldest code in the engine and needs to be refactored. One issue is that inputs have settings, like MIDI filters, trigger levels, etc. If you assigned the same input to two tracks, would you expect those settings to be shared, or to be unique to each track?
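For reference, the assignment itself looks roughly like this, following the pattern in the engine's RecordingDemo. The call has changed name and signature between engine versions (setTargetTrack here, setTarget in newer code), so check InputDeviceInstance in yours.

```cpp
// Sketch: the edit owns one InputDeviceInstance per physical input; point the first
// wave input at a given track. Assignment call assumed per the RecordingDemo pattern.
namespace te = tracktion_engine;

void assignFirstWaveInputTo (te::Edit& edit, te::AudioTrack& track)
{
    for (auto* instance : edit.getAllInputDevices())
    {
        if (instance->getInputDevice().getDeviceType() == te::InputDevice::waveDevice)
        {
            instance->setTargetTrack (track, 0, true); // older API; newer versions use a setTarget() style call
            break;
        }
    }
}
```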

They should be unique, because a musician would need different settings on the tracks his signal passes through.

Sorry for yet another comment on this topic, but I'm trying every way I can to find a solution to this problem!

I think I've solved the problem using AuxSendPlugin and AuxReturnPlugin, but this means I need a lot more tracks with audio signals routed among them, and I'd like to ask you, who are more experienced, whether this could have a bad impact on my app's performance.

Let's say I'm using AudioTracks more as nodes, so a lot of their internal properties go unused; a track may exist only to take a signal from an InputDevice and route it via some AuxSendPlugins to other AudioTracks.

Furthermore, I think I could run into some limitations of tracktion_engine, such as static const int maxPluginsOnTrack = 16;. Is there a way to increase this static const int without modifying code in the framework?

I also thought about implementing my own AudioTracks by inheriting directly from Track, but looking at the source code I think I would also have to change the Engine and Edit classes, because everything there refers to AudioTrack. What about this?

Thank you again!

I'm struggling a bit to visualise what you're trying to achieve, which makes it quite hard to give accurate advice…

The maxPluginsOnTrack value is fairly arbitrary and we could increase it or make it user-definable, but it's really there as a sanity check, as adding more than 16 plugins to a track usually indicates improper use of something.

You can't really create your own AudioTracks; there's far too much internal work needed to get them playing nicely with the rest of the Engine. I also can't see what kind of things you'd need to customise them for?

Hi! Thank you for your reply!

Let me explain.

I’d like to have:

As many "InputAudioTrack"s (tracks that only take audio input from my audio interface) as there are audio interface inputs connected to my PC (they can't send audio to the audio interface's output devices).

As many "InputMidiTrack"s (tracks that only take MIDI input messages) as there are MIDI input ports physically connected to my PC (or created through the IAC Driver on Mac).

Some "MainAudioTrack"s: tracks that only take an audio signal from a single "InputAudioTrack", though more than one "MainAudioTrack" could take its signal from the same "InputAudioTrack" (a "MainAudioTrack" can't send audio directly to a physical output device's port).

Some "MainInstrumentTrack"s: tracks that only take a MIDI signal from a single "InputMidiTrack" and whose output can be audio produced by an instrument VST and/or MIDI messages, though more than one "MainInstrumentTrack" could take its signal from the same "InputMidiTrack" (an "InputMidiTrack" can't send messages directly to a physical MIDI output port).

As many "OutputAudioTrack"s as there are audio interface outputs connected to my PC: tracks that only take audio from a "MainInstrumentTrack" or "MainAudioTrack" ("MainInstrumentTrack" and "MainAudioTrack" can't be connected to output devices).

As many "OutputMidiTrack"s (tracks that only take MIDI messages from a "MainInstrumentTrack" and can send messages to only one physical MIDI output port) as there are physical MIDI output ports connected to my PC (or created through the IAC Driver on Mac).

I was thinking of doing this through AuxSendPlugin, AuxReturnPlugin and tracktion_engine's virtual MIDI ports, but that way an "InputAudioTrack" could be connected to maybe 20 or 25 "MainAudioTrack"s (my app is for live use, like MainStage: a guitarist, for example, could have 20 different sounds for a show, each one a track with some plugins inside, so he could switch between them or maybe play two or three tracks simultaneously), and so the maxPluginsOnTrack limit can easily be exceeded (adding the possibility to make it user-definable would be a great idea, maybe with a method like "useDefaultLimitation(bool)" or something like that).

I'd also like to ask you, who are certainly more experienced than me, whether this approach could demand too many system resources, because it's a bit unusual compared to the way the tracktion_engine framework was designed.

Really thank you!

In practice, the most CPU-hungry things in an Edit are plugins; almost everything else is just routing audio buffers around.

It seems to me, though, that you're overthinking your Edit construction. I'm not sure why you need all the Input/Output/Audio/MIDI tracks. Why not just create a track for each "sound" and then assign the relevant inputs/outputs to it? If you need the same input to feed more than a single track's plugins, use an aux send/return pair or a Rack to duplicate the signal.

I can kind of see why you think having dedicated tracks for all the inputs and outputs would be cleaner but I can see this being problematic when different audio interfaces or computers are used. The Edit files that this system would produce wouldn’t really be portable between machines.

Having said that, there’s no inherent “problem” with doing it that way, it’s just probably not how I’d approach it. But then I don’t really know what your product design looks like so it’s difficult to judge.

Yes, I know it's impossible to create a track for each input and each output in advance; they should be created dynamically by the user, and if he connects another audio interface that has fewer inputs than the tracks he created, the excess tracks simply wouldn't have an input (I explained it that way before to keep things simple).

So if VolumeAndPanPlugin, LevelMeterPlugin, AuxSendPlugin and AuxReturnPlugin are really not bad for the CPU, I could use them (maybe there will be 20 aux sends on a track :scream:).

What do you think about the limit on plugins, and at this point on tracks too? (I know I seem mad, but I build live patches for bands every day as my job, and some real-life projects have become really big.)

Really thank you!

What I mean is that the internal plugins (vol/pan, level meter, aux) really don't do much. They mostly just copy one buffer to another (the level meter perhaps does a bit more work). Compare this to even a basic synth and the CPU load is nowhere near it.

What would worry me in this situation, if you have a possible send for every combination of tracks, is that most of the time they're just sitting there doing nothing, and they don't scale well. If you can add and remove them dynamically as part of your app, then that's a much better approach.
Or, you could use a Rack and internally change the routings to get the signal flow you want.
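As a tiny sketch of the dynamic approach mentioned above (deleteFromParent() is assumed to be the right removal call; check the Plugin class in your engine version):

```cpp
// Sketch: only keep AuxSendPlugins around while their routing is actually in use,
// removing them again when a patch is deselected.
namespace te = tracktion_engine;

void removeSendsForBus (te::AudioTrack& track, int busNum)
{
    juce::Array<te::AuxSendPlugin*> sendsToRemove;

    // Collect first, then remove, so the plugin list isn't modified while iterating
    for (auto* p : track.pluginList.getPlugins())
        if (auto* send = dynamic_cast<te::AuxSendPlugin*> (p))
            if (send->busNumber == busNum)              // member name assumed; check AuxSendPlugin.h
                sendsToRemove.add (send);

    for (auto* send : sendsToRemove)
        send->deleteFromParent();                       // removal call assumed; verify in your version
}
```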


I think what you need to think about is how your app is going to look to the user, and then what changes in the app's interface mean in terms of Edit manipulation. For example, adding a track in your app's UI could do a whole bunch of stuff in the Edit under the hood.

You kind of need to separate out the interaction from the Engine implementation.


I certainly wouldn't worry about optimising it until you've got something working though.
The beauty of designing your UI separately, perhaps with an API over the Tracktion Engine operations, is that you could even change how it manipulates an Edit without changing all your UI, etc.

OK, I'll do some experiments. Thank you very much!

If I understand correctly, all these extra tracks and routing issues are a result of trying to work around the fact that one input can’t be added to multiple tracks, correct? If we solved that, it would solve your other issues?
