Can Unity on Windows work with an external audio interface to load multi-track audio?

I am developing software that captures audio from multiple microphones through an external audio interface and processes it in real time.
I want to know whether this is feasible in Unity.

If you have any experience or knowledge, please help me.

You can integrate JUCE with Unity - it's a lot of work, but it can be done.

But you do know this is a JUCE forum, right? Unity support forum is that way: https://forum.unity.com

Yes, I know.
I am a veteran Unity developer.
However, I asked this question here because I am not familiar with JUCE or audio processing.
To be specific, does that mean JUCE’s current Unity plug-in can capture multi-track audio from an external audio interface?

“Load multitrack audio” or “stream it through a process graph for further processing”?

If you want to load multitrack audio files, I suggest you look at Tracktion Engine, which builds on top of JUCE and gives you DAW-like features such as edits, clips, project files, etc.

If all you want is audio I/O processing capabilities, perhaps a little more sophisticated than the base audio APIs for each OS you want to target, then yes - JUCE will give you access to all channels on a multi-channel audio interface, for processing.

The purpose of JUCE is to provide high-performance audio I/O processing - multichannel, with configurable processing graphs through plugins, effects, etc. - out of the box. If you want more sophisticated, DAW-like features (saving/loading multi-channel clip/edit-based project files), then adding Tracktion on top of JUCE is the way to go.
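
To make that concrete, here’s a rough sketch of what multi-channel input capture looks like at JUCE’s device layer (illustration only, not tested; MultiMicCapture and processChannel are placeholder names, and the callback signature is the one used by recent JUCE versions):

```cpp
// A minimal sketch, assuming a recent JUCE version and its usual module headers.
// "MultiMicCapture" and "processChannel" are placeholder names for this example.
#include <juce_audio_devices/juce_audio_devices.h>

class MultiMicCapture : public juce::AudioIODeviceCallback
{
public:
    void audioDeviceIOCallbackWithContext (const float* const* inputChannelData,
                                           int numInputChannels,
                                           float* const* outputChannelData,
                                           int numOutputChannels,
                                           int numSamples,
                                           const juce::AudioIODeviceCallbackContext&) override
    {
        // Each input of the interface arrives as its own channel, on the audio thread.
        for (int ch = 0; ch < numInputChannels; ++ch)
            processChannel (inputChannelData[ch], numSamples, ch);

        // This example only captures input, so silence the outputs.
        for (int ch = 0; ch < numOutputChannels; ++ch)
            juce::FloatVectorOperations::clear (outputChannelData[ch], numSamples);
    }

    void audioDeviceAboutToStart (juce::AudioIODevice*) override {}
    void audioDeviceStopped() override {}

private:
    void processChannel (const float*, int, int) { /* your real-time analysis here */ }
};

// Usage: request e.g. 8 input channels; JUCE picks an available backend
// (ASIO/WASAPI on Windows, CoreAudio on macOS, etc.).
// juce::AudioDeviceManager deviceManager;
// MultiMicCapture capture;
// deviceManager.initialise (8, 0, nullptr, true);
// deviceManager.addAudioCallback (&capture);
```

AudioDeviceManager also handles driver/device selection and enabling individual channels, which the tutorials cover.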

This is all explained pretty well in the tutorials.

Definitely worth investing a few hours to go through those before you get started …

Thank you for your reply.
I’m going to develop software that uses Unity to process audio input from multiple microphones in real time, recognize speech, and convert it into text.
I’m asking those with prior experience because I want to determine first whether I can do this using JUCE’s Unity plug-in.
If it is possible, I will study JUCE further from now on.

I’d say go for it. While I’m not familiar with the JUCE Unity plugin (I’m a UE guy), I’m pretty sure that you’ll get what you need once you get it integrated into your project … JUCE is simply one of the most powerful audio I/O processing frameworks around.

Let us hear how you get along!

Thank you very much.
I will proceed as you suggested.

You’ve got two options with JUCE in Unity.

  1. You can create a Unity Native Audio SDK plug-in, which works with Unity’s audio engine and audio mixer. In this case Unity acts as the DAW and your plug-in would be like Unity’s version of a VST/AU plug-in. The issue here is that you don’t have proper control over audio hardware and things like ASIO drivers. You are stuck with what Unity gives you for audio device access, which is very basic. But your plug-in can interact directly with Unity’s audio engine and DSP chain.

  2. You can bypass Unity’s native audio engine and just use JUCE from within Unity. This will work, and you will be able to access the hardware directly, including low-latency audio device drivers and multi-channel I/O, but it will be completely cut off from Unity’s audio engine. Any interaction you want between the two you will have to build manually. You can, for example, let C# invoke methods to control your C++ JUCE code via .NET interop, and get data out of your JUCE code into Unity the same way (see the sketch after this list). To be clear, you can share audio buffer data across that boundary, but probably not in a way that effectively joins the two audio engines.
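
To give a rough idea of what option 2 looks like on the native side, here’s a sketch (untested, assuming Windows and a recent JUCE version; the exported names StartMicCapture, StopMicCapture and GetChannelLevel are made up for the example):

```cpp
// Sketch of the native (C++/JUCE) side of option 2. JUCE opens the audio
// device itself, completely outside Unity's audio engine; Unity only talks
// to it through the exported C functions at the bottom.
#include <juce_audio_devices/juce_audio_devices.h>
#include <atomic>
#include <cmath>

namespace
{
    constexpr int maxChannels = 16;

    juce::AudioDeviceManager deviceManager;
    std::atomic<float> latestLevel[maxChannels] {};   // per-channel peak that Unity can poll

    struct CaptureCallback : public juce::AudioIODeviceCallback
    {
        void audioDeviceIOCallbackWithContext (const float* const* inputs, int numIns,
                                               float* const* outputs, int numOuts,
                                               int numSamples,
                                               const juce::AudioIODeviceCallbackContext&) override
        {
            // Just measure a peak level per microphone as a stand-in for real processing.
            for (int ch = 0; ch < juce::jmin (numIns, maxChannels); ++ch)
            {
                auto range = juce::FloatVectorOperations::findMinAndMax (inputs[ch], numSamples);
                latestLevel[ch].store (juce::jmax (std::abs (range.getStart()),
                                                   std::abs (range.getEnd())));
            }

            for (int ch = 0; ch < numOuts; ++ch)
                juce::FloatVectorOperations::clear (outputs[ch], numSamples);
        }

        void audioDeviceAboutToStart (juce::AudioIODevice*) override {}
        void audioDeviceStopped() override {}
    };

    CaptureCallback callback;
}

// Exported C functions for Unity's C# [DllImport]. Names are examples only.
// A real DLL would also need to initialise JUCE's message manager first
// (e.g. juce::MessageManager::getInstance()).
extern "C"
{
    __declspec (dllexport) void StartMicCapture (int numInputChannels)
    {
        deviceManager.initialise (numInputChannels, 0, nullptr, true);
        deviceManager.addAudioCallback (&callback);
    }

    __declspec (dllexport) void StopMicCapture()
    {
        deviceManager.removeAudioCallback (&callback);
        deviceManager.closeAudioDevice();
    }

    __declspec (dllexport) float GetChannelLevel (int channel)
    {
        return (channel >= 0 && channel < maxChannels) ? latestLevel[channel].load() : 0.0f;
    }
}
```

On the C# side you would declare matching static extern methods with [DllImport] pointing at the compiled DLL and call them from a MonoBehaviour, e.g. polling GetChannelLevel in Update().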


Thank you for your answer.
I’m sorry, but I’m not an audio programming expert, so there is a lot I don’t understand.
Could you give me some examples and more details?
Thank you.