I asked this in the Tracktion open source thread but thought it could use its own top-level discussion.
In order to do this I need spatial audio support in my audio library. I don’t want to use Unity’s audio engine, as this experience requires the lowest possible end-to-end recording latency (it’s a live looper intended for live performance). Unity’s audio engine does, of course, already have built-in sound spatialization.
What would be pretty amazing is if JUCE could support spatial audio: the ability to reference HRTFs, assign positions to sound emitters, and so on.
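To make concrete what I mean by assigning positions to emitters: below is a minimal sketch (my own illustration, not any JUCE API) of crude azimuth-based spatialization using an interaural level difference plus a Woodworth-style interaural time difference. A real HRTF renderer would instead convolve the source with per-direction impulse responses, but this shows the shape of the problem.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch, not a JUCE class: pans a mono source to stereo
// using an interaural level difference (equal-power pan) and an
// interaural time difference derived from the emitter's azimuth.
struct Spatializer {
    float sampleRate;

    // azimuth in radians: 0 = straight ahead, +pi/2 = hard right.
    void process(const std::vector<float>& mono, float azimuth,
                 std::vector<float>& left, std::vector<float>& right) const {
        // Equal-power pan law for the level difference.
        const float pan   = 0.5f * (1.0f + std::sin(azimuth)); // 0..1
        const float gainL = std::cos(pan * 1.5707963f);
        const float gainR = std::sin(pan * 1.5707963f);

        // Woodworth ITD approximation: ~8.75 cm head radius, 343 m/s.
        const float itd = (0.0875f / 343.0f) * (azimuth + std::sin(azimuth));
        const int delaySamples =
            static_cast<int>(std::round(std::abs(itd) * sampleRate));
        // A source on the right delays the left ear, and vice versa.
        const int delayL = itd > 0 ? delaySamples : 0;
        const int delayR = itd > 0 ? 0 : delaySamples;

        left.assign(mono.size(), 0.0f);
        right.assign(mono.size(), 0.0f);
        for (std::size_t n = 0; n < mono.size(); ++n) {
            if (n >= static_cast<std::size_t>(delayL))
                left[n] = gainL * mono[n - delayL];
            if (n >= static_cast<std::size_t>(delayR))
                right[n] = gainR * mono[n - delayR];
        }
    }
};
```

The point is that even this toy version needs per-emitter state and per-block processing hooks, which is exactly the kind of infrastructure I’d love JUCE to own rather than reinventing it per project.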
Is this on the JUCE roadmap, or is spatial audio something JUCE plans to leave entirely to Unity? I really, really don’t want any C# or managed code in my audio hot path, so I might end up doing everything as a Unity native audio plugin running JUCE. But I don’t yet understand how that would interoperate with low-latency recording through JUCE, or how Unity native audio plugins interact with Unity’s spatial audio support.
Thanks for any clue-stick-whacking you may wish to provide!