Sorry if this is covered elsewhere or has been answered before, but a scan of the documentation, tutorials and forum posts hasn’t cleared things up for me. I would appreciate it if someone could spare a few minutes to share some recommendations or advice on the issues I describe below.
I have cobbled together a bunch of ambisonic/spatializer plugins, and they seem to work quite nicely in the audio plugin host. My aim is to have them working in Unity, and eventually (ideally) on the Oculus Quest 2 (the spatializer could also be really handy for anyone making a first-person shooter, for example).
My first question is about interfacing with the definition flags. Unity supports the creation of spatializer and ambisonic processors via the setting of definition flags in the InternalRegisterEffectDefinition method. Though I can see there is a definition for enum UnityAudioEffectDefinitionFlags in the juce_UnityPluginInterface header, it is not clear how I should manipulate these flags from the context of my plugin project. I think the bit of knowledge I am missing is probably about adding extra code for different exporters. Could someone please suggest how I could set these flags in my Unity build at the appropriate time?
My second question is about the Unity support in JUCE more generally.
There are several methods in the Unity native audio plugin API related to getting and setting distance attenuation, and there is a state structure containing useful information that I need to pass to my spatialization algorithm. I suspect the answer to this question is the same as the first, but how should I interface with these methods from the context of my plugin? Has anyone got an open-source example of interfacing with such methods in a plugin?
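For concreteness, this is the kind of thing I need from that state structure. My understanding is that Unity supplies column-major 4x4 transforms for the listener and the source; the struct below is a mock with made-up names (the real one lives in the native audio plugin SDK header), and the function computes the listener-space distance I would feed to my own attenuation curve:

```cpp
#include <cassert>
#include <cmath>

// Mock of (part of) the spatializer data I understand Unity to provide:
// column-major 4x4 transforms for the listener and the source. Field names
// here are illustrative, not the real SDK names.
struct MockSpatializerData
{
    float listenermatrix[16];
    float sourcematrix[16];
};

// Transform the source position (translation part of the source matrix)
// into listener space and take its length.
float distanceFromListener (const MockSpatializerData& data)
{
    const float* L = data.listenermatrix;
    const float* S = data.sourcematrix;

    // Source position is the translation column of the source matrix.
    float sx = S[12], sy = S[13], sz = S[14];

    // Apply the listener matrix (column-major).
    float px = L[0] * sx + L[4] * sy + L[8]  * sz + L[12];
    float py = L[1] * sx + L[5] * sy + L[9]  * sz + L[13];
    float pz = L[2] * sx + L[6] * sy + L[10] * sz + L[14];

    return std::sqrt (px * px + py * py + pz * pz);
}

// Example: identity listener matrix, source translated to (3, 4, 0).
float demoDistance()
{
    MockSpatializerData data = {};
    for (int i = 0; i < 4; ++i)
    {
        data.listenermatrix[i * 5] = 1.0f; // 1s on the diagonal
        data.sourcematrix[i * 5]   = 1.0f;
    }
    data.sourcematrix[12] = 3.0f;
    data.sourcematrix[13] = 4.0f;
    return distanceFromListener (data); // expect 5
}
```

Getting hold of the real struct (and the attenuation callback it carries) from inside a JUCE-built plugin is the part I cannot see how to do.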
Finally, in the Unity native audio SDK there is an example project in which an external application can be run alongside Unity, with audio passed back and forth between the two over shared memory via intermediary plugins. Has anyone used JUCE in a similar way, to create an external application that interfaces via shared memory with such plugins? I am guessing that in order to bypass Unity’s restrictions and support higher-order ambisonics, this kind of external application will be needed (for research and experimentation purposes, of course). I am trying to ascertain whether this will be possible, accepting that it will also be a nightmare, and not expecting such a process to work natively on a Quest 2, for example.
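The plumbing I have in mind is nothing more exotic than the sketch below: one process renders audio into a shared buffer, another reads it back. Here an anonymous shared mapping plus fork() stands in for the named shared memory and separate executable that Unity's example uses, and the crude waitpid() sync stands in for a proper ring buffer:

```cpp
#include <cassert>
#include <cstddef>
#include <sys/mman.h>
#include <sys/wait.h>
#include <unistd.h>

// POSIX sketch of the shared-memory audio bridge: the "external renderer"
// (here a forked child) writes a block of samples into shared memory and
// the host side reads it back.
bool sharedAudioRoundTrip()
{
    constexpr std::size_t numSamples = 256;

    void* mem = mmap (nullptr, numSamples * sizeof (float),
                      PROT_READ | PROT_WRITE,
                      MAP_SHARED | MAP_ANONYMOUS, -1, 0);
    if (mem == MAP_FAILED)
        return false;

    auto* buffer = static_cast<float*> (mem);

    pid_t child = fork();
    if (child == 0)
    {
        // "External renderer": write a ramp into the shared block.
        for (std::size_t i = 0; i < numSamples; ++i)
            buffer[i] = static_cast<float> (i) / numSamples;
        _exit (0);
    }

    int status = 0;
    waitpid (child, &status, 0); // crude sync; a real bridge needs a ring buffer + semaphore

    bool ok = true;
    for (std::size_t i = 0; i < numSamples; ++i)
        ok = ok && (buffer[i] == static_cast<float> (i) / numSamples);

    munmap (mem, numSamples * sizeof (float));
    return ok;
}
```

A real version would of course need named shared memory (shm_open or the platform equivalent), since the Unity-side plugin and the external JUCE application would be separate executables rather than parent and child.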
One idea I have had is to compile my JUCE projects as static libraries that I link into manually-created Unity native audio plugins, thereby bypassing the JUCE/Unity interfacing and managing the builds separately. This would include the challenge of figuring out how to build the JUCE library and the Unity plugin together in a nice way, but would give the benefit of more likely support for ARM/Android, which is a target platform. Can anyone recommend whether this is an appropriate approach for my goal, please?
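What I mean by that is roughly the following: a hand-written process callback in the shape Unity expects, forwarding into a DSP class compiled into the static library. The SpatialProcessor class is hypothetical (standing in for whatever my JUCE library exposes), and the signature is simplified (the real callback takes a pointer to the effect state, from which the instance would be fetched via its effect-data pointer):

```cpp
#include <cassert>

// Hypothetical DSP class, standing in for code compiled into the JUCE
// static library (name and interface are made up for illustration).
class SpatialProcessor
{
public:
    void process (const float* in, float* out, unsigned length, int channels)
    {
        // Placeholder: pass-through; the real library would spatialize here.
        for (unsigned i = 0; i < length * static_cast<unsigned> (channels); ++i)
            out[i] = in[i];
    }
};

// Hand-written process callback in (simplified) Unity shape, forwarding
// into the linked library instead of going through JUCE's Unity client.
extern "C" int ProcessCallback (void* effectData, float* inbuffer, float* outbuffer,
                                unsigned length, int inchannels, int outchannels)
{
    (void) inchannels;
    auto* dsp = static_cast<SpatialProcessor*> (effectData);
    dsp->process (inbuffer, outbuffer, length, outchannels);
    return 0; // success code in the real SDK
}
```

The appeal is that the Unity-facing shim stays tiny and SDK-exact, while all the JUCE code is built once per target (including ARM/Android) with its own toolchain.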