Unity3D Native Audio Plugin support


#1

Hello.

I was looking at a presentation yesterday about the new audio mixer in Unity3D 5, and there is an API for native plugins.

http://docs.unity3d.com/Manual/AudioMixerNativeAudioPlugin.html

https://bitbucket.org/Unity-Technologies/nativeaudioplugins/overview

 

Is it possible to add this in the Introjucer like the VST or AU format?
 


#2

Interesting... would be very cool to be able to use top-quality reverbs and filters etc in Unity apps.

And they have the Unity Asset Store for selling add-ons.


#3

Gahhhhh - they use Mercurial...!

Is it possible to add this in the Introjucer like the VST or AU format?

I'm sure you didn't mean for it to sound this way, but the issue/idea is really much more complicated than that: an entire audio plugin client would need to be designed...

There are many non-obvious but really important pieces to that puzzle:

  • Where are the actual SDK specifications?
  • What's the minimum set of requirements to set this type of plugin up?
    • e.g.: Which functions do we need to export for each plugin DLL?
  • There's this weird PluginList.h file they have: does this mean all plugins are considered shells?
  • The code doesn't seem to provide a window handle to paint onto (goodbye JUCE Component based GUI?)
  • What's with all those *.meta files in the Assets folder?
    • Are they important?
    • Do we have to generate one for every plugin?
      • If so, which one(s)?
    • Where are their specs?
    • What does it even mean to have all of these asset things?
  • Why are there, at times, two files per asset: one meta file and one asset file with a magic extension?
  • There's a ProjectSettings folder with *.asset files
    • Are they important?
    • Do we have to generate one/many for every plugin?
      • If so, which one(s)?
    • Where are their specs?

#4

Seems I can answer one of my many questions by looking here: http://docs.unity3d.com/Manual/AudioMixerNativeAudioPlugin.html

The GUI is developed in C#. Note that the GUI is optional, so you always start out plugin development by creating the basic native DSP plugin, and let Unity show a default slider-based UI for the parameter descriptions that the native plugin exposes. We recommend this approach to bootstrap any project.

C# - Wtf?

No GUI for you!
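For what it's worth, that default slider UI works because the native plugin exposes parameter descriptions (name, range, default) to the host, which then draws a generic slider per entry. A minimal sketch of the idea, using made-up struct and function names rather than the real SDK types:

```cpp
// Hypothetical stand-in for the SDK's parameter description table: the
// native plugin exposes (name, min, max, default) entries and Unity
// renders a generic slider for each one.
struct ParamDef { const char* name; float minVal, maxVal, defVal; };

static const ParamDef kParams[] = {
    { "Cutoff",    20.0f, 20000.0f, 1000.0f },
    { "Resonance",  0.0f,     1.0f,    0.3f },
};

// Clamp an incoming host value to the declared range for a parameter.
float setParam (int index, float value)
{
    const ParamDef& p = kParams[index];
    if (value < p.minVal) return p.minVal;
    if (value > p.maxVal) return p.maxVal;
    return value;
}
```

The point is just that the DSP side declares ranges and the host owns the widgets; no native drawing code is involved.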


#5

Any news regarding Unity native audio plugins? I can imagine this to be very useful (in the sense to have a single codebase for all plugin formats, including Unity).

Developing a Unity native audio plugin at the moment, I may be able to answer some of the questions:

  1. There is documentation that explains the API specification (maybe not available at the time this thread was started): https://docs.unity3d.com/500/Documentation/Manual/AudioMixerNativeAudioPlugin.html

  2. The PluginList.h defines all the plugins that go into one DLL (or bundle, static library, etc.). It’s perfectly fine to export just one plugin per DLL, though that probably limits the use of common resources.

  3. Yep, no JUCE component based GUI. But in the context of Unity, the GUI is less relevant, as it’s just for the game developers, not for the final product (which is of course the game or VR application).

  4. It seems the .meta files are auto-generated, so no need to worry about them (except for being annoying in the context of version control…)

  5. The .asset files are generated by Unity, so no need to worry about that, either. However, that raises another issue: how to bundle data to be used with a plugin? I guess a good strategy is to provide functions that let a plugin read data (such as impulse responses for a convolution reverb) from the Unity editor (or the game engine in the final product) and not store them directly in the plugin.
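To illustrate point 2: the SDK's effects boil down to a table of C callbacks, and PluginList.h just enumerates which effects ship in one binary. Below is a rough sketch of what a process callback looks like, with a simplified stand-in for the SDK's state struct (the real definitions live in the SDK headers, so treat names and signatures here as illustrative only):

```cpp
// Simplified stand-in for the SDK's effect state (the real struct
// carries sample rate, DSP buffer size, user data, etc.).
struct UnityAudioEffectState { float gain; };

// The process callback receives interleaved float buffers and is called
// once per DSP block by the mixer. Here it just applies a gain.
int ProcessCallback (UnityAudioEffectState* state,
                     float* inbuffer, float* outbuffer,
                     unsigned int length, int inchannels, int outchannels)
{
    for (unsigned int n = 0; n < length * (unsigned int) inchannels; ++n)
        outbuffer[n] = inbuffer[n] * state->gain;
    return 0; // "OK" result code in the real SDK
}
```

In the real repo, each effect provides a set of callbacks like this (create, release, process, set/get parameter), and PluginList.h lists the effect names so several can be exported from one DLL.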

It would be really cool if Unity support came to JUCE at some point, especially as this platform becomes more and more important with the rise of virtual reality.


#6

In case you’re interested, I just ported my synth over to Unity’s Native Audio SDK and released it on the asset store: https://www.assetstore.unity3d.com/en/#!/content/86984

After working with the SDK for a while, it seems like it would be a fair piece of work to support this format inside the Projucer as another export format. It’s been pointed out that there’s no native UI; there’s also no MIDI interface (but I guess JUCE could get devices on its own), no way to snap indexed (non-float) parameters to discrete values, parameter names are cropped to 15 characters (that’s not in the docs anywhere), etcetera, etcetera.
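Regarding the missing parameter snapping: a plugin can work around it by quantizing a continuous host value to discrete choices itself. A hypothetical helper for that (not part of the SDK, just one way to do it):

```cpp
// Map a continuous 0..1 host parameter value onto one of N discrete
// choices (e.g. a waveform selector), since the host itself has no
// notion of stepped parameters.
int snapToIndex (float value01, int numChoices)
{
    int idx = (int) (value01 * (float) numChoices);
    if (idx >= numChoices) idx = numChoices - 1; // value01 == 1.0 edge case
    if (idx < 0) idx = 0;
    return idx;
}
```

The downside is that the default slider still shows a float, so the snapping is invisible to the user until a custom C# UI draws it properly.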

Anyway I can’t really find any other native plugins on the Unity asset store, so if you release yours I’d love to see it.


#7

Interesting. I like the videos you made for your plugin. Did you make a GUI in C#, too?
And another question: I recently sent a prototype native audio plugin (a spatializer effect) to two different Unity developers and they didn’t really know what to do with it. How was your plugin received? Do the Unity users appreciate it? Does it fit into their workflow?


#8

The only UI in Unity I have is a graphical piano, piano roll and patch browser.
The workflow I’m telling people about is to create patches using the standalone synth editor (http://tytel.org/helm), then export that patch to your project. Then they can change parameters from script or animation.

As for reception, it’s been really positive among the people who have bought it. The problem is, it’s hard to reach that niche of Unity users who like playing with synths. Not sure of the best place to reach them.


#9

AudioHelm for Unity is very nice indeed (and it’s really great that Helm itself was open sourced), and it has shown me the way in terms of how to potentially interface with my own Unity plugin synth code. However, since I started with a JUCE standalone project, there are a few things that would be handy to keep in my Unity plugin - such as cross-platform MIDI device support, the MIDI queue/buffer, and the thread management stuff (ScopedLock in my audio render function seems important).

I’m very new to JUCE, so perhaps there is a way to extract just those few features I need for continued use within my Unity plugin? (Especially MIDI: I’m using MidiMessageCollector, and ‘popping’ the next block in the audio render calls before processing my synth notes/envelopes.)

I was thinking I might ‘simulate’ MIDI messages from the Unity world by pushing them into a MidiMessageCollector queue.
The idea being that if I used MIDI CC to control parameters, I could keep the control interface (and message queue) for the plugin’s sound parameters the same for both Unity and external MIDI control surfaces/keyboards.
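That simulate-MIDI-from-Unity idea is essentially a producer/consumer queue. A minimal, JUCE-free sketch of the pattern (JUCE’s MidiMessageCollector does the same job, with proper timestamping, via addMessageToQueue and removeNextBlockOfMessages; the types below are made up for illustration):

```cpp
#include <mutex>
#include <vector>

// Stand-in for a MIDI event: status byte plus two data bytes.
struct SimpleMidiMessage { int status, data1, data2; };

class SimpleMidiCollector
{
public:
    // Called from the Unity/game thread (e.g. when a script fires a note
    // or a simulated CC for a parameter change).
    void addMessageToQueue (const SimpleMidiMessage& m)
    {
        std::lock_guard<std::mutex> lock (mutex);
        pending.push_back (m);
    }

    // Called once per block from the audio render callback: drain the
    // queue and hand the events to the synth before rendering.
    std::vector<SimpleMidiMessage> removeNextBlockOfMessages()
    {
        std::lock_guard<std::mutex> lock (mutex);
        std::vector<SimpleMidiMessage> block;
        block.swap (pending);
        return block;
    }

private:
    std::mutex mutex;
    std::vector<SimpleMidiMessage> pending;
};
```

With this shape, events from a real MIDI keyboard and ‘fake’ events pushed from Unity scripts go through the identical queue, which is exactly the single-control-path idea described above.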


#10

Thanks!

In the Projucer you can select the JUCE modules you want. You probably want at least juce_core and juce_audio_devices for your ScopedLock and MIDI handling. Those modules are both ISC licensed, so they’re free to use.
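For anyone unfamiliar, ScopedLock is just JUCE’s RAII lock guard around a CriticalSection. The same pattern in standard C++, shown here as a JUCE-free stand-in for guarding state shared between the game thread and the audio callback:

```cpp
#include <mutex>

// A synth parameter written by the game thread and read by the audio
// render callback; the mutex guards both accesses.
struct SynthState
{
    std::mutex lock;        // plays the role of juce::CriticalSection
    float cutoff = 1000.0f;

    void setCutoff (float value)
    {
        std::lock_guard<std::mutex> sl (lock); // like juce::ScopedLock
        cutoff = value;
    }

    float getCutoff()
    {
        std::lock_guard<std::mutex> sl (lock);
        return cutoff;
    }
};
```

Note that holding a lock inside the audio callback can cause priority-inversion glitches if the game thread holds it for long; keeping the locked sections tiny (as here) is the usual mitigation.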