Experimental support for Unity native audio plugins on the develop branch


With commit 527625b, JUCE now supports building Unity native audio plugins for Windows and macOS standalone builds. This format can be enabled like any other plugin format in the Projucer’s project settings:

The plugin will be compiled to a .bundle on macOS and a .dll on Windows, which can then be imported and used in the Unity editor and in standalone builds of your game. The Projucer will also generate a C# script (in JuceLibraryCode) to handle your plugin’s GUI; this is copied into the .bundle on macOS, or placed next to the .dll on Windows. On Windows you must therefore manually import this script alongside the .dll (on macOS it is contained in the .bundle, so this step isn’t needed). If your plugin does not have an editor (i.e. AudioProcessor::hasEditor() returns false), the script will provide Unity’s default sliders for any AudioParameters that your plugin exposes. If it does have an editor, the script will draw it to the Unity inspector:

Some caveats:

  • Unity requires all audio plugins to be prefixed with “audioplugin”, so if your project name does not start with this, “audioplugin_” will be prepended to the .bundle or .dll name when building.

  • Currently only desktop platforms are supported. Linux builds should work, but I have no way of testing them and would really appreciate it if anyone using Unity on Linux could try this out!

  • The custom GUI paint routine is not very efficient and the GUI refresh rate from Unity is quite slow, so UIs will look a bit jittery (this improves somewhat when you are in play mode).

  • When using a custom GUI there is no way to expose your plugin’s parameters to scripts, and snapshots will not save/recall parameter values.

Please post any feedback in this thread.

Unity3D Game Engine with JUCE Audio Backend



what happens if you deploy a game for mobile/emscripten using that plugin?


This is very exciting! When you say ‘custom GUI’, do you mean a Unity GUI?


Would love to see some other native audio Unity plugins. It’s a bit lonely at the moment!

@ed95 Curious how you got the UI drawing. Did you pass a graphics context down to native code? Didn’t know you could do that.


Currently only desktop standalone builds are supported, the plugin won’t work on mobile platforms.


@DaveH By ‘custom GUI’ I mean the GUI that is drawn by the C# script if your AudioProcessor::hasEditor() method returns true, as opposed to the standard sliders that Unity will create for your parameters if that method returns false.

@mtytel The UI drawing is done by passing a pointer to a block of Texture2D pixels back into native code and then drawing a bitmap into it. As I mentioned, it’s not very efficient and I’d like to improve it by using an OpenGL texture instead at some point.


“When using a custom GUI there is no way to expose your plugin’s parameters”
Gaming is obviously very script-oriented. So, say, if I want to change a bespoke room reverb size from a game script, I can’t do it?


Is that planned?


No, not if you are using a custom GUI. However if your AudioProcessor::hasEditor() returns false then Unity will just create default sliders for any AudioParameters that you have added to your plugin via AudioProcessor::addParameter(). You can then right-click on these sliders in the Unity editor and expose the parameter.


If a lot of our users are using Unity and want mobile plugin support then it’s definitely something we will consider. This initial support is just to gauge whether it’s a feature that a lot of people want.


This is a really awful workflow Unity has set up. In my synth plugin for Unity I bypass this with a separate native call, using a “channel” value to send the parameter change to different instances.


Thank you Jules’ Utility Class Extensions! I’m gonna have to test this right now! Good work, and seriously perfect timing.


Is this in regard to effects plugins for the Unity audio mixer? I did a few tests using FAUST and PD/HEAVY to compile native audio plugins for Unity that are accessed through C# control scripts and can be applied to any game object. The Faust one is FaustPugin with a subcomponent for libFaustPlugin_, and the PD/Heavy one is AudioPlugin_, so I guess maybe the naming rule isn’t strict, since Faust seems to be getting away with its own convention.

I would be curious to try this out. What Linux distro should I set up on VirtualBox? I wonder if this would work on a Raspberry Pi or other embedded Linux; could make for a fun project.

Does this imply JUCE -> Unity is utilizing a specialized Unity Editor-based GUI render component instead of the standard JUCE UI? This could have some implications I would like to look into if so.

One solution for the custom GUI caveat would be to generate a corresponding C# script alongside the custom GUI. I realize that would still not save changes made from the custom GUI, so maybe, for developers who want to build VST and Unity plugins from one JUCE project, there could be a separate GUI view for the Unity version that isolates the useful visual feedback elements, with all control of settings happening through the Unity sliders? I feel like there must be a work-around for this involving the Unity custom editor API. On this page about native audio plugins, most of the custom GUI components from the SDK demos are for visual feedback: https://docs.unity3d.com/500/Documentation/Manual/AudioMixerNativeAudioPlugin.html

The creator of Helm (an awesome open-source synth with a corresponding Unity asset), @mtytel, has found a pretty solid workflow that you can see in his tutorial videos on the Unity asset page: you design the sound in the standalone app or plugin version and then load your saved preset in the Unity editor version.

If an outgoing GUI thread is not possible in the current Unity native audio API, maybe we could collectively petition for it as a feature request? That seems like it would be a useful thing and would open up the market for more custom Unity editor extensions and ports of audio plugins. If anyone could get that feature on the docket, I would think it would be the legendary JUCE :stuck_out_tongue_winking_eye:

Faust and Heavy have found ways to deal with mobile. The Heavy documentation for Unity (https://enzienaudio.com/docs/index.html#06.unity) states the following:

“Building for iOS
iOS builds are unique in that the source code is needed when compiling the Unity game for the device.
Once the game is working as expected in the Unity editor (the macOS plugins are required for that) and the heavy C# scripts are correctly attached to the game objects, go to File > Build Settings in the Unity menu. Then generate the Xcode project by selecting the iOS platform, adding the required scenes and clicking Build. Open the Xcode project from the directory previously selected in Unity’s build menu. Create a new group in the left-hand Project explorer section; the group name is not important, but in this case we’ll call it heavy. Download the Unity source target from the patch compile page and copy the contents of the source/heavy/ folder into the heavy Xcode group that was created previously. Make sure to copy every file.”


Yes, it seems to be pretty strict - I couldn’t get Unity to find any audio plugins that weren’t prefixed with “audioplugin”.

Unfortunately this won’t work, as the Unity editor requires OpenGL 3.2 or later and VirtualBox and Parallels only support v2.something. This is really frustrating and is why I couldn’t test it out.

Not a Unity component, no. The JUCE code just renders a bitmap of the plugin editor to a block of texture pixels from Unity. This is the method that does the rendering and here is where the C# script calls back into C++ and passes a void* texture pointer.

Interesting, thanks for the links. I will look into this.


You sure about that? On my late 2012 iMac running Parallels 13, the rendering tests work for 3.2 (but nothing after that).



Yeah, I think 3.2 is supported on Windows, but my Linux VM only supports 2.1.


In the “Plugins with custom GUIs” section it looks like the Unity Native Audio Plugin SDK API has get and set methods, and the included EQ example supports some advanced editing through the custom GUI. It appears that the custom GUI sets the values of the Unity editor sliders, which seems like a pretty powerful “best of both worlds” mix.


@ed95 I’m very happy to see some movement in this direction, but it’s not something I see myself using until this caveat is somehow addressed. I totally agree with @mtytel here. It’s shocking that Unity doesn’t allow this kind of interaction.


Did anybody get in touch with the Unity devs about that? I’d imagine there are reasonable devs there as well, who might be able to improve this.