If a lot of our users are using Unity and want mobile plugin support then it’s definitely something we will consider. This initial support is just to gauge whether it’s a feature that a lot of people want.
This is a really awful workflow Unity has set up. In my synth plugin for Unity I bypass this with a separate native call, using a “channel” value to route the parameter change to different instances.
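For anyone curious what that workaround could look like, here's a minimal sketch of the native side: one extern "C" entry point is exported from the plugin, and the channel id picks which synth instance receives the parameter change. All names here (SynthInstance, SetSynthParameter, etc.) are illustrative, not actual JUCE or Unity API.

```cpp
#include <map>
#include <mutex>

struct SynthInstance
{
    std::map<int, float> parameters; // paramIndex -> value
};

// Registry of live instances, keyed by the "channel" value each
// Unity-side script passes in. Guarded because Unity may call from
// a different thread than the audio callback.
static std::mutex registryMutex;
static std::map<int, SynthInstance> instancesByChannel;

extern "C" void SetSynthParameter (int channel, int paramIndex, float value)
{
    std::lock_guard<std::mutex> lock (registryMutex);
    instancesByChannel[channel].parameters[paramIndex] = value; // creates the entry on first use
}

extern "C" float GetSynthParameter (int channel, int paramIndex)
{
    std::lock_guard<std::mutex> lock (registryMutex);
    return instancesByChannel[channel].parameters[paramIndex];
}
```

On the Unity side, a C# control script would then bind these exports with [DllImport] and pass its own channel id, sidestepping the mixer's parameter system entirely.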
Thank you Jules’ Utility Class Extensions! I’m going to have to test this right now! Good work, seriously perfect timing.
Is this in regard to effects plugins for the Unity audio mixer? I did a few tests using FAUST and PD/Heavy to compile native audio plugins for Unity, accessed through C# control scripts that can be applied to any game object. The Faust one is FaustPlugin with a subcomponent for libFaustPlugin_, and the PD/Heavy one is AudioPlugin_. I guess maybe the naming convention isn’t strict, since Faust seems to be getting away with its own.
I would be curious to try this out. What Linux distro should I set up on VirtualBox? I wonder if this would work on Raspberry Pi or other embedded Linux. Could make for a fun project.
Does this imply JUCE -> Unity is utilizing a specialized Unity Editor-based GUI render component instead of the standard JUCE UI? This could have some implications I would like to look into if so.
One solution for the custom GUI caveat is to generate a corresponding C# script alongside the custom GUI. I realize that would still not save changes made from the custom GUI, so maybe, for developers who want to work in JUCE and build VST and Unity plugins from one project, there could be a separate GUI view for the Unity version that isolates the visual-feedback elements that would be useful, with any control of settings happening through the Unity sliders? I feel like there must be a workaround for this involving the Unity custom editor API. On this page about native audio plugins, most of the custom GUI components in the SDK demos are for visual feedback: https://docs.unity3d.com/500/Documentation/Manual/AudioMixerNativeAudioPlugin.html
This thread is by the creator of Helm (an awesome open-source synth with a corresponding Unity asset). @mtytel has found a pretty solid workflow that you can see in his tutorial videos on the Unity asset page: you design the sound in the standalone app or plugin version and then load your saved preset in the Unity editor version.
If an outgoing GUI thread isn’t possible in the current Unity native audio API, maybe we could collectively petition for it as a feature request? It seems like it would be a useful thing and would open up the market for more custom Unity editor extensions and ports of audio plugins. If anyone could get that feature on the docket, I would think it would be the legendary JUCE team.
Faust and Heavy have found ways to deal with mobile. The Heavy documentation for Unity (https://enzienaudio.com/docs/index.html#06.unity) states the following:
Building for iOS
iOS builds are unique in that the source code is needed when compiling the Unity game for the device.
Once the game is working as expected in the Unity editor (the macOS plugins are required for that) and the heavy C# scripts are correctly attached to the GameObjects, go to File > Build Settings in the Unity menu. Then generate the Xcode project by selecting the iOS platform, adding the required scenes and clicking Build. Open the Xcode project from the directory previously selected in Unity’s build menu. Create a new group in the left-hand side Project explorer section; the group name is not important, but in this case we’ll call it heavy. Download the Unity source target from the patch compile page and copy the contents of the source/heavy/ folder into the heavy Xcode group that was created previously. Make sure to copy every file.
Yes, it seems to be pretty strict - I couldn’t get Unity to find any audio plugins that weren’t prefixed with “audioplugin”.
Unfortunately this won’t work as the Unity editor requires OpenGL version > 3.2 and VirtualBox and Parallels only support v2.something. This is really frustrating and is why I couldn’t test it out.
Not a Unity component, no. The JUCE code just renders a bitmap of the plugin editor to a block of texture pixels from Unity. This is the method that does the rendering, and here is where the C# script calls back into C++ and passes a void* texture pointer.
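To make the hand-off concrete, here's a minimal sketch of what a native function receiving that void* texture pointer could look like: the C# side obtains a pointer to the texture's pixel data and the native side writes 32-bit pixels into it. The function name, pixel format, and the flat grey fill are all assumptions for illustration, not the actual JUCE implementation.

```cpp
#include <cstdint>

// Hypothetical callback invoked from Unity's C# script. 'texturePixels'
// points at width * height 32-bit pixels owned by the Unity texture.
extern "C" void RenderEditorToTexture (void* texturePixels, int width, int height)
{
    auto* pixels = static_cast<uint32_t*> (texturePixels);

    // Stand-in for "render a bitmap of the plugin editor":
    // fill the whole texture with an opaque grey (0xAARRGGBB).
    const uint32_t opaqueGrey = 0xff808080;

    for (int i = 0; i < width * height; ++i)
        pixels[i] = opaqueGrey;
}
```

In the real thing, the loop would be replaced by blitting the editor component's rendered image into the buffer, but the ownership model is the point: Unity owns the texture memory, and the native code only fills it.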
Interesting, thanks for the links. I will look into this.
Are you sure about that? On my late-2012 iMac running Parallels 13, the rendering tests work for 3.2 (but nothing after that).
Yeah I think 3.2 is supported on Windows but my Linux VM only supports 2.1.
In the “Plugins with custom GUIs” section it looks like the Unity Native Audio Plugin SDK API has get and set methods, and the included EQ example supports some advanced editing through the custom GUI. It appears that the custom GUI sets the values of the Unity editor sliders, which seems like a pretty powerful “best of both worlds” mix.
@ed95 I’m very happy to see some movement in this direction, but it’s not something I see myself using until this caveat is somehow addressed. I totally agree with @mtytel here. It’s shocking that Unity doesn’t allow this kind of interaction.
Did anybody get in touch with the Unity devs about that? I’d imagine there are reasonable devs there as well, who might be able to improve it.
Both @mtytel and I have posted on the Unity forum about this in the past. Our concerns seem to have fallen on deaf ears, and I’ve taken a big step back from Unity as a result. I have to say, however, that I haven’t looked into this area in quite some time, but if @ed95 also reports issues with automation on custom GUIs then it looks like the problem is still there. It might be nice if ROLI reached out to them; it would be to Unity’s advantage. Heck, why not turn the screw a little and release a UE4 plugin that does allow automation. Apropos, their audio API has got a lot better in recent releases. I imagine supporting JUCE plugins there may not be that tricky.
@ed95 Really nice new feature! Did I miss something, or does the Unity plugin generated via JUCE produce an empty shell? I just tried, and the VST and AU builds are OK, with the compiled code visible inside the plugin via “Show Package Contents”. But the generated Unity plugin does not contain any compiled code… Does anyone else have this issue? Or did I do something wrong?
On macOS it should build a .bundle package which contains the executable in Contents/MacOS - do you not have this?
@ed95 It builds the .bundle package: it contains the C# script and the Contents/ folder with Resources/ inside, but no MacOS/ folder and consequently no executable. This is strange, because the same Xcode project builds AU and VST versions which do contain the MacOS/ folder and executable (and they work fine in Ableton). By the way, I just tried to build the “Hello world” plugin starter code from the Projucer.
Are you using the latest JUCE from the develop branch? You’ll need to rebuild the Projucer and make sure that the JUCE modules the project is using are up to date. It sounds like you’re building against an older juce_audio_plugin_client module that doesn’t contain the Unity plugin client code.
You are right, I think I was building against an older version of the modules… Now it works, thank you! I’ll verify this next time. By the way, I had an error when I compiled the Projucer plugin starter project (see the picture). I commented out the line to make it work; maybe it’s normal because it’s the develop branch, but I’m informing you just in case…
Are you sure your JUCE modules are all the same version? This looks like an error from a mismatch…
Yes, all JUCE modules are those of the develop branch.
And the project that you’re trying to build is definitely using these modules? If you open the project in the Projucer and then navigate to the modules pane you can see where each module is located.
Yes, I changed the modules path in the pane and they are all the same: the path of the develop branch.