iOS AUv3 version of examples/PluginSamples/Arpeggiator

I see there’s currently no iOS target for an AUv3 version of the Arpeggiator MIDI effect plugin under examples/PluginSamples/Arpeggiator.

Now that iOS 11 supports MIDI output and MIDI effect AUv3 plugins, might it be helpful to have a verified, tested version in the examples for this target?

Yes, good idea. I don’t have an iDevice with me today, but I’ll make sure to test this tomorrow or Friday.

FWIW, I tried creating an Arpeggiator target for iOS from within the Projucer and built it in Xcode.

It compiled OK, got installed on my iPad simulator, and fired up as a standalone.
However, when I launch the example Plugin Host, compiled for iOS and installed on the simulator (and on my own iPad), the host app doesn’t seem to see Arpeggiator as an AUv3 plugin.


I’m not sure how I can really test my own MIDI effect AUv3 plugin on iOS, particularly on the simulator, which is easier to develop on for obvious reasons.

Arpeggiator also doesn’t appear in the official Apple example project AUv3Host.

I’m afraid AUv3 does not support MIDI effect plug-ins. After about a day of investigation, there are still multiple issues on Apple’s side:

  1. Although you can happily register any AUv3 plug-in type (including kAudioUnitType_MIDIProcessor), the appex handler code expects there to be at least one audio bus, regardless of plug-in type. If not, the appex handler will bail out and AudioUnitInitialize will always return -50. You can bodge your way around this by adding an output bus with zero channels, but then the plug-in will fail AU validation on macOS. AUv2s are allowed to have zero buses.
  2. JUCE’s AUv2 and AUv3 hosting code uses the AUv3-to-AUv2 bridge. To get the MIDI output, we use the kAudioUnitProperty_MIDIOutputCallback property to inform the plug-in which callback to call. Unfortunately, after some reverse engineering, I can see that kAudioUnitProperty_MIDIOutputCallback will always fail for an AUv3 if the plug-in type is not a MIDI controller effect or a synth.
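For reference, the hosting-side property mentioned in point 2 is registered roughly like this. This is only a sketch of the bridged-AudioUnit path, with error handling omitted and the function name purely illustrative; the point is that this AudioUnitSetProperty call is what fails for non-synth AUv3s:

```swift
import AudioToolbox

// Sketch: registering a MIDI output callback on a (bridged) AudioUnit.
// For an AUv3, this call fails unless the unit reports itself as a
// MIDI controller effect or synth - the limitation described above.
func attachMIDIOutputCallback(to unit: AudioUnit) -> OSStatus {
    var callbackInfo = AUMIDIOutputCallbackStruct(
        midiOutputCallback: { userData, timeStamp, midiOutNum, packetList in
            // Handle the plug-in's outgoing MIDI packets here.
            return noErr
        },
        userData: nil)

    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_MIDIOutputCallback,
                                kAudioUnitScope_Global,
                                0,
                                &callbackInfo,
                                UInt32(MemoryLayout<AUMIDIOutputCallbackStruct>.size))
}
```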

I therefore don’t see how this can be fixed without Apple officially supporting AUv3 MIDI effect plug-ins.

  1. To my recollection (from an Apple developer conference video, or somewhere else, I forget), Apple did require a single bus, with audio render processing being necessary because MIDI timing info needed to be derived from, and synchronised to, the audio stream. I’m happy to just receive audio on a single channel, do nothing with it, and put out silence on the audio channel. But I don’t know how this affects current AUv3 hosts that now claim to support AUv3 MIDI effect plugins, like AUM.

I think you may be confusing this with the hosting of MIDI effect plug-ins. In AUv2 the AU does not require any audio buses (which you can check with auval). However, when hosting a MIDI effect AUv2 plug-in, you must call the AURender functions as if there was a single output audio bus with a zero-channel layout, so that the plug-in can receive timing information. But again, this is only on the hosting side.

As I said, I can certainly bodge the AUv3 code to always have an audio output bus, but I don’t see a way around point 2 in my post above, i.e. needing to claim that the AUv3 is a synth. This would mean that the AUv3 would also appear as an instrument in GarageBand, which would be a little confusing to the user.


iOS developer Bram Bos, of Ruismaker, has announced a collection of AUv3 MIDI sequencers, and I’m pretty sure he’s writing them as synths.

This would be like your Arpeggiator, so it would be good if you could at least make it work as that. It would cover some bases.

And in the meantime, I guess, regarding proper MIDI effect apps (ones that can be chained, for example, like the MIDIFLOW iOS apps currently available for the AudioBus3 MIDI page), we will just have to wait for a future WWDC :frowning:

What might be worth adding to the examples collection eventually is a MIDI effect AUv3 plugin example, such as a transposer or a scale/mode note-conformer.

But of course this needs to wait for Apple to get its act together, in an iOS 11 update or, more pessimistically, iOS 12.

Jonatan Liljedahl of Kymatica (AUM, AudioShare) has just done a little looking into this matter of MIDI effects, and I’ll just share it here:

“well, an AU can already take midi input and produce midi output, but I guess you want kAudioUnitType_MIDIProcessor to make such plugins discoverable?
to present them in a different way to the user compared to synths and effects, before they are loaded into the host?
for example a specific “midi effects” list
There is a “tags” array in an AUv3’s Info.plist. One could invent a custom “standard” tag for MIDI effects…can’t find how the host would retrieve this “tags” array though…
…ok, it’s in AVAudioUnitComponent. One way is also for hosts to present any AU that has both midi input and output as “MIDI filters”. But it would then include synths that also make sound.”
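To make the workaround Jonatan describes concrete: the tags array lives in the app extension’s Info.plist, under the AudioComponents entry. The fragment below is illustrative only, and the “midi-effects” tag is purely hypothetical; it would only mean something if hosts collectively agreed to look for it:

```xml
<!-- Illustrative fragment of an AUv3 appex Info.plist -->
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>AudioComponents</key>
        <array>
            <dict>
                <key>type</key>
                <string>aumu</string> <!-- still registered as a synth -->
                <key>tags</key>
                <array>
                    <string>MIDI</string>
                    <string>midi-effects</string> <!-- hypothetical agreed tag -->
                </array>
                <!-- name, subtype, manufacturer, factoryFunction etc. omitted -->
            </dict>
        </array>
    </dict>
</dict>
```

A host could then read these tags back via AVAudioUnitComponent (its allTagNames property) when building its plug-in lists.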

I guess current devs, such as ROLI, AUv3 host devs, and MIDI plugin devs who want to implement MIDI effect functionality, could collectively agree to use and look for such a particular tag, until such time as Apple gets its arse in gear.

Yeah, but the problem is that all those plug-ins would still appear as synths in GarageBand, no matter which tag you use, right?

Do you guys have any communication channels or contacts with the iOS audio team at Apple? Maybe if enough iOS audio app devs, like you, Kymatica, the AudioBus team and, for example, Bram Bos, pressure them, something might happen.

Although Apple might not want to make such big changes to GarageBand on iOS to properly support AU MIDI effects tagged like this, they could at least make a small change to their GarageBand code so that any AU synth with this new tag simply gets ignored in the synth list, until such time as they decide to change GarageBand to support them properly.

Doesn’t all this, and the basic fact that Apple has dragged its heels for over a decade about properly supporting Audio Units on iOS, suggest that it really might be worth ROLI reconsidering its position of only supporting official Apple APIs in the JUCE framework?

AudioBus, clearly, even now, is way ahead of what AUv3 can deliver. And there’s the non-Apple API Ableton Link as well, firmly established in the iOS music-making world, and elsewhere too.

So, bearing in mind that JUCE is intended for audio software developers, surely it makes sense, and helps us developers, if ROLI’s JUCE team puts work into supporting both AudioBus3 and Ableton Link now?

Better to support the leading-edge of what is now possible on the leading platforms instead of Apple’s trailing-edge - reminiscent of the “Please sir” scene in Oliver Twist!

+1 for an iOS AUv3 MIDI example.

I started from the AUv3 Synth example, and it works OK as an AUv3 synth on iOS, but sadly no MIDI output is available there. The same plugin as an AU on macOS shows MIDI output, and it works as expected. My plugin produces both audio and MIDI.

If you could update the example to show MIDI output on iOS (you need iOS 11), that would be super awesome!

EDIT - the example works, just had to change the target to iOS 11. Super awesome it is!


The very latest version of AUM, in the last few days, has been updated to support MIDI-output-only devices using kAudioUnitType_MIDIProcessor.

And Bram Bos (Ruismaker) and Jonatan Liljedahl at Kymatica (AUM) have found, and are working on, several ways (that don’t break Apple API guidelines) to get actual non-audio MIDI filters (or MIDI effects) that use kAudioUnitType_MIDIProcessor working.

Currently GarageBand doesn’t support filters, but it’s highly likely that AUM will have something out, and they plan to let others know how they did it (it’s not any kind of bodge).

And IMO, it’s much more likely that the kind of user who really wants to use MIDI effect/filter apps is the kind of iOS musician who is using AUM rather than GarageBand, because such filters arguably have more uses in “live-jamming” experimental situations than in linear DAW-type composition. AUM is pretty much a de rigueur must-have purchase, similar to how everyone on PCs or Macs might have MS Word :slight_smile:

My own prototype using kAudioUnitType_MIDIProcessor now appears in AUM, whereas it didn’t a few weeks ago. I’d still really appreciate having an iOS 11 version of the Arpeggiator there, though.

Maybe someone on the JUCE team might like to liaise with Kymatica and Bram Bos and find out how they got it working.

Any chance of a few pointers on how to get this working? I’m sure I could figure it out myself, but it seems foolish to spend time re-inventing the wheel you’ve all already figured out.

Right now there is no “official” MIDI effect AUv3 support. You can, however, simply create an AUv3 synth with both MIDI input and MIDI output (you’ll still have an audio bus, but just silence the output). The important thing is that you change the deployment target to iOS 11.
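The “synth that is really a MIDI effect” approach above could be sketched at the AUAudioUnit level roughly like this. This is only an illustration under the assumptions stated in the comments: the class and bus setup are elided, the example note-on event is arbitrary, and real code would avoid capturing objects in the render block:

```swift
import AudioToolbox

// Sketch: an AUv3 registered as a synth ('aumu') that silences its
// mandatory audio bus and emits MIDI via the host-provided block.
// Bus configuration and event scheduling are omitted for brevity.
class SilentMIDIEffectAU: AUAudioUnit {

    // Declaring at least one MIDI output name (iOS 11+) is what lets
    // hosts offer this unit's MIDI output for routing.
    override var midiOutputNames: [String] { return ["MIDI Out"] }

    override var internalRenderBlock: AUInternalRenderBlock {
        return { [weak self] _, timeStamp, _, _, outputData, _, _ in
            // 1. Silence the (mandatory) audio output bus.
            let buffers = UnsafeMutableAudioBufferListPointer(outputData)
            for buffer in buffers {
                memset(buffer.mData, 0, Int(buffer.mDataByteSize))
            }

            // 2. Send pending MIDI through the host-provided block.
            if let midiOut = self?.midiOutputEventBlock {
                let noteOn: [UInt8] = [0x90, 60, 100] // illustrative event
                _ = midiOut(AUEventSampleTime(timeStamp.pointee.mSampleTime),
                            0, noteOn.count, noteOn)
            }
            return noErr
        }
    }
}
```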

Once we get some official response from Apple (we are in contact with them), we will add official support for this in JUCE.


I’m having issues getting AUv3s recognised on macOS or iOS. Do I need to sign the code to make them work? (I’ve been holding off paying for dev licences until I’ve done a bit more dev work.)

Yes, it needs to be signed and sandboxed. And you need to launch the standalone wrapper app at least once. Since High Sierra, it seems you may also need to reboot.