SOUL Lang with Flutter and Dart:FFI

In the last year I have been using Flutter, a cross-platform app development framework which is really cool. It’s somewhat similar to React Native, but instead of depending on native UI components, you write Dart code using the Flutter library and it gets compiled to machine code. This makes it very powerful, performant and flexible. If you haven’t seen Flutter before I’d recommend checking it out!

When SOUL Lang was announced last year, I thought that creating a Flutter app with a SOUL audio backend would be an amazing way to create audio apps.

Now that the SOUL beta has been released, I’m having a go at integrating the two.

There is a beta feature in Dart called Dart:FFI which lets you call C code directly from Dart, and it can be used in Flutter apps. This seems like it could be a great way to make the link between Flutter and SOUL.
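As I understand it, Dart:FFI can only look up plain C symbols, so whatever the generated C++ class looks like, it would need a small extern "C" shim around it. A rough sketch of what I imagine that shim looking like (RingtoneStub here is a made-up stand-in, not the real generated code, and its render signature may differ):

```cpp
// Sketch of an extern "C" shim that Dart:FFI could bind to.
// RingtoneStub is a made-up stand-in for the class that
// `soul generate --cpp` emits; the real interface may differ.
#include <cmath>

namespace {
constexpr double twoPi = 6.283185307179586;

// Hypothetical stand-in for the generated SOUL processor.
struct RingtoneStub
{
    double phase = 0.0;
    double sampleRate = 44100.0;

    // Fill `out` with the next numFrames samples of a 440 Hz sine.
    void render (float* out, int numFrames)
    {
        for (int i = 0; i < numFrames; ++i)
        {
            out[i] = (float) std::sin (phase);
            phase += twoPi * 440.0 / sampleRate;
        }
    }
};
} // namespace

// Dart:FFI resolves plain C symbols, so the C++ object is hidden
// behind an opaque pointer and free functions.
extern "C" {
void* ringtone_create()          { return new RingtoneStub(); }
void  ringtone_destroy (void* p) { delete static_cast<RingtoneStub*> (p); }
void  ringtone_render (void* p, float* out, int numFrames)
{
    static_cast<RingtoneStub*> (p)->render (out, numFrames);
}
}
```

On the Dart side you would then load this as a dynamic library and bind the three symbols with dart:ffi lookups.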

There are 2 ideas I had for a useful first step to take:

  1. Take one of the standalone SOUL examples from SOUL/examples/standalone, compile it to C++ using the soul generate --cpp command, and try to call this from Flutter through Dart:FFI
  2. Try to create something similar to the SOULPatchHostDemo in Flutter

Do either of these seem realistic?

I had a go at idea 1, and the first blocker I hit is the following:

If I compile a SOUL file, for example ClassicRingtone.soul, to a C++ file, i.e. ClassicRingtone.cpp, how can I actually use this C++ file to make sound?

The compiled file has a ClassicRingtone class, which has a render method.
Is the idea that you have to call the render method every so often (e.g. every 10ms) to generate the next chunk of data to send to the speakers?
If so, do you have any idea of a minimal setup for doing this, and how I could do this from Flutter and send the output to the device speakers? Or is the idea that I should be using JUCE to do that?
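To make the timing concrete, here’s the arithmetic I’d expect if it is a block-pull model, where the caller asks render for one block of frames at a time and the interval falls out of block size and sample rate rather than an arbitrary 10 ms timer (these helpers are just my own sketch, not part of the generated code):

```cpp
// Back-of-envelope arithmetic for a block-based "pull" model:
// the caller asks render() for one block of frames at a time, so
// the interval between calls is fixed by block size and sample
// rate, not by an arbitrary 10 ms timer.

// Milliseconds between successive render calls for a given block size.
constexpr double millisecondsPerBlock (int blockSize, int sampleRate)
{
    return 1000.0 * blockSize / sampleRate;
}

// How many render calls happen per second (rounded down).
constexpr int blocksPerSecond (int blockSize, int sampleRate)
{
    return sampleRate / blockSize;
}
```

So at a typical 512-frame block and 44.1 kHz, render would be called roughly every 11.6 ms, 86 times a second, presumably driven by the audio device’s callback rather than a timer.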

(I saw this talk at this year’s ADC about integrating JUCE with React Native, by creating a JUCE app with a React Native app inside it, so perhaps the same is possible with Flutter, although I haven’t come across anyone else trying this yet.)

Or is the idea that I should include the SOUL command-line tool directly in the project, and then call it from Flutter to play a patch?

I think what would be most likely to work well (when Flutter has the right functionality) is if you were to take the SOULPatchHostDemo JUCE project and use a UIViewComponent to embed the FlutterView. Flutter Desktop is still very alpha AFAIK. Maybe one day this page will include details about how to embed the Flutter view on desktop.

I think there are probably several ways you could then communicate between Dart and the parent JUCE app, the FFI stuff being one of them.

Thanks for the help!
The link you sent is helpful, I’ll try to go down the route of embedding the Flutter app inside SOULPatchHostDemo!

I was aiming to build an app for iOS / Android, so I’m not too worried about using Flutter desktop right now.

In that case I think SOULPatchHostDemo is the wrong choice, because there are no SOUL dylibs for iOS (you’re not allowed to JIT-compile code on iOS). I’m not sure what the status of SOUL C++ export is, but I guess that is what you need to look into.

C++ export is still in a bit of a state - it’s usable but the interface is ugly. I’m tidying it up at the moment, so expect it to change in the coming weeks.

Ahh ok, I guess I should hold off on the iOS front for now then. I’ll have a go at hosting Flutter in a JUCE app to try and get some kind of music app going, then at a later date will work on integrating SOUL.

I have been doing some research on which platform to use, Flutter vs React Native, and the info you have posted here is very useful. Thanks a lot!

Awesome, I missed this thread, but am very much interested in the general idea.

I’ve been working on a very similar project, which I call Blueprint, that hosts a React.js application within your JUCE App/Plugin and allows the React app to delegate UI composition directives to JUCE. It’s a work in progress but already pretty far along: my latest plugin has an interface built totally in React.

My approach is almost exactly what @olilarkin mentioned: from my perspective there’s a ton of value in playing nicely with JUCE’s native wrappers, so that I can ship a product in plugin format, iOS format, desktop cross-platform, etc. We know from the SOUL team that they’re working on JUCE Apps/Plugins for hosting SOUL patches, which is great, and that means that if we tap in there, at the top-level view Component for UI rendering, we get all the benefits of JUCE at the native level. That’s what Blueprint does with React.

FWIW I think Dart:FFI is probably the wrong way to approach this kind of integration (disclaimer: I don’t know Dart that well). FFI allows you to call into C code, for example, but that call will initiate from within the Dart VM. SOUL’s main goal is to render DSP, so to effectively initiate that from within the Dart VM you’d need to figure out allocating a high priority thread for real time rendering and wiring that to a driver or a plugin host etc. Maybe the Dart VM already has facilities for that, or maybe you can do all of that by wrapping the SOUL patch in JUCE and exposing your own C API into Dart:FFI, but in my experience when your native code starts allocating threads and stuff, the FFI game gets very complicated.
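To illustrate what I mean by wrapping things natively and exposing your own C API: something like the sketch below, where the thread lives entirely on the C++ side and the FFI surface is just start/stop/query. All names here are made up, and the "render" step is only a counter; a real engine would hand blocks to an audio driver.

```cpp
// Sketch of hiding the real-time thread behind a tiny C API so that
// Dart:FFI only ever calls start/stop/query. All names are made up;
// a real engine would feed an audio device, not a counter.
#include <atomic>
#include <chrono>
#include <thread>

namespace {
std::atomic<bool> running { false };
std::atomic<int>  blocksRendered { 0 };
std::thread       audioThread;

void renderLoop()
{
    while (running.load())
    {
        // A real loop would call the generated render() here and hand
        // the block to the audio device; we just count iterations.
        blocksRendered.fetch_add (1);
        std::this_thread::sleep_for (std::chrono::milliseconds (1));
    }
}
} // namespace

extern "C" {
void engine_start()
{
    if (running.exchange (true))
        return;                      // already running
    blocksRendered.store (0);
    audioThread = std::thread (renderLoop);
}

void engine_stop()
{
    if (! running.exchange (false))
        return;                      // not running
    audioThread.join();
}

int engine_blocks_rendered() { return blocksRendered.load(); }
}
```

The point is that the Dart VM never owns the thread; it just flips the switch through the C boundary.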

So, that said, I think you’re spot on with going the other direction: hosting Flutter inside a JUCE app. I think you’ll find it easier that way and also you get all those native benefits of the JUCE wrappers and such. I’d be curious to hear how it’s going!

Thanks for this info Nick, it’s very helpful!

I’m currently on vacation for a few months so haven’t made any progress since my last post, but am still interested in investigating it when I return.

I had a look at your Blueprint project and it seems pretty cool! Am I right in thinking that it’s similar to React Native in that it takes React code and renders it using native widgets, in this case JUCE widgets?

So I imagine the approach for hosting a Flutter app inside JUCE wouldn’t be quite the same, as it wouldn’t use the JUCE UI widgets at all. Or am I missing something?

A big part of the puzzle I’m currently missing is how to share data / state / events between the Flutter app and the JUCE app. Something like:

Flutter UI element pressed -> event and data sent to JUCE app -> audio engine state updates -> confirmation message sent to Flutter -> Flutter UI updates

Do you know how this information flow might be achieved?

Tom O’Connor

Yep, you’re totally right, Blueprint renders using juce::Component and various specializations of juce::Component.

For Flutter, and presumably any similar integration, you kind of have two options: one is to let the Flutter engine compute the view hierarchy and layout and then completely hand rendering over to some other engine (such as JUCE, as is done in Blueprint). The other option is to keep the entire UI pipeline in the Dart VM. In the latter case, you’ll end up doing something probably closer to @adamski’s React Native/JUCE integration: https://medium.com/@adam.wilson/react-native-and-juce-tutorial-6b5fad35ad50, or like what Tom Duncalf presented at ADC: https://www.youtube.com/watch?v=bsy0-mHcS4Y (don’t know his forum handle).

In this latter approach, you’re basically talking about letting the Dart VM run the show, but spinning up a JUCE-based audio pipeline/thread for your audio rendering. I think in your case, the flexibility of the Dart VM and the FFI will determine how difficult that approach is. The reason I specifically didn’t go this way for Blueprint is because if I play into what JUCE has already built, I get all the plugin “mount points” and all the standalone app “mount points” just totally taken care of, but still get to write my UI in React.js. I imagine, for example, writing a VST plugin where the UI is actually managed by the Dart VM is a whole new can of worms.

As for event propagation and handling, Blueprint itself doesn’t exactly care about your state management, although my personal approach follows the Flux model of uni-directional data flow. The only thing Blueprint does for you here is propagating native events into the js environment. For example, when you use a <View/> in Blueprint, ultimately I’ll build a juce::Component for you to render that View, and that component instance will catch its mouseDown callback, check if your <View/> instance has its own mouse down callback, and forward the event to that callback in the js environment if so. At that point it’s totally up to you, the user, what you do with that callback and that state.

The way my GainPlugin example (see the Blueprint repo on GitHub) flows here is this: mouse click -> juce::Component mouseDown handler -> dispatch function call in the js environment -> my js callback does some state management and then calls a native method that I had previously registered -> that native method just turns and calls setValueNotifyingHost on the appropriate parameter -> the ValueTree updates -> the ValueTree listeners are notified -> one of the listeners is my own that broadcasts the new parameter value back to js. It sounds a little complicated, but it’s a pipeline that scales for all of my parameter handling with basically zero coupling and trivial effort, which makes things like custom macro controls exactly as easy as any other slider. Hmm… that’d probably make for a pretty neat video tutorial for Blueprint :)
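Stripped of the JUCE specifics, the middle of that pipeline is just a parameter store broadcasting to listeners. A toy model in plain C++ (ParameterStore here is my own stand-in for the AudioProcessorValueTreeState role, not real JUCE code):

```cpp
// Toy model of the round trip above, with JUCE stripped out:
// ParameterStore stands in for the value-tree role, and one listener
// plays the part of the bridge that pushes values back to the UI layer.
#include <functional>
#include <map>
#include <string>
#include <utility>
#include <vector>

class ParameterStore
{
public:
    using Listener = std::function<void (const std::string&, float)>;

    void addListener (Listener l) { listeners.push_back (std::move (l)); }

    // Analogous to setValueNotifyingHost: store the value, then notify
    // every listener, including the one forwarding back to the UI.
    void setValue (const std::string& id, float value)
    {
        values[id] = value;
        for (auto& l : listeners)
            l (id, value);
    }

    float getValue (const std::string& id) const { return values.at (id); }

private:
    std::map<std::string, float> values;
    std::vector<Listener> listeners;
};
```

Because every write funnels through setValue, any number of controls (sliders, macros, remote messages) get the same broadcast behaviour for free, which is the zero-coupling property I was describing.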

Thanks again for the very helpful and detailed response.

I think the blueprint approach sounds very sensible, especially in the case where you want to be able to make a VST.

Would definitely be interested in any tutorials you make on the subject!

I guess as JUCE has already solved the cross-platform app problem, it doesn’t necessarily make sense to try to involve another cross-platform app framework, as I imagine there would be a lot of work to do for the different use cases, i.e. use Flutter on mobile devices, and somehow use Flutter desktop for standalone apps and for VSTs.

However, in the case that you only care about making a mobile app, I think the Flutter setup could still be cool to try out.

The talk you linked from ADC was one of the things that made me interested in this setup!

I don’t see any reason to pursue the case of using the Flutter view hierarchy and rendering with JUCE components, as I might as well just use Blueprint instead!

Totally, that was exactly my thinking :)

Yeah, I think that’s totally fair! And the Flutter mobile setup (or React Native for that matter) is obviously a very well-developed solution. I think for that route you’ll definitely be looking at the SOUL C++ export, as @cesare and @olilarkin have mentioned, which I’m personally very excited to see as well.

Maybe you found a solution in the meantime? If not, my new GitHub project stub might be of help?

Flutter with JUCE - my solution approaching mobile accessibility

I haven’t found the time to make any more progress in the meantime, and to be honest was a bit out of my depth with the native iOS and Android config stuff.
Your project looks very interesting and I’ll check it out!