OWL platform now supports SOUL

:trumpet:
Pleased to announce that the OWL platform now supports SOUL patches!

OWL (aka OpenWareLab) is an open-hardware embedded audio platform with an online community of users and patch writers.

SOUL patches can now be easily compiled (using our online compiler and patch library) and run on commercial products such as the Magus, OWL Pedal, and Befaco Lich, as well as on DIY and hobby projects.

There is nothing particularly special about how this is done: we simply provide a wrapper for the generated SOUL C++ code that maps hardware controls (knobs etc.), audio I/O, and MIDI to the patch. Our online compiler (or offline build tool) produces asm.js output that runs in the browser, and also cross-compiles for our Cortex-M microcontrollers, packages the resulting binary in MIDI SysEx messages, and sends it straight to the device, which runs it dynamically.
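To illustrate the wrapper idea, here is a minimal sketch, assuming a hypothetical generated patch class and hypothetical wrapper names (this is not the actual OWL API, just the shape of the mapping from knobs and audio blocks onto the generated code):

```cpp
#include <cstdint>

// Stand-in for the C++ class a SOUL code generator might emit
// (the real generated class and its members will differ).
struct GeneratedSoulPatch {
    float gain = 0.0f;                 // an input parameter of the patch
    void render(const float* in, float* out, int frames) {
        for (int i = 0; i < frames; ++i)
            out[i] = in[i] * gain;     // trivial placeholder DSP
    }
};

// Hypothetical wrapper: maps hardware controls and audio I/O to the patch.
class PatchWrapper {
public:
    // Map a hardware knob (e.g. a 12-bit ADC reading, 0..4095)
    // onto a normalised patch parameter.
    void setKnob(uint16_t adcValue) {
        patch.gain = adcValue / 4095.0f;
    }
    void processAudio(const float* in, float* out, int frames) {
        patch.render(in, out, frames);
    }
private:
    GeneratedSoulPatch patch;
};
```

The wrapper is the only platform-specific piece; the generated patch code itself stays untouched, which is what makes unmodified SOUL patches portable to the device.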

The cool thing, to my mind, is that SOUL patches can be compiled unmodified to run on dedicated audio hardware - bare metal, no OS.

A first example patch is here:

More examples, docs, maybe even a video will follow (eventually!)


That’s very cool. How long does the compile → send → init process typically take?

About 5 seconds or so.


That’s quick!

Hey that’s very cool, and sounds like it’s working really well.

And I’ve just added an FM Electric Piano patch - https://www.rebeltech.org/patch-library/patch/FM_Electric_Piano


Our little Cortex-M4 processor is struggling with the polyphony - if you change it to a single voice it works well.

LatelyBass works monophonically too, lovely… :heart:

Side note:
I naively changed the number of voices on this line:

        voiceAllocator = TX81Z::PolyVoiceAllocator (-12, 1);

without considering the array allocation here:

        voices         = LatelyBassVoice[8];

I got this error:

error: Internal compiler error: ""object != nullptr" failed at operator*:59"

Of course my code is incorrect, but that is probably not the error message you want to produce. Also, the compiler exits with status 0 (OK) at this point.

Good bug spot, I’ll investigate. As for the performance, we’ll take a look and see what we can do.

I’ve fixed that bug in the latest version, which will be out in pre-release later today.

Performance-wise, I think the problem is related to two things.

First, I’m not sure that the M4 has double support? If not, it’ll be dropping back to software-emulated float64, which won’t be fun. I’d suggest replacing the float64s that appear in the envelopes with float32, as there are adds and multiplies for every sample for each envelope (4 per voice per sample).

In addition, the calculations for the oscillator shapes are based on transcendental functions (sin), and this again is one calculation per oscillator per sample. Rewriting this as a lookup table will probably make that problem go away, but it’s a little trickier to do, so the float64 change is the first call, I’d guess.
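The lookup-table idea above can be sketched like this - an illustrative, generic implementation (not the actual patch code): precompute one cycle of a sine wave and read it back with linear interpolation, so the per-sample cost is a couple of multiplies and adds instead of a `std::sin` call.

```cpp
#include <cmath>
#include <cstddef>

// One-cycle sine lookup table with linear interpolation.
// Illustrative sketch of the optimisation suggested above.
constexpr std::size_t kTableSize = 1024;
constexpr double kTwoPi = 6.283185307179586;

struct SineTable {
    float table[kTableSize + 1];       // +1 guard point simplifies interpolation
    SineTable() {
        for (std::size_t i = 0; i <= kTableSize; ++i)
            table[i] = static_cast<float>(std::sin(kTwoPi * i / kTableSize));
    }
    // phase is normalised to [0, 1)
    float lookup(float phase) const {
        float pos       = phase * kTableSize;
        std::size_t idx = static_cast<std::size_t>(pos);
        float frac      = pos - static_cast<float>(idx);
        return table[idx] + frac * (table[idx + 1] - table[idx]);
    }
};
```

With a 1024-entry table, the interpolation error is far below audibility for oscillator duty, and the table fits comfortably in Cortex-M SRAM.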

The code was an attempt to see how small and neat I could make a TX emulation, rather than worrying about runtime efficiency, and these are the areas that probably hurt the most.

Okay great, thanks.

Yes the M4 has a single precision FPU. It would be nice to be able to specify single precision output from the code gen (feature request 1!).

We already provide fast approximations of trig, exp and log functions, but the way the SOUL code gen works, we can’t easily plop them in.

With FAUST we can generate code that doesn’t use the std:: prefix for math.h functions, so we can easily substitute them using macros.

Another method that works well for us is how Heavy does it (Pure data patches): the generated code calls e.g. hv_sin(x), and then in a header file they provide

#ifndef hv_sin
#define hv_sin(x) sinf(x) // or std::sin(x) if you prefer
#endif

This makes it easy for us to provide alternative implementations.
Could either of these options be adopted with SOUL?
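To make the Heavy-style hook concrete, here is a sketch of how a port substitutes its own implementation: define the macro before the generated header’s fallback is seen. The `fast_sin` routine here is a hypothetical platform-provided approximation (stubbed with `std::sin` so the sketch compiles):

```cpp
#include <cmath>

// Hypothetical platform-provided fast approximation; a real port would
// substitute an optimised routine here.
static inline float fast_sin(float x) {
    return std::sin(x);   // placeholder body for the sketch
}

// The port defines the macro first...
#define hv_sin(x) fast_sin(x)

// ...so the generated header's fallback (as quoted above) is skipped.
#ifndef hv_sin
#define hv_sin(x) sinf(x)
#endif

// Generated DSP code then transparently calls the override.
float render(float phase) { return hv_sin(phase); }
```

Because the override is purely a preprocessor-level hook, the generated code never needs to change - which is exactly what makes this pattern convenient for embedded hosts.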

Another feature request:
It is fantastic and impressive that the generated code does not use heap allocations. Huzzah! However on the downside, all required memory is contained within the main graph itself (unless I’ve misunderstood). With embedded platforms we often have non-contiguous memory, with different access times, so being able to allocate things like delay buffers separately can make a big performance difference. Is this on your roadmap already?

edit: this was supposed to be a PM, sorry!

I also have an M7 device here, which runs 8 voices TX with no problem. It seems that the M7, with higher clock speed and more internal RAM, is closer to the sweet spot for SOUL. We don’t have any products with this MCU in production, but I could maybe send you a prototype. Are you a Eurorack user, or would you prefer a desktop device?

Also would be good to discuss (putting this here so I don’t forget!):

  • output parameters (block rate CV)
  • parameter assignments, e.g. to hardware knob, and attributes
  • handling buttons and triggers with sample accuracy
  • device independent debug output
  • output MIDI from patch
  • MIDI message port id, for routing MIDI between USB i/fs

That’s a good request about allowing other trig implementations, I’ll see what I can do about that (while trying to avoid a ton of ugly macros if possible!)

We’re just having a bit of a chat over here about the feasibility of being able to globally set float64 = float32 and whether that’s a sensible/practical option we should allow.

I’m not sure what you’d need us to do about some of your other requests there, though. We can’t really help you with knob assignment, that’s entirely up to your host program to decide how to map those events to inputs in the soul program, which is pretty much agnostic about the kind of data you feed it.

Likewise, buttons + triggers are as sample accurate as your host wants to make them. For example, in our patch wrappers, we have a FIFO for sample-accurate incoming events, and we feed them into the renderer at the appropriate positions by splitting the rendering blocks up. That’s not something the soul code itself will ever do for you, it’s a hosting task. But if you’re struggling with it, you can get inspiration from classes like

which we use for this kind of multiplexing task.
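The block-splitting idea reads like this in a generic sketch (illustrative, not the actual SOUL hosting classes): each queued event carries a frame offset within the block; the host renders up to each event, delivers it, then continues rendering.

```cpp
#include <algorithm>
#include <vector>

// An event with a sample-accurate position inside the current block.
struct Event { int frameOffset; int value; };

// Render a block, splitting it at each event position so events land
// exactly on the frame they were timestamped for.
template <typename RenderFn, typename EventFn>
void renderWithEvents(int blockSize,
                      std::vector<Event> events,
                      RenderFn render, EventFn apply) {
    std::sort(events.begin(), events.end(),
              [](const Event& a, const Event& b) {
                  return a.frameOffset < b.frameOffset;
              });
    int pos = 0;
    for (const Event& e : events) {
        if (e.frameOffset > pos)
            render(pos, e.frameOffset - pos);  // sub-block before the event
        apply(e);                              // sample-accurate delivery
        pos = e.frameOffset;
    }
    if (pos < blockSize)
        render(pos, blockSize - pos);          // render the remainder
}
```

For a 64-frame block with events at frames 16 and 48, this produces three render calls - (0,16), (16,32), (48,16) - with each event applied between them.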

Understood, yes, and apologies for a half-baked request. The longer version is:
It is frequently useful to embed host specific metadata, or attributes, in a patch. For hardware devices or reconfigurable GUIs it would allow assigning a patch parameter to a user control at the patch level, with the host wrapper remaining agnostic to the patch internals.
At the moment you have the .soulpatch metadata file, perhaps if parameter information was exposed here then custom attributes could be added in at this level?
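As a hedged sketch of the suggestion (the exact `.soulpatch` manifest keys may differ from what’s shown, and the `owl` block with its parameter names is entirely hypothetical), a host-specific section alongside the standard metadata might look like:

```json
{
  "soulPatchV1": {
    "ID": "com.example.FMElectricPiano",
    "description": "FM electric piano",
    "source": "FMElectricPiano.soul",

    "owl": {
      "parameterA": "cutoff",
      "parameterB": "resonance"
    }
  }
}
```

The wrapper would read only its own section, so patches stay portable across hosts that ignore keys they don’t recognise.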

I have seen that yes, and it is similar to how e.g. FAUST schedules incoming MIDI events. We will likely take a similar approach.

Of course - it’s just JSON so that different hosts can parse whatever they need from that file. If you want to have things in there which we don’t use, then you can go ahead and add them.
