Communication channels between audio-processor and its editor (instead of directly accessing objects)

If I understand correctly, SOUL-Lang based plugins, AAX DSP plugins, WPAPI plugins, and even old-school Audio Units distributed across Logic Nodes all require communication channels between audio processors and their editors, which may run on different machines and therefore cannot access the corresponding object directly via pointers in memory.

I’m certain there are parties with a strong interest in adding AAX DSP and WPAPI support to JUCE, but right now they have no clear path to do so because JUCE lacks the APIs for the required communication. If JUCE added the necessary channels, AAX DSP and WPAPI support could follow, which would ultimately be a big win for the live-audio crowd: a better selection of available plugins.

This would also promote cleaner separation of GUI code from the audio processing/logic/data layers.


I am not an experienced programmer. I am setting up synchronized ValueTrees. Will this be a legitimate approach for most purposes?
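For most purposes, yes: the principle behind synchronised ValueTrees is exactly the message-channel idea from the original post. Each side keeps its own copy of the state, and edits travel as serialised messages over whatever channel you have. Here is a minimal plain-C++ sketch of that principle (no JUCE; `Change`, `StateMirror`, and the string-based wire format are all illustrative inventions, not JUCE's actual API — in real JUCE code, `ValueTreeSynchroniser` plays this role with opaque byte blobs):

```cpp
#include <map>
#include <string>

// Hypothetical wire format: one property change per message.
struct Change { std::string key, value; };

// Each side (editor and processor) owns a StateMirror; neither
// ever touches the other's object directly.
struct StateMirror
{
    std::map<std::string, std::string> state;

    // Sending side: apply a local edit and produce the message
    // to be sent over the channel.
    Change makeChange (const std::string& k, const std::string& v)
    {
        state[k] = v;
        return { k, v };
    }

    // Receiving side: apply a message that arrived from the channel.
    void applyChange (const Change& c) { state[c.key] = c.value; }
};
```

As long as every edit goes through the channel (and the messages arrive in order), both copies stay in sync without any shared pointers, which is why the same pattern works across process and machine boundaries.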

This thread reminds me of Vinnie Falco’s VFLib, specifically its concurrency module.

With this system, in the simplest terms, you would have a CallQueue: the main thread (e.g. the UI thread) pushes values into it, and the audio callback (e.g. processBlock) reads them out. There’s a lot of scaffolding, but I’m sure you can figure it out if you’re curious.
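The CallQueue idea can be sketched in a few lines of plain C++. This is not VFLib’s actual API — the class name, `push`, and `drain` are illustrative — just a single-producer/single-consumer ring buffer of deferred calls, with the UI thread as producer and the audio thread as consumer:

```cpp
#include <array>
#include <atomic>
#include <cstddef>
#include <functional>

// Sketch of a CallQueue: the UI thread pushes std::function "calls",
// the audio thread drains and executes them at the top of its callback.
// Note: a production version would preallocate its closures, since
// std::function may heap-allocate, which is not realtime-safe.
class CallQueue
{
public:
    bool push (std::function<void()> fn)               // UI thread
    {
        auto head = writePos.load (std::memory_order_relaxed);
        auto next = (head + 1) % capacity;
        if (next == readPos.load (std::memory_order_acquire))
            return false;                               // queue full
        slots[head] = std::move (fn);
        writePos.store (next, std::memory_order_release);
        return true;
    }

    void drain()                                        // audio thread
    {
        auto tail = readPos.load (std::memory_order_relaxed);
        while (tail != writePos.load (std::memory_order_acquire))
        {
            slots[tail]();                              // run the queued call
            slots[tail] = nullptr;
            tail = (tail + 1) % capacity;
            readPos.store (tail, std::memory_order_release);
        }
    }

private:
    static constexpr std::size_t capacity = 128;
    std::array<std::function<void()>, capacity> slots;
    std::atomic<std::size_t> writePos { 0 }, readPos { 0 };
};
```

Typical use: a slider callback does `queue.push ([this] { gain = newValue; })`, and `processBlock` calls `queue.drain()` before rendering, so the audio thread only ever touches its own copy of the state.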