If I am going to use a DAW then I don't see the need for threading to build a multitimbral synthesizer, but if I want to target a Raspberry Pi then a thread for each instrument seems like it might be a good idea. If the instruments end up slightly out of phase, I don't think it would sound bad. Could you control the priority of 2 or 4 audio threads? How do DAWs manage swapping stuff into the live buffer? I assume different tracks and audio effects use different threads…
The most straightforward solution on something like a Raspberry Pi would be to just run the different instruments in different processes (thus automatically in different threads/CPU cores) and let the operating system/sound system take care of how exactly that happens. With something like JACK or PipeWire you might even be able to set up more advanced routing.
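To make the separate-process idea concrete, here is a rough routing sketch using the JACK command-line tools. It is not runnable as-is: the instrument binaries and their port names are made up, and the sample rate/period size are just illustrative values you would tune for the Pi.

```shell
# Start the JACK server on the Pi's ALSA device (rate/period values illustrative).
jackd -d alsa -r 48000 -p 256 &

# Launch each instrument as its own process (hypothetical binaries).
./instrument_one &
./instrument_two &

# List the ports each client exposes, then wire them to the hardware outputs.
jack_lsp
jack_connect instrument_one:out_l system:playback_1
jack_connect instrument_one:out_r system:playback_2
jack_connect instrument_two:out_l system:playback_1
jack_connect instrument_two:out_r system:playback_2
```

With PipeWire the same wiring can be done with `pw-link`, and its JACK compatibility layer means the `jack_*` clients above generally still work.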
Making multithreaded audio work in a single process yourself can get complicated, even if you allow additional latency and CPU use. It's a problem I've basically given up working on myself, but I have been looking into the Tracktion Graph library, which does have multithreaded options for playing the graph. I haven't tried the multithreading in it yet, though, because even single-threaded, the library has some challenges with how it must be used.
Thanks Xeniakos, nice to see your name and always helpful responses. I thought of trying to build for a Zynthian, but that's all hearsay because those can't be bought. I wish there was a simple, affordable hardware box that JUCE could target other than a computer.