Multithreaded architecture advice


#1

Hey! Remember me? I’m still enjoying JUCE, though on Linux nowadays. (Transitioning development to Linux turned out to be a piece of cake, thanks largely to JUCE. GCC error messages could be less cryptic, though.)

I’m writing an app that will render audio in real time, much like a modest virtual studio.
So I have a GUI component tree in the regular JUCE style, and I’m thinking I should put the audio rendering in a thread of its own and raise its priority a little. So far so good, but how do I best send data back and forth between the threads?

My current thought is to make a singleton data manager that collects the stuff from the GUI, and have the audio thread lock it once per buffer cycle or so and copy the data into local variables for use in the DSP processes.

But I’ve never done anything like this before so I’d like to hear what others make of this before I start banging out the code.

Cheers


#2

You’d better not render audio in a separate thread of your own and try to synchronize with it. Just divide your classes into two parts: the core and the GUI. The core works in your high-priority audio thread and renders the audio; the GUI just manipulates the core’s properties to change the audio in real time (no need to lock, only interpolation). If the core class needs to talk to the GUI, notifying it of state changes and such, just send asynchronous messages through a queue that the GUI will pick up (so no locking is needed there either).

Remember that the audio thread should be run with higher priority, possibly even realtime priority, so any kind of locking should be minimized, if not avoided altogether.
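To make the “asynchronous message queue, no locking” idea concrete, here is a minimal single-producer/single-consumer ring buffer sketch in plain C++ (not JUCE API; all names here are illustrative):

```cpp
#include <array>
#include <atomic>
#include <cstddef>
#include <optional>

// Minimal single-producer/single-consumer ring buffer: the audio thread
// pushes state-change messages, the GUI thread pops them. No locks are
// taken; each index is only ever written by one side.
template <typename T, std::size_t Capacity>
class SpscQueue
{
public:
    bool push (const T& item)                     // called from the audio thread
    {
        const auto head = head_.load (std::memory_order_relaxed);
        const auto next = (head + 1) % Capacity;
        if (next == tail_.load (std::memory_order_acquire))
            return false;                         // queue full, drop the message
        buffer_[head] = item;
        head_.store (next, std::memory_order_release);
        return true;
    }

    std::optional<T> pop()                        // called from the GUI thread
    {
        const auto tail = tail_.load (std::memory_order_relaxed);
        if (tail == head_.load (std::memory_order_acquire))
            return std::nullopt;                  // queue empty
        T item = buffer_[tail];
        tail_.store ((tail + 1) % Capacity, std::memory_order_release);
        return item;
    }

private:
    std::array<T, Capacity> buffer_ {};
    std::atomic<std::size_t> head_ { 0 }, tail_ { 0 };
};
```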

cheers


#3

So, I can dispense with a lockable middle man altogether?

To make sure I get what you are saying, can I do something like this?

class AudioProcess : in-a-thread-of-its-own {

private:
   float volumeVariable;

public:
   void setVolume (float whatever) {
      volumeVariable = whatever;
   }
};

//*******
class myGui {

   void mySliderMoved() {
      audioProcess->setVolume (whatever);
   }

};

#4

Of course. The only drawback is that if your slider jumps from 0.0 to 1.0 you will hear a click in the audio, because the transition step is too big (most other parameters shouldn’t suffer from this, but some do).
You only need to interpolate your volume value in your processing block… typically 64/128 samples are enough.
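A minimal sketch of that kind of per-block interpolation, in plain C++ (names are illustrative, not JUCE API): the GUI writes a target, and the audio thread ramps towards it over the block instead of jumping.

```cpp
#include <cstddef>

// Per-block parameter smoothing: instead of jumping straight to the
// value the GUI wrote, ramp towards it over the block so a 0.0 -> 1.0
// slider move doesn't produce an audible click.
class SmoothedGain
{
public:
    void setTarget (float newTarget)  { target = newTarget; }   // GUI thread

    void process (float* samples, std::size_t numSamples)       // audio thread
    {
        // ramp across the block (64-128 samples is usually plenty)
        const float step = (target - current) / static_cast<float> (numSamples);

        for (std::size_t i = 0; i < numSamples; ++i)
        {
            current += step;
            samples[i] *= current;
        }
    }

private:
    float current = 0.0f, target = 0.0f;
};
```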


#5

Oh, ok. Thanks.
I thought that could cause deadlocks or something, and that locking was needed to prevent both threads from accessing something at the same time. :smiley:


#6

No, deadlocks are impossible (well, possible if you abuse locks!!!). But you could get a segmentation fault when you do something like this:


class AudioProcess : in-a-thread-of-its-own {
public:
   void process() {
       for (int i = dspClass.size(); --i >= 0;)
            dspClass.getUnchecked (i)->processMyStuff();
   }

   void thrashDsp (const int index)
   {
        dspClass.remove (index, true);
   }

private:
   OwnedArray<MyObject> dspClass;
};

//*******
class myGui {

   void mySliderMoved() {
      audioProcess->thrashDsp (1);
   }

};

So you could delete the object while it’s in use, or mess up the indexes and the array size.

You would rather put the object on an ignore list, which gets pushed off the processing array in the audio loop (after it has finished processing); then the audio loop sends a message to the GUI saying the object is ready to be deallocated (of course you shouldn’t mess with memory allocation in the audio thread!), since it’s now out of sight from the audio callback’s point of view…
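Here is a rough sketch of that deferred-deletion idea in plain C++ (class and method names are made up for illustration, and a real version would also avoid the vector allocations in the audio loop): the GUI only flags an object as retired; the audio loop skips it and moves it to a pending list after the block, and the GUI frees it later.

```cpp
#include <atomic>
#include <cstddef>
#include <memory>
#include <vector>

// Hypothetical DSP object: carries an atomic "retired" flag the GUI sets.
struct DspObject
{
    std::atomic<bool> retired { false };
    int processedBlocks = 0;
    void processMyStuff() { ++processedBlocks; }
};

class AudioProcess
{
public:
    void add (std::unique_ptr<DspObject> obj)  { active.push_back (std::move (obj)); }

    void retire (std::size_t index)            // GUI thread: just flags it
    {
        active[index]->retired.store (true, std::memory_order_release);
    }

    void processBlock()                        // audio thread
    {
        for (auto& obj : active)
            if (! obj->retired.load (std::memory_order_acquire))
                obj->processMyStuff();

        // after processing, hand retired objects off for deletion elsewhere
        for (auto it = active.begin(); it != active.end();)
        {
            if ((*it)->retired.load (std::memory_order_acquire))
            {
                pendingDelete.push_back (std::move (*it));
                it = active.erase (it);   // in real code, avoid allocation here
            }
            else
                ++it;
        }
    }

    void flushPending()                        // GUI thread frees the memory
    {
        pendingDelete.clear();
    }

    std::size_t activeCount() const   { return active.size(); }
    std::size_t pendingCount() const  { return pendingDelete.size(); }

private:
    std::vector<std::unique_ptr<DspObject>> active;
    std::vector<std::unique_ptr<DspObject>> pendingDelete;
};
```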


#7

Ok. Thanks.

Another related thing that came up when I started banging this out: A Juce thread is supposed to do its business in the “run” method, but the audio interface is callback driven.

scratches head

So, what happens if I inherit from both Thread and AudioDeviceIOCallback? I must be getting something fundamentally wrong here, I think…

(I shall have to cut down on my beer consumption. While coding anyway.)

edit:
I should add that my audio class works, derived from Thread and AudioDeviceIOCallback, with my white-noise AudioSource hooked to the actual callback.
But does this mean that it is actually running in a thread of its own now, even though “run()” is empty?


#8

Yes, totally wrong. You can, but they will behave differently anyway. You should write your source and plug it into the device callback; any other thread is useless here.


#9

Hmm, are you saying that what I want isn’t possible then?

I.e., I want one thread/process to flow from the deviceIOCallback, wherein I do the DSP, and another that handles the main JUCE event-driven GUI stuff.


#10

Rock, I think you might be getting a bit muddled…

The audio callback happens on a thread that’s owned and managed by the audio device itself, so you don’t have to create one or do anything; just respond to the callbacks. The GUI thread is always there anyway, so no need to create one of those either.

The only reason you might want your own thread would be for something like e.g. background caching of wave data, ready to get passed to the audio callback, but that’d be something you’d write specifically for a particular task.
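As a toy illustration of that mental model (plain C++, nothing to do with the real device code): the “device” owns the high-priority thread and just invokes whatever callback you hand it, so all you ever implement is the callback, as with JUCE’s AudioIODeviceCallback.

```cpp
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>

// Toy stand-in for an audio device: it owns the thread and calls your
// callback on it. You never create that thread yourself.
class FakeAudioDevice
{
public:
    void start (std::function<void (float*, int)> callback)
    {
        running = true;
        worker = std::thread ([this, callback]
        {
            float buffer[64] = {};
            while (running)
            {
                callback (buffer, 64);   // your DSP runs here, on the device's thread
                std::this_thread::sleep_for (std::chrono::milliseconds (1));
            }
        });
    }

    void stop()
    {
        running = false;
        if (worker.joinable())
            worker.join();
    }

private:
    std::thread worker;
    std::atomic<bool> running { false };
};
```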


#11

I didn’t say it was impossible.

What do you mean by one thread to flow from the deviceIOCallback?
The audioDeviceCallback is just a function that will be executed in a thread. Why the hell would you want to make it inherit from Thread as well?

What you say is true only if you want to manage the low-level interface with the audio driver yourself (so do you actually want to rewrite part of ASIO, DSound, JACK, PortAudio?)


#12

[quote=“jules”]Rock, I think you might be getting a bit muddled…
[/quote]
No “might” about it! I’m a total noob to threading.

Ah, clarity at last. Cheers.


#13

[quote=“kraken”]I didn’t say it was impossible.

What do you mean by one thread to flow from the deviceIOCallback?
The audioDeviceCallback is just a function that will be executed in a thread. Why the hell would you want to make it inherit from Thread as well?
[/quote]
I didn’t know it was already in a thread of its own. I thought the callback came from the main event system in JUCE, so that it could be delayed by a backed-up event queue.

Maybe there should be a mention of this in the doc page for AudioDeviceIOCallback or something. Or is it just me?


#14

[quote=“Rock Hardbuns”]
Maybe there should be a mention of this in the doc page for AudioDeviceIOCallback or something. Or is it just me?[/quote]

That’s quite obvious; audio is realtime stuff, it can’t be run from message queues, or we wouldn’t be talking about latency but Methuselatency :slight_smile:


#15

I don’t know, but I think JUCE is rather unique in being set up for real-time audio in this manner. Game libs and such typically aren’t, in my experience; they also tend to be procedural as opposed to OO, of course.

One little sentence could have kept me out of the thorny shrubs of conceptual confusion. Just sayin’. :wink:

Anyway. All is well. Thanks.


#16

From what I know, apart from OpenAL, ALL the other audio libraries/driver interfaces out there are callback driven (managing the realtime audio thread for you). JUCE is no different :slight_smile:
After all, JUCE’s AudioIODeviceCallback class is just a callback function packaged into an object… nothing more than that!


#17