Communication between different Plugins within the same host

Is there a way to get two or more different JUCE plugins communicating when their DLLs are loaded into the same host process?

I tried the singleton approach, but that only works when the two plugins are instances of the same DLL. I need communication between completely different plugins, not between instances of the same one.

I was thinking about using “broadcastMessage”, but that seems to be intended for communication between different processes only, not for communication within a process.

I’m trying to make one plugin read audio data from another one. I hope someone has had this problem before and there is an elegant way to do this.

Thanks for any help.

If an upstream plugin puts audio data in an output channel, it’ll be available in the input channel of a downstream plugin. By downstream I mean closer to the audio out. You could pass control information by inserting it into the MidiBuffer as controller or sys ex messages to be read by a downstream plugin.
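As a rough illustration of the MIDI idea (plain C++, no JUCE; in a real plugin you'd build the message with MidiMessage::controllerEvent() and insert it into the processBlock's MidiBuffer via MidiBuffer::addEvent()), a control parameter could be packed into a 7-bit CC value like this. Function names are made up for the sketch:

```cpp
#include <cstdint>

// Hypothetical sketch: encode a 0..1 float parameter as a 7-bit MIDI CC
// value in the upstream plugin, decode it again in the downstream one.
// MIDI CC data bytes are limited to 0..127, hence the scaling.
uint8_t encodeAsCC(float value)
{
    if (value < 0.0f) value = 0.0f;          // clamp to the valid range
    if (value > 1.0f) value = 1.0f;
    return static_cast<uint8_t>(value * 127.0f + 0.5f);
}

float decodeFromCC(uint8_t ccValue)
{
    return static_cast<float>(ccValue) / 127.0f;
}
```

The precision is only 7 bits per message; for finer control you'd need to pair two CCs (MSB/LSB) or use sysex, as mentioned above.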

you could use the InterprocessConnection classes?

Thanks for the hint — using an InterprocessConnection with a NamedPipe works. Now I can send audio data from one plugin to a completely different one. Really nice!

To make this useful in practice I have two more questions:
Assume there is a plugin A which takes audio data in its processBlock and sends it to plugin B, which needs this data at a specific point in its own processBlock. How can I synchronize this connection to make sure plugin B’s processBlock (or at least the point where the transmitted audio data is needed) is called after plugin A has sent the data in its processBlock? Is there a way to make plugin B wait for a signal? I’m afraid that waiting (however it is implemented) within a processBlock could cause trouble.
What is the best way to deal with this synchronisation problem?

The second question is not that important, but I’d like to know if there is something like the InterprocessConnectionServer for NamedPipes, too? That would make it much easier to set up more complex connections. The InterprocessConnectionServer is great, but when I use sockets instead of NamedPipes, I have to deal with firewalls and some other stuff I’d like to avoid.


Oh wow, I thought you just meant sending general info between plugins, not between their realtime code! If you need stuff to get passed around in your processBlock, then you’d need to use shared memory and events, but even those would probably need to be done via a background thread to keep it running smoothly. That’s really quite a complex problem to implement!
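A minimal sketch of the kind of structure being described here, assuming a single-producer/single-consumer setup (one writer on the sending plugin's audio thread, one reader on a background or receiving thread). The class name and capacity are illustrative, not a JUCE API — the key property is that push() never blocks the audio thread:

```cpp
#include <atomic>
#include <cstddef>

// Single-producer/single-consumer lock-free ring buffer. One thread may
// call push(), one other thread may call pop(); no mutex is ever taken.
template <size_t Capacity>
class SpscRing
{
public:
    bool push(float sample)
    {
        size_t w = writePos.load(std::memory_order_relaxed);
        size_t next = (w + 1) % Capacity;
        if (next == readPos.load(std::memory_order_acquire))
            return false;                 // full: drop rather than block
        buffer[w] = sample;
        writePos.store(next, std::memory_order_release);
        return true;
    }

    bool pop(float& sample)
    {
        size_t r = readPos.load(std::memory_order_relaxed);
        if (r == writePos.load(std::memory_order_acquire))
            return false;                 // empty
        sample = buffer[r];
        readPos.store((r + 1) % Capacity, std::memory_order_release);
        return true;
    }

private:
    float buffer[Capacity] {};
    std::atomic<size_t> writePos { 0 };
    std::atomic<size_t> readPos  { 0 };
};
```

A background thread could drain a ring like this into the pipe or shared memory, keeping the audio callbacks free of blocking calls.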

not really, you can just use some global variables and some global functions for memory exchange. this works because (at least for VST plugins on Windows) the DLL is loaded only once; every time a new instance of the plugin is made, a new object is created for it, but the global variables stay the same.

i did that in one of my VST plugins and it works quite well.
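A minimal sketch of that global-variable approach, assuming all instances live in the same loaded DLL. The registry and function names are made up, and a mutex is fine for a demo, but real audio-thread code would want a lock-free scheme:

```cpp
#include <map>
#include <mutex>

// Globals shared by every instance of the *same* plugin DLL: the DLL is
// loaded once per host process, so all instances see the same data.
static std::mutex registryLock;
static std::map<int, float> sharedValues;   // e.g. instanceId -> last peak

void publishValue(int instanceId, float value)
{
    std::lock_guard<std::mutex> lock(registryLock);
    sharedValues[instanceId] = value;
}

bool readValue(int instanceId, float& value)
{
    std::lock_guard<std::mutex> lock(registryLock);
    auto it = sharedValues.find(instanceId);
    if (it == sharedValues.end()) return false;
    value = it->second;
    return true;
}
```

As noted below, this only works between instances of the same DLL — a different plugin DLL gets its own copy of these globals, which is exactly the OP's problem.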

[quote=“zamrate”]not really, you can just use some global variables and some global functions for memory exchange. this works because (at least for VST plugins on Windows) the DLL is loaded only once; every time a new instance of the plugin is made, a new object is created for it, but the global variables stay the same.

i did that in one of my VST plugins and it works quite well.[/quote]

The OP said

Yep, that’s the problem. The communication between two instances of the same plugin is pretty simple. I did that myself using a singleton class and it worked quite well.

I really need some kind of IPC between different plugins (different DLLs). Right now, I’m at the point where I can pass AUDIO data from the processBlock of plugin A, and plugin B receives that data in a callback routine. But there is no synchronisation implemented to ensure plugin B receives the data before it is needed in its processBlock.

So the question is: is it impossible to set up synchronized communication between these two (or maybe more) realtime threads with NamedPipes, even if I use the connection’s own thread rather than the message thread for passing the data? What Jules said sounded like that to me.

If there is no way to do that with named pipes, what would be a good starting point for setting up platform-independent communication using shared memory? These things are pretty new to me, but there is no way around them if I want to get my stuff working.
So any help is appreciated.
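For what it's worth, a possible starting point on POSIX systems (Boost.Interprocess wraps the same primitives portably; on Windows the equivalent would be CreateFileMapping/MapViewOfFile). The segment name and layout below are invented for the example, and this deliberately leaves out all synchronization:

```cpp
#include <sys/mman.h>
#include <fcntl.h>
#include <unistd.h>

// A fixed layout both plugins agree on. Real code would also need a
// sequence counter or semaphore to coordinate reader and writer.
struct SharedAudioBlock
{
    int   numSamples;
    float samples[512];
};

// Either creates the named segment (first plugin) or attaches to an
// existing one (second plugin), then maps it into this process.
SharedAudioBlock* openSharedBlock(const char* name, bool create)
{
    int flags = create ? (O_CREAT | O_RDWR) : O_RDWR;
    int fd = shm_open(name, flags, 0666);
    if (fd < 0) return nullptr;
    if (create && ftruncate(fd, sizeof(SharedAudioBlock)) != 0)
    {
        close(fd);
        return nullptr;
    }
    void* mem = mmap(nullptr, sizeof(SharedAudioBlock),
                     PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);   // the mapping stays valid after closing the descriptor
    return mem == MAP_FAILED ? nullptr
                             : static_cast<SharedAudioBlock*>(mem);
}
```

Both DLLs calling openSharedBlock() with the same name end up looking at the same memory, regardless of which plugin they came from — which is the part that globals and singletons can't give you across different DLLs.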

I don’t think there is a reliable way to do this. Other plug-ins out there that do sidechaining using a two-plugin approach like you describe exhibit clear synchronization problems at high buffer sizes, or when tracks are in the “wrong” order (i.e. the send is to the right of the sidechained plug – a common limitation, but implementation-dependent). God forbid that you introduce hardware-based plug-ins or other sources of latency into the equation.

A clear illustration of why it can’t be made to work correctly is that certain hosts will stop calling process when there is no audio region active on a track. If this were to occur on the track where your sending plug is inserted, and your sidechained plug were to block waiting for a buffer of the key signal to arrive, then you’d be stuck. If you have n instances of such a plug-in, where n is the number of audio threads, then you can easily imagine a deadlock.

Basically, there are no guarantees about the order in which two plug-ins will be called (and the order might change from buffer to buffer), and furthermore, there is no guarantee that a particular plug-in will be called during a given buffer period.

I’m not sure; the synchronization stuff may be pretty complicated but perhaps not impossible (I think I’m going to try it with shared memory using the Boost C++ library). Your second point made me think…
I know there are hosts that don’t call the processBlock when no data is present, but there must be a way to disable that, otherwise these hosts wouldn’t support reverb or delay plugins, or plugins that produce feedback and other effects that continue when no audio region is present.

I thought this could be done by setting the “JucePlugin_SilenceInProducesSilenceOut” flag to false, but this doesn’t do anything, at least for VST plugins, even though it sets the noTail function of the VST SDK to false. I checked that with two hosts (Samplitude and Logic) that don’t call the processBlock if the input is zero. It made no difference at all.

How do other developers force these hosts to call the processBlock when there is no sound at the input? If there’s an answer to this question, the whole thing might be done.

I have a similar issue and I use a third ‘common’ DLL that is linked into both the VST DLLs. Singletons in the common DLL are only instantiated once.
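In case it helps, that mechanism can be sketched as an accessor in the common library: since that library is loaded once per host process, a function-local static is constructed exactly once, and every plugin DLL linking against it gets the same object. The class and function names below are illustrative:

```cpp
#include <vector>
#include <string>
#include <mutex>

// Lives in the shared 'common' library that both plugin DLLs link against.
// The function-local static is constructed once per process, so all
// plugins calling instance() receive the same PluginBus.
class PluginBus
{
public:
    static PluginBus& instance()
    {
        static PluginBus bus;   // one per host process, not per DLL
        return bus;
    }

    void registerPlugin(const std::string& name)
    {
        std::lock_guard<std::mutex> lock(mutex);
        names.push_back(name);
    }

    size_t pluginCount()
    {
        std::lock_guard<std::mutex> lock(mutex);
        return names.size();
    }

private:
    PluginBus() = default;
    std::mutex mutex;
    std::vector<std::string> names;
};
```

The crucial detail is that the singleton's definition must be compiled into the common library only — if each plugin DLL compiled its own copy, you'd be back to one instance per DLL.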

I also wrote a VST plugin a while ago (not using juce) that combined audio from multiple plugin instances running in the same host. I managed to get it to work (in Cubase at least) by using a singleton to monitor which order the plugins were actually being called and responding accordingly.

That sounds interesting, maybe I’ll give it a try, although meanwhile I got the shared memory stuff working, at least on Win32. I’d like to know if you found a way with your setup to synchronize things properly. How did you manage to make sure that one plug is processed before another? In case this kind of sync is relevant for your plugs, any ideas regarding it are appreciated.

Something to bear in mind is that hosts are starting to be more multithreaded these days - so on a multiprocessor machine you’d need to cope with your plugins being called at the same time…

When I did it, it was on a single-threaded version of Cubase, so I didn’t encounter any threading issues. The algorithm was something like this:

each plugin had a parameter that the user could set to tell it which input pair it was feeding and another one for the output pair.

when the plugin received its processReplacing call, it delivered its audio ins to the ‘master’ dll along with the current song time from the host and whatever its input/output parameters (mentioned above) were set to.

it then asked the master dll for its outputs, which it sent back to the host.

all processing was done in the master dll. this was able to tell, from the current song time coming from each plugin, which order they were being called in by the host. I had a switch which allowed you to choose between two different processing modes:

option 1) fast mode: assume that the ins and outs are coming in the right order and process them without any latency, or

option 2) safe mode: don’t assume anything about the order and use the inputs from the previous audio block to provide outputs for this audio block. obviously this introduces latency of 1 blocksize, but you should have all the data you need regardless of the order.

I suspect neither of these would work in Reaper, with its look-ahead processing, etc.

IMHO this sort of thing is a can of worms, but on the other hand, if it was easy everyone would be doing it, so don’t let that put you off.