Another "newbie" question - Processor to Editor communication


Trying to wrap my arms around how plugins communicate back and forth between the Processor and Editor sides.

On the Editor side we get a “link” to the Processor via the “MyPluginAudioProcessor& processor” reference, which allows the Editor to access and update the Processor. Now, what is providing the link in the OTHER direction, i.e., how does the Processor send changes back to the Editor (to update the GUI elements)?


The recommended approach is not to explicitly “send” anything into the GUI editor from the audio processor. That’s hinted at by the fact that JUCE provides no direct way to access the editor from the audio processor code. It is possible to do (by casting the active editor pointer to its concrete type), but almost always a bad idea, for example because many of the GUI object methods cannot be called from the audio processing thread. Obviously, in some special cases some direct communication needs to be achieved. But do you currently have any actual reason to do that?

The usual approach is to have a timer in the GUI editor that polls for changes in the audio processor and acts accordingly.


That explains why I can’t find it!

OK, any simple examples you could point me to on how to do this? I’m trying to understand the “flow chart” with the code required to make it work.

Thank you!


The AudioPluginDemo that is in the JUCE sources uses the Timer technique to update the time position display in the GUI. (The editor class inherits Timer, and the overridden timerCallback updates the label with information from the audio processor.)
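A minimal skeleton of that technique might look like this. (This is a sketch, not the AudioPluginDemo code itself: `MyPluginAudioProcessor` and its `currentPosition` member are placeholders, and the fragment only compiles inside a JUCE project.)

```cpp
// PluginEditor.h (sketch): the editor inherits juce::Timer and polls the processor.
class MyPluginAudioProcessorEditor : public juce::AudioProcessorEditor,
                                     private juce::Timer
{
public:
    explicit MyPluginAudioProcessorEditor (MyPluginAudioProcessor& p)
        : juce::AudioProcessorEditor (p), processor (p)
    {
        addAndMakeVisible (positionLabel);
        startTimerHz (30);          // poll the processor ~30 times a second
        setSize (400, 120);
    }

private:
    void timerCallback() override
    {
        // Runs on the message thread, so touching the Label is safe here.
        // currentPosition is assumed to be a std::atomic the audio thread writes.
        positionLabel.setText (juce::String (processor.currentPosition.load()),
                               juce::dontSendNotification);
    }

    MyPluginAudioProcessor& processor;
    juce::Label positionLabel;
};
```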


Thank you, I will study that example. HOWEVER, I’ve run into a problem with the JUCE examples before: they are not set up the same way Projucer sets them up. The demos have the Processor and Editor all in one file, whereas Projucer sets up the project with separate Processor and Editor files. That means a BUNCH of required cross-communication code is not shown in the examples, which is where I need the help.


The code is the same regardless of whether you use the single-header approach or the two-headers-plus-two-source-files approach. The AudioPluginDemo code has the oddity that the GUI editor class is nested inside the AudioProcessor class, but that doesn’t change anything either. (It’s just about how the classes are declared; they would still be used the same way in the Projucer-generated projects that have the two header files and two source files.)

The reason you are not seeing much communication code in the examples may be that they are written in a way where it isn’t needed. For example, the AudioPluginDemo uses AudioProcessorValueTreeState and AudioProcessorValueTreeState::SliderAttachment to get the sliders to change the AudioProcessor parameters. The sliders also automatically change their positions if the host automates the parameters, thanks to the attachments.
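For completeness, a sketch of that attachment approach (the parameter ID "gain" and all member names here are placeholders, and again this only compiles inside a JUCE project):

```cpp
// In the processor (sketch): declare the parameter tree as a public member.
juce::AudioProcessorValueTreeState apvts {
    *this, nullptr, "PARAMS",
    { std::make_unique<juce::AudioParameterFloat> ("gain", "Gain", 0.0f, 1.0f, 0.5f) }
};

// In the editor (sketch): one attachment keeps the slider and the parameter
// in sync in BOTH directions -- no listener or timer code needed.
juce::Slider gainSlider;
std::unique_ptr<juce::AudioProcessorValueTreeState::SliderAttachment> gainAttachment;

// In the editor constructor:
addAndMakeVisible (gainSlider);
gainAttachment = std::make_unique<juce::AudioProcessorValueTreeState::SliderAttachment> (
    processor.apvts, "gain", gainSlider);
```

The attachment object registers itself as a listener on both ends, which is why dragging the slider updates the parameter and host automation moves the slider, with no explicit communication code in your editor.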

It may be that none of the JUCE-provided examples use the old-fashioned way of dealing with the parameters and the GUI components any more, where one would explicitly need to set up slider listeners to change the audio processor parameters and use a Timer to update the slider positions when the parameters are automated.
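For reference, that old-fashioned way looks roughly like this (sketch only; `gainParam` is a placeholder for a `juce::AudioParameterFloat*` held by the processor, and the editor is assumed to inherit both `juce::Slider::Listener` and `juce::Timer`):

```cpp
// GUI -> processor: the slider listener writes the parameter.
void sliderValueChanged (juce::Slider* s) override
{
    if (s == &gainSlider)
        *processor.gainParam = (float) s->getValue();  // notifies the host too
}

// Processor -> GUI: the timer reflects host automation back into the slider,
// without re-triggering the listener above.
void timerCallback() override
{
    gainSlider.setValue (processor.gainParam->get(), juce::dontSendNotification);
}
```

This works, but it is exactly the boilerplate the attachment classes were introduced to eliminate.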

The Projucer-generated plugin projects naturally don’t have any processor<->GUI communication code, apart from passing a reference to the processor object into the GUI editor. It’s up to you to add parameters to your plugin processor object and come up with some way to connect them to the GUI. AudioProcessorValueTreeState and the component attachments are probably the easiest way to do it currently.