My app is an audio plugin that sits in the sequencer process and has its UI in a separate executable, which means of course that the UI sees the plugin as an out-of-process dll. Now I have that fascist inter-process-messaging-groove-thang licked, but I’m a mite confused when it comes to the event-handling code in the juce framework. I need to bind my UI controls to parameters for the dll & vice-versa: my UI queries the dll for a list of supported in/out parameters with their types, max/min values etc., and should build a control or display component for each parameter, depending on its type. All this is hunky-dory, but I’m at a loss as to the best way to bind the components to event handlers.
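To make the question concrete, here’s a rough sketch of what I mean by building a control per parameter type. The descriptor struct, field names, and the type-to-widget mapping are all my own invention, not anything the dll actually returns:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical descriptor for one plugin parameter, as the UI
// might receive it from the dll's query call (names are assumptions).
enum class ParamType { Continuous, Toggle, Choice, ReadOnly };

struct ParamInfo {
    std::string name;
    ParamType   type;
    float       minValue;
    float       maxValue;
};

// Pick a widget kind for each parameter based on its type; the real UI
// would instantiate the matching juce component (Slider, ToggleButton...)
// instead of returning a name string.
std::string widgetFor(const ParamInfo& p) {
    switch (p.type) {
        case ParamType::Continuous: return "Slider";
        case ParamType::Toggle:     return "ToggleButton";
        case ParamType::Choice:     return "ComboBox";
        case ParamType::ReadOnly:   return "Label";
    }
    return "Label"; // fallback for unknown types
}
```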
I’ve read elsewhere in this forum that it’s “good design” to use a separate class to handle each event type, as opposed to a gigantic switch statement dispatching stuff back & forth to the UI, but it seems to me that the natural place to handle the events to & from the plugin is in the class that interfaces to the interprocess messaging code. Of course that class knows bugger all about the UI components, so I’m having trouble visualising how to get messages back and forth to the UI. A pointer from someone who has done this sort of thing would be much appreciated, because if I can’t sort this out soon I’m going to have to switch to something like fltk, or God forbid, wx!
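The decoupling I’m picturing is something like a listener registry sitting between the two: the messaging class only talks to the registry, and each UI component subscribes to the parameters it cares about (juce’s ChangeBroadcaster/ChangeListener pair looks like it could play this role, but I may be misreading it). This is just a sketch of the idea, every name in it is made up:

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Minimal observer-style hub: the interprocess-messaging class only
// sees this interface and never touches concrete UI components.
class ParameterHub {
public:
    using Callback = std::function<void(float)>;

    // A UI control registers interest in one parameter by name.
    void addListener(const std::string& param, Callback cb) {
        listeners[param].push_back(std::move(cb));
    }

    // Called by the messaging class when the dll reports a change;
    // dispatches the new value to every subscribed control.
    void parameterChanged(const std::string& param, float value) {
        for (auto& cb : listeners[param])
            cb(value);
    }

private:
    std::map<std::string, std::vector<Callback>> listeners;
};
```

Going the other way, a slider’s value-changed handler would call back through the same hub (or straight into the messaging class) with the parameter name and new value, so neither side ever needs a compile-time reference to the other.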