Hello everyone! I’m approaching the GUI coding stage of my latest plugin project, and I have some concerns about the general architecture and class layout.
I would really like to be able to someday take just the GUI of my plugin and make a standalone app out of it, not connected to an AudioProcessor – essentially so that I can create a “remote control” app for other instances of my plugin.
I’ve created a Component-derived class, `ImogenGui`, which is the top-level component representing the entire plugin GUI and containing all of its child components; my Editor simply contains one of these objects.
The design question this brings up is: what’s the best way to manage the connection between the Processor and the Editor, such that I can take the GUI and essentially “wrap” it in some other code that transmits parameter changes as OSC messages, instead of relaying them directly to an AudioProcessor object?
My first instinct was to create an abstract interface class, `ImogenGuiHandle`, so that the `ImogenGui` has a consistent API for getting and sending updates. In the plugin build, the Editor would inherit from `ImogenGuiHandle`; for the standalone remote app version, I would create an implementation of `ImogenGuiHandle` that transmits all parameter changes to another instance of Imogen via OSC or some other networking protocol.
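To make the idea concrete, here is a minimal sketch of that interface pattern. All names and signatures here are assumptions for illustration, not the real Imogen code, and the JUCE/OSC parts are stubbed out with plain C++ so the example stands alone (a real remote build would use something like `juce::OSCSender` where the message string is built below):

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Abstract interface the GUI talks to; it never sees an AudioProcessor.
struct ImogenGuiHandle
{
    virtual ~ImogenGuiHandle() = default;

    // Called by the GUI whenever the user changes a parameter.
    virtual void sendParameterChange (int paramID, float newValue) = 0;

    // Called by the GUI to pull the current value, e.g. when it reopens.
    virtual float getParameterValue (int paramID) const = 0;
};

// Plugin build: the Editor (or a helper it owns) implements the interface
// and forwards straight to the processor's parameter state.
struct PluginHandle : public ImogenGuiHandle
{
    std::vector<float> state { 0.0f, 0.0f, 0.0f };  // stands in for the APVTS

    void sendParameterChange (int paramID, float newValue) override
    {
        // Real code would call something like
        // apvts.getParameter (id)->setValueNotifyingHost (newValue);
        state[static_cast<std::size_t> (paramID)] = newValue;
    }

    float getParameterValue (int paramID) const override
    {
        return state[static_cast<std::size_t> (paramID)];
    }
};

// Remote build: the same interface, but every change is serialized and
// sent over the network. Here we just record the would-be OSC messages.
struct OscHandle : public ImogenGuiHandle
{
    std::vector<std::string> sentMessages;
    std::vector<float> cachedState { 0.0f, 0.0f, 0.0f };

    void sendParameterChange (int paramID, float newValue) override
    {
        cachedState[static_cast<std::size_t> (paramID)] = newValue;
        sentMessages.push_back ("/imogen/param/" + std::to_string (paramID)
                                + " " + std::to_string (newValue));
    }

    float getParameterValue (int paramID) const override
    {
        return cachedState[static_cast<std::size_t> (paramID)];
    }
};
```

The GUI only ever holds an `ImogenGuiHandle&`, so the same `ImogenGui` code compiles unchanged into both builds.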
My concern with this is that it seems to preclude me from utilizing any of the `AudioProcessorValueTreeState` listeners/attachments… I would most likely need to implement individual getter/setter functions for each parameter.
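For what that manual wiring might look like without attachments, here is one possible sketch. Again, every name here is hypothetical, and the slider is a plain-C++ stand-in (mimicking `juce::Slider::onValueChange`) so the example runs without JUCE: each control forwards its changes through a single callback supplied at construction, and the outer layer (Editor or OSC wrapper) decides what that callback does.

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

// Stand-in for a juce::Slider: stores a value and fires a lambda on change.
struct MockSlider
{
    std::function<void (float)> onValueChange;
    float value = 0.0f;

    void setValue (float v)   // a user drag would end up calling this
    {
        value = v;
        if (onValueChange)
            onValueChange (v);
    }
};

// Instead of a SliderAttachment per control, the GUI takes one callback
// and routes every control's changes through it, tagged by parameter ID.
struct MiniGui
{
    MockSlider gainSlider;

    explicit MiniGui (std::function<void (const std::string&, float)> sendChange)
    {
        gainSlider.onValueChange = [sendChange] (float v)
        {
            sendChange ("gain", v);   // the "gain" ID is an assumption
        };
    }
};
```

The cost is exactly the concern raised above: one hand-written hookup per parameter instead of an attachment, though the hookups are mechanical enough that a loop over a parameter-ID list can generate most of them.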
Has anyone done anything like this before? Does anyone have any suggestions, or words of warning?