Hi all,
I’ve been exploring JUCE and audio dev for the past 3-4 months, with the goal of getting a good overall grasp of the different technologies and the complexity involved in developing audio plugins. A big pain point for me is still UI and frontend dev, though: compared to DSP and audio-related work, there seem to be many equally valid ways of doing things, especially now that webviews have joined JUCE (and newer tools like Cmajor rely on them entirely). I’ve found myself spending a lot of time checking out frontend frameworks and wondering, “would learning this one thing pay off better in the long run than that other thing?” (Note that I count the JUCE UI tools as a “frontend framework” too.)
I suppose that learning multiple technologies and practices in the field will pay off in the long run, since I’ll better understand how rendering works, what performs well and what doesn’t, and so on. But for now I’m considering going the complete opposite way and trying to code as much as possible without any UI.

I checked out Matt Tytel’s Vital code and noticed it has a “headless” build, which seemed crazy to me at first. Until now, my “headless” builds have been plugins that simply expose their parameters to the DAW, but some functionality doesn’t translate that way (like assigning modulators to parameters). So: do many of you build command-line tools for plugin testing, or do you instead build mockups with JUCE components that “do the job” and leave the UI layout and aesthetics for a final stage?
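To make the question concrete, here’s a rough sketch of what I imagine such a command-line harness would look like. This is a minimal sketch only: `MySynthProcessor` is a hypothetical stand-in for the plugin’s actual processor class, and the parameter values are arbitrary.

```cpp
#include <JuceHeader.h>
#include <iostream>
#include "MySynthProcessor.h" // hypothetical: the same processor class the plugin target uses

int main()
{
    // Some JUCE subsystems expect this to exist even when no window is ever shown.
    juce::ScopedJuceInitialiser_GUI juceInit;

    MySynthProcessor processor;
    constexpr double sampleRate = 44100.0;
    constexpr int blockSize = 512;
    processor.prepareToPlay (sampleRate, blockSize);

    // Set parameters programmatically instead of through an editor.
    for (auto* param : processor.getParameters())
        param->setValueNotifyingHost (0.5f); // arbitrary test value

    // Feed a note-on, then render a few blocks offline and inspect the output.
    juce::AudioBuffer<float> buffer (2, blockSize);
    juce::MidiBuffer midi;
    midi.addEvent (juce::MidiMessage::noteOn (1, 60, (juce::uint8) 100), 0);

    for (int block = 0; block < 10; ++block)
    {
        buffer.clear();
        processor.processBlock (buffer, midi);
        midi.clear(); // only send the note-on once

        std::cout << "block " << block << " peak: "
                  << buffer.getMagnitude (0, blockSize) << "\n";
    }

    processor.releaseResources();
    return 0;
}
```

Something like this would let me script checks (render a note, inspect peak levels, maybe write the buffer to a WAV) without ever opening an editor — but I’m curious whether that matches what people actually do in practice.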
