I’m absolutely new to this forum and community, and it’s been many, many years since I last opened Visual Studio and Xcode. But the prospect of learning Tracktion Engine and JUCE to develop my own custom, ideal DAW has motivated me to get back into it.
I know I need to learn to crawl before running, let alone doing advanced gymnastics, but my fundamental goal is to take the feature set in Tracktion Engine (which alone is amazing) and create a customized version of embedded subprojects like Reaper has (or something like MOTU’s Digital Performer offers).
I know that in Reaper, a subproject is linked into the main project through a specialized audio render (the file format is .RPP-PROXY), which sits on the timeline as a media clip no different from an audio clip, except that you can double-click it to open and edit the linked project.
There’s much more happening than this, such as syncing the master project to the subproject while you have it open, but I’ll cross that bridge when needed.
So, what I need to know is: what would this design pattern be called, and which JUCE objects/libraries would it entail? I’m sure it’s largely a matter of taking a rendered-audio object of some sort, but with properties that point to a separate project file, plus the ability to double-click the rendered audio to open it. But I need to understand whether there’s a specific way to extend a JUCE audio-rendering object in this fashion, or whether there’s another approach I should look into.
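To make the question concrete, here is a minimal sketch of the shape I imagine this taking. This is plain C++ with no actual JUCE or Tracktion Engine calls — the class names (`AudioClip`, `SubprojectClip`), the `openEditor` callback, and the file paths are all my own hypothetical stand-ins, not real engine API. The idea is essentially a proxy: a clip that plays back like any rendered audio clip but also carries a link to the child project it was rendered from, with a double-click hook that opens it.

```cpp
#include <cassert>
#include <functional>
#include <string>

// Hypothetical sketch (not real JUCE/Tracktion API): an ordinary audio
// clip on the timeline, identified by the rendered audio it plays back.
struct AudioClip
{
    std::string renderedAudioFile;  // e.g. the .RPP-PROXY-style render
    double startBeat  = 0.0;
    double lengthBeats = 0.0;

    virtual ~AudioClip() = default;
    virtual bool isSubproject() const { return false; }
    virtual void handleDoubleClick() {}  // plain clips do nothing special
};

// A subproject clip is the same thing plus a link back to the child
// project, and a double-click action that opens that project for editing.
struct SubprojectClip : AudioClip
{
    std::string subprojectFile;  // path to the linked child project

    // Injected by the host application, so the clip itself stays
    // decoupled from whatever editor/window system opens the project.
    std::function<void(const std::string&)> openEditor;

    bool isSubproject() const override { return true; }

    void handleDoubleClick() override
    {
        if (openEditor)
            openEditor(subprojectFile);  // open the child project
    }
};
```

The playback path only ever sees `renderedAudioFile`, so the subproject clip behaves exactly like a normal clip until the user double-clicks it — which I gather is roughly how Reaper’s proxy files work. Whether Tracktion Engine’s own clip classes are meant to be extended this way is exactly what I’m asking.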
