I’m currently building my own DAW. I’ve already built the UI I need, but now it’s time for the real deal: audio playback.
Essentially I have a project, which contains tracks, which contain clips, which contain audio samples.
I want the user to be able to play such projects to the output device, but I’m quite lost on how to do it.
I’ve read about the AudioProcessor, AudioProcessorGraph, and many more, but I don’t really see how they fit together.
In the future, I want the users to be able to create busses and so on, but that’s a whole other question.
If anyone has any suggestions/ideas/solutions, it would be super appreciated!
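For what it’s worth, here’s one way to picture the playback side of that hierarchy. This is a minimal sketch in plain C++ with no JUCE types, and all names (`Clip`, `Track`, `Project`, `renderBlock`) are hypothetical, not anything from JUCE: each clip owns samples plus a start position on the timeline, each track sums its clips, and the project sums its tracks into the output block.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical data model: a clip owns samples and a timeline start offset.
struct Clip {
    std::size_t startSample;      // position on the project timeline
    std::vector<float> samples;   // mono audio data, for simplicity
};

struct Track {
    std::vector<Clip> clips;
    float gain = 1.0f;
};

struct Project {
    std::vector<Track> tracks;
};

// Render one block of audio starting at `playhead`. In a real DAW this body
// would run inside the audio callback, so it must not allocate or lock;
// here it only reads the pre-built structures.
void renderBlock(const Project& project, std::size_t playhead,
                 float* out, std::size_t numSamples)
{
    for (std::size_t i = 0; i < numSamples; ++i)
        out[i] = 0.0f;

    for (const Track& track : project.tracks)
        for (const Clip& clip : track.clips)
            for (std::size_t i = 0; i < numSamples; ++i) {
                std::size_t t = playhead + i;   // timeline position of this frame
                if (t >= clip.startSample
                    && t < clip.startSample + clip.samples.size())
                    out[i] += track.gain * clip.samples[t - clip.startSample];
            }
}
```

In JUCE terms, something like `renderBlock` is roughly what would live inside your `getNextAudioBlock`/`processBlock` callback, with `playhead` advanced by `numSamples` after each call — but that mapping is my assumption, not a prescribed JUCE pattern.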
I would also suggest going with TracktionEngine. TracktionEngine is specifically written to be the foundation for a DAW. And, importantly, literally thousands of issues have been addressed in TracktionEngine that you would otherwise have to spend a lot of time dealing with.
For example, I started writing my DAW in JUCE. In several months I had just rudimentary record/playback functionality, and no MIDI. I switched to TracktionEngine, and in a year’s time I have a fully functional DAW.
The learning curve is a bit steep for TracktionEngine, but worth it in the end.
Thanks for both replies!
I forgot to mention that I’m interested in building the DAW from scratch, so Tracktion is unfortunately not an option…
As for @baramgb’s blog reference, it was really helpful. I dug even deeper into the processor classes, and it seems like all of their bus functionality is what I need. The documentation for it is a bit cryptic for me, so if anyone can explain it, it’ll be awesome!
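In case it helps anyone reading later, here is my current mental model of the bus concept, as a plain C++ sketch. All names here are hypothetical and this is deliberately not JUCE’s actual API — just the idea behind it: a bus is an intermediate mix target, so tracks render into whichever bus they are routed to, and the buses then sum into the master output.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical routing model: each track is routed to one bus by index,
// and every bus feeds the master mix.
struct BusTrack {
    std::vector<float> rendered;   // this track's block, already rendered
    std::size_t busIndex;          // which bus the track is routed to
};

struct Bus {
    float gain = 1.0f;
};

// Mix all tracks through their buses into `master` (numSamples frames).
void mixThroughBuses(const std::vector<BusTrack>& tracks,
                     const std::vector<Bus>& buses,
                     float* master, std::size_t numSamples)
{
    // One scratch buffer per bus; a real engine would preallocate these
    // instead of allocating on the audio thread.
    std::vector<std::vector<float>> busBuffers(
        buses.size(), std::vector<float>(numSamples, 0.0f));

    // Stage 1: tracks accumulate into their assigned bus.
    for (const BusTrack& t : tracks)
        for (std::size_t i = 0; i < numSamples; ++i)
            busBuffers[t.busIndex][i] += t.rendered[i];

    // Stage 2: buses sum into the master output, applying bus gain.
    for (std::size_t i = 0; i < numSamples; ++i)
        master[i] = 0.0f;
    for (std::size_t b = 0; b < buses.size(); ++b)
        for (std::size_t i = 0; i < numSamples; ++i)
            master[i] += buses[b].gain * busBuffers[b][i];
}
```

If this reading of the bus docs is off, corrections are very welcome.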
This project is for education, not profit. I’m taking part in a 3-year program in which, during the third year, you build a whole project from scratch. But thanks for the warning.