How to combine multiple project types? (OpenGL, Audio)

I want to create an audio player and music visualizer, with basic DAW elements (audio / MIDI import, transport controls, playhead) and with dense graphics (like the music visualizers in iTunes or Winamp).

  1. Which project type should I get started with in Projucer? Audio? OpenGL? Animation??

  2. And then how do I merge them? Do I just add “Modules” from one type to the other? (I couldn’t find any information on merging project types in the Tutorial)

You can’t “merge” project types, but they don’t matter that much anyway: they are just project templates with particular things already set up in the auto-generated code. (Console application, Audio plugin, static library and dynamic library are the special cases you don’t want to use for a standalone GUI application.)

Just create, for example, an Audio Application in Projucer and make sure all the modules you need are added. You can then manually add and use whatever classes you need in the code, as long as the appropriate JUCE modules have been enabled for the project.
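To make that concrete, here is a minimal sketch of adding OpenGL rendering to the `MainComponent` that the Projucer’s Audio Application template generates. It assumes the `juce_opengl` module is enabled in the project; the class layout is illustrative, but the JUCE calls (`OpenGLContext::attachTo`, `setContinuousRepainting`) are the standard way to have a component painted through OpenGL:

```cpp
#include <JuceHeader.h>

// Sketch only: an Audio Application MainComponent whose paint() is driven
// through OpenGL. Requires the juce_opengl module in the Projucer project.
class MainComponent : public juce::AudioAppComponent
{
public:
    MainComponent()
    {
        // Render this component (and its children) via OpenGL instead of
        // the software renderer.
        openGLContext.attachTo (*this);
        openGLContext.setContinuousRepainting (true); // repaint every frame, useful for a visualizer

        setSize (800, 600);
        setAudioChannels (0, 2); // no inputs, stereo output
    }

    ~MainComponent() override
    {
        shutdownAudio();
        openGLContext.detach();
    }

    void prepareToPlay (int, double) override {}
    void getNextAudioBlock (const juce::AudioSourceChannelInfo& bufferToFill) override
    {
        bufferToFill.clearActiveBufferRegion(); // silence for now; hook up a source later
    }
    void releaseResources() override {}

    void paint (juce::Graphics& g) override
    {
        g.fillAll (juce::Colours::black); // visualizer drawing goes here
    }

private:
    juce::OpenGLContext openGLContext;
};
```

Nothing else has to change: the audio callbacks and the OpenGL-backed painting coexist in the same component.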

Thanks for your help.
So, if I start with an Audio Application, which modules do I add to bring in OpenGL?

I created one of each to compare their modules. I see that the Projucer-generated Audio project already contains all of the modules that OpenGL contains (as displayed by the Modules section in Projucer). Does this mean I do not have to do anything to get my Audio application to also work with OpenGL graphics?

Thanks again

I am bumping this to see if there are any better examples out there. I’ve been trying for far too long to get a simple OpenGL application to just load and play an audio file. I can see doing this either way: create a JUCE OpenGL app and attach an audio-player component, or create an audio-player app and add an OpenGL component. I can do the first half of either, but I cannot find any examples of how to merge the two. It seems crazy that a platform as sophisticated as JUCE doesn’t make this obvious. Why can’t there be a single class to derive both functions from? I’ve scoured the tutorials and forums and only found basic steps like xenakios mentioned above, but no actual examples.

I’d really appreciate any help. Frankly, I’m just about to give up on JUCE. I love the idea of such a powerful, multi-platform system for audio, but I’m starting to think it isn’t worth it. I might just have to revert to DirectX to do simple load/parse/play audio. Please help!!!
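For what it’s worth, here is a hedged sketch of the “merge” you’re describing: one `AudioAppComponent` that plays a file through an `AudioTransportSource` while an attached `OpenGLContext` renders the same component. It assumes the `juce_opengl` and audio modules are enabled; the class name and `loadAndPlay` helper are made up for illustration, but the JUCE classes and calls are real:

```cpp
#include <JuceHeader.h>

// Sketch only: audio file playback plus OpenGL-backed rendering in one component.
class PlayerComponent : public juce::AudioAppComponent
{
public:
    PlayerComponent()
    {
        formatManager.registerBasicFormats(); // WAV, AIFF, etc.

        openGLContext.attachTo (*this);
        openGLContext.setContinuousRepainting (true);

        setSize (800, 600);
        setAudioChannels (0, 2);
    }

    ~PlayerComponent() override
    {
        shutdownAudio();
        openGLContext.detach();
    }

    // Illustrative helper: call this with a file the user has chosen.
    void loadAndPlay (const juce::File& file)
    {
        if (auto* reader = formatManager.createReaderFor (file))
        {
            readerSource.reset (new juce::AudioFormatReaderSource (reader, true));
            transport.setSource (readerSource.get(), 0, nullptr, reader->sampleRate);
            transport.start();
        }
    }

    void prepareToPlay (int samplesPerBlock, double sampleRate) override
    {
        transport.prepareToPlay (samplesPerBlock, sampleRate);
    }

    void getNextAudioBlock (const juce::AudioSourceChannelInfo& info) override
    {
        transport.getNextAudioBlock (info); // audio thread pulls from the transport
    }

    void releaseResources() override
    {
        transport.releaseResources();
    }

    void paint (juce::Graphics& g) override
    {
        g.fillAll (juce::Colours::black);
        // Visualizer drawing goes here. Don't read audio data directly from the
        // audio thread; push levels/samples through a FIFO and read them here.
    }

private:
    juce::AudioFormatManager formatManager;
    std::unique_ptr<juce::AudioFormatReaderSource> readerSource;
    juce::AudioTransportSource transport;
    juce::OpenGLContext openGLContext;
};
```

The point is that there is no special “combined” project type to look for: the audio pipeline (format manager → reader source → transport source → `getNextAudioBlock`) and the OpenGL-attached painting live side by side in the same component.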