Advice on Virtual Reality Using JUCE and VST SDK 2.4


So I was wondering if anyone had any advice on how to design an overall application that takes in head-tracking data from VR glasses, sends the head-tracking parameters to a VST plugin to change the 3D orientation of a sound source, and also updates the visual display of the glasses using the same head-tracking data.

I guess my real question is how to set up the entire system. I need some sort of GUI for my application, nothing too complicated, just a couple of buttons and menu options to choose the correct visual/auditory test. Should I use the audio plugin project template in the Introjucer, or the GUI application template?


I had a friend who did something similar with a Wii remote using Max/MSP. Instead of writing the auralisation algorithm himself, however, he interfaced Max/MSP with the FMOD API to characterise the sound in a virtual 3D environment. The position in 3D space was visible in a Max/MSP OpenGL-style window.

I’m not saying this is the easiest way, but if you could find a way of piping data from Max/MSP to the VST plugin, I reckon it could be done!

If you’re designing an app then you don’t want to build an audio plugin …

I know that; I have a VST audio plugin that I want to use to create certain effects driven by my head-tracking data parameters.

I was told that JUCE includes classes that handle audio streaming/buffering nicely. I was wondering if there is an SDK for JUCE.

More specifically, which libraries and headers are needed to support audio plugin hosting? Also, is there a graphical way to build the GUIs? Reading through the JUCE documentation makes it seem that there is a drag-and-drop approach to building GUIs.
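For the hosting side, the relevant JUCE module is juce_audio_processors, built with the JUCE_PLUGINHOST_VST flag enabled and the VST SDK on the include path. A rough sketch of loading a plugin from disk might look like this; exact signatures differ between JUCE versions (older, VST 2.4-era JUCE returned raw pointers, newer versions return a std::unique_ptr), so treat it as an outline rather than copy-paste code. This won't compile standalone since it requires the JUCE framework:

```cpp
// Sketch only: requires the JUCE framework (juce_audio_processors
// module with JUCE_PLUGINHOST_VST=1 and the VST SDK available).
#include <juce_audio_processors/juce_audio_processors.h>

juce::AudioPluginInstance* loadVst (const juce::String& pluginPath)
{
    juce::AudioPluginFormatManager formatManager;
    formatManager.addDefaultFormats(); // registers the VST format, among others

    // Scan the file and collect descriptions of the plugins inside it.
    juce::OwnedArray<juce::PluginDescription> descriptions;
    juce::VSTPluginFormat vstFormat;
    vstFormat.findAllTypesForFile (descriptions, pluginPath);

    if (descriptions.isEmpty())
        return nullptr; // not a recognisable VST

    juce::String error;
    // 44.1 kHz / 512-sample block are placeholder values.
    return formatManager.createPluginInstance (*descriptions[0],
                                               44100.0, 512, error).release();
}
```

Once an instance is loaded, the head-tracking values can be pushed through the plugin's parameter interface each block. As for the GUI question: yes, the Introjucer includes a drag-and-drop GUI editor that generates Component subclasses for you.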