I want to evaluate whether I can use JUCE to achieve a certain goal. I would start on Windows.
I need to use an SDK / DLL from another maker. This component would be my source for audio (a stream). The user needs to select an input (a source, like a channel) and then the VST should provide a stream of audio to the DAW (or whichever application hosts the VST). I do not need an audio or MIDI input into the VST. I need the user interaction to select the source within the VST, and then I would acquire the audio (stream) via the SDK / DLL. I assume I receive the audio into a buffer, which I pass on via JUCE as output.
Is this possible with JUCE? Which example / tutorial would be a good starting point?
Sounds fairly possible but the exact use case is rare enough that I don’t think there’s a directly applicable example or tutorial around. Basically it would be just about calling your 3rd party library’s function to generate the audio and copying the results of that into the buffer you are given for your AudioProcessor subclass’s processBlock method. There are obviously lots of details involved which would depend on how exactly your 3rd party library works. Can you reveal which library it is you are going to use? Is it open source or is there at least a public header file/documentation available?
Using separate DLLs from a VST or similar plugin is somewhat tricky but usually possible in some manner. If static linking or directly compiling the 3rd party code into the plugin is possible, that would be preferable.
The 3rd party code should be thread safe because many host applications will be calling multiple instances of the plugin from multiple threads. (Usually when the plugins are on different tracks in the host.) When I was doing a test plugin with the Steam Audio API, I discovered their code is not thread safe. I was able to work around that with a global mutex, but that is obviously a horrible solution.
Another thing to consider: if the library you are going to dynamically link uses JUCE itself, you will get into hell's kitchen if there is even an iota of difference between the JUCE versions that the other DLL and your VST are linked against.
It would be much easier if that 3rd party simply implemented the functions necessary to be loaded as a VST itself, without an additional glue layer…
It is closed source, but a free component which receives (and sends) media streams. The SDK is available on Windows (and Mac/Linux). I have a bunch of *.h files, some *.lib files, a runtime exe and a number of DLLs.
I would need to talk via their API to the backend to find out the available channels (for the user to select from) and then receive the audio (stream).
Usually the DAW application itself would be doing that, so I am assuming this is some kind of a thing where the audio is coming from another computer/device via the network or something…? But if it is about trying to use the same audio interface on the computer as the host application, that could spell trouble. It is usually best to assume the host application has control over the audio and MIDI hardware and plugins shouldn’t try to interfere with that.
It's totally separate from the recording computer (well, that's the plan). It is just another audio source which shall be accessed via a VST that provides the sound for the DAW ("no strings attached"). The VST could be seen / treated like a software synthesizer producing noise / audio (no direct input via MIDI, and control only in the GUI to select the channel / source).