AudioDeviceManager vs AudioAppComponent

If I’m creating a standalone app which one is recommended?
I have been using AudioDeviceManager with the audioDeviceIOCallback but is there a preferred way?

I ask the question because the AudioAppComponent scaffolds differently and has a different callback method.

AudioAppComponent is just a class that holds an AudioDeviceManager and an AudioSourcePlayer. It also inherits AudioSource, and the class uses itself as the AudioSource for the AudioSourcePlayer. IMHO you don’t really get any particular benefit from using AudioAppComponent; it looks like a class that is meant for tutorial projects. Proper projects will have to deal with the AudioDeviceManager anyway. (Which is exposed in the AudioAppComponent, though…)
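To make that concrete, here’s a rough sketch (not from the thread, just an illustration) of what AudioAppComponent is doing internally when you wire an AudioDeviceManager and AudioSourcePlayer together by hand. The class and member names are made up for the example:

```cpp
#include <juce_audio_utils/juce_audio_utils.h>

// Sketch only: roughly the wiring AudioAppComponent does for you.
class MyAudioComponent : public juce::Component,
                         public juce::AudioSource
{
public:
    MyAudioComponent()
    {
        deviceManager.initialise (2, 2, nullptr, true); // 2 ins, 2 outs, default device
        sourcePlayer.setSource (this);                  // we act as the AudioSource
        deviceManager.addAudioCallback (&sourcePlayer);
    }

    ~MyAudioComponent() override
    {
        deviceManager.removeAudioCallback (&sourcePlayer);
        sourcePlayer.setSource (nullptr);
    }

    void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override {}
    void releaseResources() override {}

    void getNextAudioBlock (const juce::AudioSourceChannelInfo& bufferToFill) override
    {
        bufferToFill.clearActiveBufferRegion(); // silence, for the sketch
    }

private:
    juce::AudioDeviceManager deviceManager;
    juce::AudioSourcePlayer sourcePlayer;
};
```

If you hold the AudioDeviceManager yourself like this, you keep direct access to it instead of going through AudioAppComponent’s wrapper.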


Cool that’s exactly what my gut was telling me.

Do you know if, when creating a plugin, you can override the standalone’s little options thing and use the AudioDeviceManager? It would be nice to be able to compile plugins and a true standalone app from the same code base.

Why would you want access to the AudioDeviceManager instance in the standalone plugin builds? The standalone plugin build already lets the user set the options of the audio device. (edit: Anyway, with some additional work it’s possible to access the AudioDeviceManager instance. You can actually replace the whole standalone-application-specific code.)

  1. I need multiple input/output options and would like to control this.
  2. The options UI and bar across the top aren’t pretty :wink:

You can enable the JUCE_USE_CUSTOM_PLUGIN_STANDALONE_APP define and implement the free function juce_CreateApplication() in your code to create your custom JUCEApplication subclass.

juce::JUCEApplicationBase* juce_CreateApplication()
{
	return new MyStandaloneFilterApp;
}

Of course, implementing the application class from scratch involves a bit of work… (You can’t, for example, leverage AudioAppComponent for that, because it wants the DSP to be a subclass of AudioSource, not AudioProcessor, which plugins use.)
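For orientation, a custom application class might start out something like this minimal sketch. It assumes JUCE_USE_CUSTOM_PLUGIN_STANDALONE_APP=1 is set; the class name and the comments about what goes where are illustrative, not the actual JUCE implementation:

```cpp
#include <juce_gui_basics/juce_gui_basics.h>

// Minimal sketch of a custom standalone application class.
class MyStandaloneFilterApp : public juce::JUCEApplication
{
public:
    const juce::String getApplicationName() override    { return "MyPlugin"; }
    const juce::String getApplicationVersion() override { return "1.0.0"; }

    void initialise (const juce::String& /*commandLine*/) override
    {
        // Create your own window here (hosting the processor's editor),
        // and set up your AudioDeviceManager / AudioProcessorPlayer as needed.
    }

    void shutdown() override
    {
        // Tear down windows and audio here.
    }
};

// Picked up by the plugin wrapper when the custom-app define is enabled:
juce::JUCEApplicationBase* juce_CreateApplication()
{
    return new MyStandaloneFilterApp;
}
```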

Ah no, that’s cool, I think that’s exactly what I was after.

So then I could use audioDeviceIOCallback to pass audio through to the AudioProcessor’s processBlock and have it all in the same code base.

I’m assuming then I could copy most of the JUCEApplication main.cpp setup and dump the files in shared code?

You can look in the JUCE sources to see how the default implementations for the standalone app and other needed pieces work:

JUCE\modules\juce_audio_plugin_client\Standalone

The application part isn’t that bad, but juce_StandaloneFilterWindow.h has quite a lot of code… I need to do some customization to that stuff myself, but haven’t gotten to it yet. (I want the standalone build of the plugin to be able to offline-render itself, and that is quite tricky to achieve without modifying how the standalone app wrapper works.)

Just a word of caution, @Xenakios hinted it but to make it crystal clear:

  • The AudioDeviceManager may refer to a different device than your host
  • The host might have the AudioIODevice exclusive
  • The processBlock calls and the calls from your AudioIODevice are separate and must not mix
  • There is no means to synchronise AudioIODevice and processBlock (just think of offline render etc.)

I used this setup once, when I had a library to audition separately and independent from the timeline in FinalCutProX.
But for generating the audio for the host, you can ONLY use processBlock.

Right, but that would only be for when running your plugin as a standalone application. You can’t expect to be able to fiddle with the audio hardware when running as a plugin inside a host. (It may happen to work in some circumstances, but generally you shouldn’t expect to even be able to do simple playback from your plugin via an AudioDeviceManager when running inside a host.) If you need generally configurable I/O for your plugin, it needs to be done in some other way; AudioDeviceManager wouldn’t really play into that. (Also, the AudioDeviceManager instance wouldn’t even be available by default in the VST etc. plugin builds; the code that has it is only compiled in for the standalone app version of the plugin.)

That wasn’t quite what I was getting at. There are two separate workflows I want to create: one when it’s standalone, and the other a smaller portion as a plugin in a DAW. I’m not trying to access underlying hardware from within a plugin.

I’m just trying to share code in one project and use metaprogramming to hit different targets at the same time.

Ok, just wanted to make sure, since several users were tempted to access the AudioIODevice from the plugin.

There exists a sibling for AudioSourcePlayer, which is the AudioProcessorPlayer.
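AudioProcessorPlayer is an AudioIODeviceCallback that calls your processor’s processBlock for you, so you don’t have to forward audio manually from audioDeviceIOCallback. A rough sketch of hooking it up in a standalone build (createPluginFilter() is the usual JUCE plugin entry point; the rest of the names are illustrative):

```cpp
#include <juce_audio_utils/juce_audio_utils.h>

// Sketch: driving an AudioProcessor from the audio device in a standalone build.
juce::AudioDeviceManager deviceManager;
std::unique_ptr<juce::AudioProcessor> processor (createPluginFilter());
juce::AudioProcessorPlayer player;

deviceManager.initialise (2, 2, nullptr, true);
player.setProcessor (processor.get());       // the player calls processBlock for us
deviceManager.addAudioCallback (&player);

// ...later, on shutdown:
deviceManager.removeAudioCallback (&player);
player.setProcessor (nullptr);
```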

An alternative to sharing the same project for a standalone app and a plugin is to put your code into a module that you use in both a Plugin project and an Application project.
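For reference, a JUCE user module is just a folder with a header carrying a module declaration block. A hedged sketch (the module ID, vendor, and dependency list here are made up for the example):

```cpp
/*  Sketch of a JUCE module header, e.g. my_shared_dsp/my_shared_dsp.h.

    BEGIN_JUCE_MODULE_DECLARATION
      ID:            my_shared_dsp
      vendor:        me
      version:       1.0.0
      name:          Shared DSP code
      dependencies:  juce_audio_processors
    END_JUCE_MODULE_DECLARATION
*/
#pragma once
#include <juce_audio_processors/juce_audio_processors.h>

// Declare your shared classes here; both the plugin project and the
// standalone application project then add this module in the Projucer.
```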