Help turning AudioProcessorGraph into a standalone plugin

My plugin has an AudioProcessorGraph, and I’m trying to turn it into a standalone plugin. I’m following the Cascading plug-in tutorial, and I feel like I’m almost there, but I’m hitting an assertion that I can’t seem to figure out.

In my JUCEApplication class, I have the following member variables:

    juce::StandalonePluginHolder pluginHolder;
    juce::AudioDeviceManager manager;

and in the constructor, I’m running the following:

    MyApp::MyApp() :
        pluginHolder(nullptr)
    {
        juce::PluginHostType::jucePlugInClientCurrentWrapperType = juce::AudioProcessor::wrapperType_Standalone;

        const auto inputDevice = juce::MidiInput::getDefaultDevice();
        const auto outputDevice = juce::MidiOutput::getDefaultDevice();

        manager.initialiseWithDefaultDevices(2, 2);
        manager.addAudioCallback(&pluginHolder.player);
        manager.setMidiInputDeviceEnabled(inputDevice.identifier, true);
        manager.addMidiInputDeviceCallback(inputDevice.identifier, &pluginHolder.player);
        manager.setDefaultMidiOutputDevice(outputDevice.identifier);
        pluginHolder.player.setProcessor(&getAudioGraph().audioProcessorGraph);
    }

This is almost identical to the code in the tutorial. The only difference is that I’m using the AudioProcessorPlayer that belongs to the StandalonePluginHolder, rather than my own.

I’m hitting an assertion at line 245 of AudioProcessorPlayer:

    void AudioProcessorPlayer::audioDeviceIOCallback (
    ...
        // These should have been prepared by audioDeviceAboutToStart()...
        jassert (sampleRate > 0 && blockSize > 0);

This seems like it should be straightforward enough, but I can’t seem to track it down. I’ve checked the prepareToPlay() functions and they are getting called with the correct, non-zero sample rate and block size.

Can anyone help?

Have you checked the AudioPluginHost? That’s a standalone app that contains an AudioProcessorGraph.

If you’ve already wrapped the AudioProcessorGraph into your own AudioProcessor, then the normal Standalone wrapper should work directly.
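
The Standalone wrapper picks up your processor through the usual factory function, so roughly all you need is something like this (MyGraphProcessor is a placeholder for whatever class wraps or derives from your graph):

    // The Standalone wrapper, like every other plugin wrapper, obtains the
    // processor through this factory function. MyGraphProcessor is a
    // placeholder for whatever class wraps (or derives from) your graph.
    juce::AudioProcessor* JUCE_CALLTYPE createPluginFilter()
    {
        return new MyGraphProcessor();
    }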

Thanks for the reply. I’m looking through the AudioPluginHost code, but there’s a lot in there that I haven’t fully understood yet. If I can just follow up on your suggestion, though, I should be able to get things straight.

How exactly do I wrap an AudioProcessorGraph in my own AudioProcessor? That’s what I was trying to do originally, but I couldn’t get it to work. I had a main AudioProcessor with an AudioProcessorGraph member, and I called the graph’s processBlock() from my own. It looked something like this:

    void MainAudioProcessor::processBlock(juce::AudioBuffer<float>& audio, juce::MidiBuffer& midi)
    {
        // ...
        audioProcessorGraph.processBlock(audio, midi);
    }

But I wasn’t getting any sound out of this, so I looked in the docs and it said I needed an AudioProcessorPlayer to play the AudioProcessorGraph. Do I need the Player or not? If so, how do I set it up, and if not, how do I play the graph?

You have a couple of options:

  • The AudioProcessorGraph is already an AudioProcessor. Perhaps you can derive from it, and override any functions that need to behave differently, e.g. to create a custom editor.
  • If you want the Graph as a member of another processor, you’ll need to pass through most of the function calls to the graph. The most important calls are processBlock and prepareToPlay, but you’ll also need to make sure that things like querying the number of audio buses, setting the graph state, setting single- or double-precision processing and so on are passed through.

You could take a look at InternalPlugin in InternalPlugins.cpp to see one technique for wrapping an AudioProcessor (the example uses AudioPluginInstance, but that’s just a specialised AudioProcessor).
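
Roughly, the second option ends up looking something like the sketch below. All of the names are made up for the example, only the important forwarding is shown, and the remaining pure-virtual overrides are stubbed out:

    #include <juce_audio_processors/juce_audio_processors.h>

    // Sketch of an AudioProcessor that owns an AudioProcessorGraph and
    // forwards the important calls to it.
    class GraphHostProcessor : public juce::AudioProcessor
    {
    public:
        GraphHostProcessor()
            : juce::AudioProcessor (BusesProperties()
                  .withInput  ("Input",  juce::AudioChannelSet::stereo(), true)
                  .withOutput ("Output", juce::AudioChannelSet::stereo(), true))
        {}

        void prepareToPlay (double sampleRate, int blockSize) override
        {
            // Keep the graph's channel configuration in sync with the
            // wrapper's before preparing it.
            graph.setPlayConfigDetails (getTotalNumInputChannels(),
                                        getTotalNumOutputChannels(),
                                        sampleRate, blockSize);
            graph.prepareToPlay (sampleRate, blockSize);
        }

        void releaseResources() override { graph.releaseResources(); }

        void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midi) override
        {
            graph.processBlock (buffer, midi);
        }

        // Minimal stubs so the class is concrete; flesh these out as needed.
        const juce::String getName() const override                 { return "GraphHost"; }
        bool acceptsMidi() const override                           { return true; }
        bool producesMidi() const override                          { return false; }
        double getTailLengthSeconds() const override                { return 0.0; }
        int getNumPrograms() override                               { return 1; }
        int getCurrentProgram() override                            { return 0; }
        void setCurrentProgram (int) override                       {}
        const juce::String getProgramName (int) override            { return {}; }
        void changeProgramName (int, const juce::String&) override  {}
        bool hasEditor() const override                             { return false; }
        juce::AudioProcessorEditor* createEditor() override         { return nullptr; }
        void getStateInformation (juce::MemoryBlock&) override      {}  // serialise the graph here
        void setStateInformation (const void*, int) override        {}  // ...and restore it here

    private:
        juce::AudioProcessorGraph graph;
    };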

Thanks @reuk, I’d missed some of those overrides. Got them forwarding correctly now. I’m still not getting any sound, but I think it’s something really basic. Inside processBlock(), the AudioBuffer has numChannels = 0. Is there a setting I can change that will fix this?

Edit: in the constructor, I’m already doing BusesProperties().withInput(stereo).withOutput(stereo).

It’s difficult to say. Make sure that you’re passing a sensible default layout to the AudioProcessor base class constructor, and that you’ve implemented isBusesLayoutSupported. You also need to make sure that the wrapped Graph contains AudioGraphIOProcessor nodes for audio input and output. Finally, make sure that the Graph has the expected buses layout, which you can do with setPlayConfigDetails or setBusesLayout.
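
For the isBusesLayoutSupported part, something along these lines is usually enough for a plain stereo processor (a minimal sketch; MyWrapperProcessor is a placeholder name):

    // Accept only a stereo-in/stereo-out layout. MyWrapperProcessor is a
    // placeholder for your own wrapper class.
    bool MyWrapperProcessor::isBusesLayoutSupported (const BusesLayout& layouts) const
    {
        return layouts.getMainInputChannelSet()  == juce::AudioChannelSet::stereo()
            && layouts.getMainOutputChannelSet() == juce::AudioChannelSet::stereo();
    }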

Thanks reuk, it’s AudioGraphIOProcessor that I was missing! Seems really obvious now that I think about it.

Can I suggest that you consider adding a small note to the “detailed description” section of AudioProcessorGraph mentioning AudioGraphIOProcessor? It might save some other novice like me a lot of time. Maybe something like this:

Processors can be added to the graph as “nodes” using addNode(), and once added, you can connect any of their input or output channels to other nodes using addConnection(). AudioGraphIOProcessor nodes are used to control input and output channels for the whole graph.
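
In code, the wiring described above ends up looking roughly like this (a simplified sketch with made-up names, not my actual project code; it assumes the JUCE headers are already included):

    // Wire the graph's own input, a wrapped processor, and the graph's own
    // output together. The AudioGraphIOProcessor nodes are what expose the
    // graph's input and output channels to whatever is hosting the graph.
    static void connectThroughGraph (juce::AudioProcessorGraph& graph,
                                     std::unique_ptr<juce::AudioProcessor> inner)
    {
        using IOProcessor = juce::AudioProcessorGraph::AudioGraphIOProcessor;

        auto audioIn  = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::audioInputNode));
        auto audioOut = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::audioOutputNode));
        auto effect   = graph.addNode (std::move (inner));

        // Route both stereo channels: graph input -> effect -> graph output.
        for (int channel = 0; channel < 2; ++channel)
        {
            graph.addConnection ({ { audioIn->nodeID, channel }, { effect->nodeID,   channel } });
            graph.addConnection ({ { effect->nodeID,  channel }, { audioOut->nodeID, channel } });
        }
    }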

@reuk I’ve been trying for days and I still haven’t got this to work! Here’s my current situation:

I have a class AudioProcessorWrapper which forwards all of the virtual functions from AudioProcessor (including isBusesLayoutSupported()). It’s basically just a copy of InternalPlugin, like you suggested. A new plugin gets wrapped in this class and placed in the graph. In the processBlock() function, if I fill the buffer with static, then I can hear it on my speakers, so I know that I’ve got all the graph connections and i/o stuff right. But if I forward the processBlock() call to the VST3 plugin instance as intended, I get silence. (If I load the same plugin into AudioPluginHost, I can get it to play, so I know that the problem isn’t with the plugin).

This all suggests that there is something wrong with the buses layout, but I’ve based all of the initialization directly on InternalPlugin, and I can’t see what I’m doing wrong. Here is the constructor for my wrapper class:

    AudioProcessorWrapper::AudioProcessorWrapper(std::unique_ptr<juce::AudioPluginInstance> plugin) :
        juce::AudioProcessor(BusesProperties()
            .withInput("Input", juce::AudioChannelSet::stereo())
            .withOutput("Output", juce::AudioChannelSet::stereo())),  // not sure whether I need this line, but I've tried it with and without and still don't get any sound
        pluginInstance(std::move(plugin))
    {
        // Add or remove buses on the wrapper until it has the same number as the inner plugin.
        const auto matchChannels = [this](const bool isInput)
        {
            const auto inBuses = pluginInstance->getBusCount(isInput);

            while (getBusCount(isInput) < inBuses)
                addBus(isInput);

            while (inBuses < getBusCount(isInput))
                removeBus(isInput);
        };

        if (pluginInstance)
        {
            for (auto isInput : { true, false })
                matchChannels(isInput);

            setBusesLayout(pluginInstance->getBusesLayout());
        }

        enableAllBuses();
    }

The last thing you mentioned was to ensure that the Graph itself has the right buses layout. I wasn’t sure exactly how to do this, but I did cobble something together just to test it (something like parentGraph->setBusesLayout(getBusesLayout()) in the constructor). Still no sound. But I must be getting something wrong with the buses somewhere, or else I’m not initializing the pluginInstance correctly. Can you think what it could be?

I’m still stuck on this, and it’s one of the most frustrating brick walls I’ve ever hit. I’ve double-checked every line of my code and re-read all of the relevant tutorials, but I still can’t get any sound out of my plugin. Here’s what I know:

  • I’ve built my own sine-synth plugin and loaded it into my plugin wrapper. It works, and I get sound.
  • I’ve loaded some VST3s into the AudioPluginHost and they work.
  • When I load the same VST3s into my plugin wrapper and connect it to the graph, I don’t get any sound.
  • I’ve stepped through every line of AudioProcessorGraph code, comparing the working AudioPluginHost to my own code, and I can’t make out the difference. Midi messages go in, silence comes out.
  • I don’t understand buses that well so the problem still could be with that, but I’ve tried every option I can think of, based on the tutorials and sample code.

I’m wondering if it’s a problem with the way my main AudioProcessor is set up, with the compile settings, or with something else that is inhibiting the sound. I’ve tried it with producesMidi() set both ways, and I’ve compiled with IS_SYNTH to see if it would make any difference.

Can anyone think of anything I might be missing? Or suggest strategies for tracking down the problem?

Have you definitely called enableAllBuses() on the inner plugin?

Yes, I’ve definitely called this function on the inner plugin, right after adding the new channels (as in the code above), and I’ve stepped through to make sure that it’s doing what it’s supposed to.
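
Roughly like this, simplified from the constructor I posted above:

    // Simplified from the wrapper constructor above: after matching the bus
    // counts, enable every bus on the inner plugin as well as on the wrapper.
    if (pluginInstance != nullptr)
    {
        for (auto isInput : { true, false })
            matchChannels(isInput);

        pluginInstance->enableAllBuses();   // the call on the inner plugin
        setBusesLayout(pluginInstance->getBusesLayout());
    }

    enableAllBuses();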

I don’t know if this suggestion is pertinent, but why “turn” the existing project into a standalone rather than creating a new plugin/standalone project with Projucer and adding your code to it? It will take a while to adapt, but everything will be cleaner and easier to maintain in the future.

The generated code will give you a Processor class with a processBlock() method, and an Editor class in which to add your components, all ready to run both as a plugin and as a standalone app.
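
Roughly, the generated skeleton looks like this (heavily trimmed; the real class names depend on your project name):

    // Outline of what the Projucer's "Audio Plug-In" template generates.
    class GeneratedAudioProcessor : public juce::AudioProcessor
    {
    public:
        void prepareToPlay (double sampleRate, int samplesPerBlock) override;
        void releaseResources() override;
        void processBlock (juce::AudioBuffer<float>&, juce::MidiBuffer&) override;  // your DSP goes here
        juce::AudioProcessorEditor* createEditor() override;                        // returns the editor below
        // ...plus the rest of the standard AudioProcessor boilerplate.
    };

    class GeneratedAudioProcessorEditor : public juce::AudioProcessorEditor
    {
    public:
        explicit GeneratedAudioProcessorEditor (GeneratedAudioProcessor&);
        void paint (juce::Graphics&) override;   // draw / add your components here
        void resized() override;                 // lay them out here
    };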

Thanks @Marcusonic, yes, I’m certainly thinking about this. But it’s a big project and it would probably take weeks to migrate. I feel like there are probably just one or two small things I’m missing that are preventing it from working, so if I push forward I’ll find the answer more quickly. Perhaps that’s the sunk cost fallacy, though.