Pro Tools - my plugin’s (mono) version always shows maximum level on the GUI volume meter, but the (mono/stereo) version is correct. Why?

edit: the (mono/stereo) plugin now shows up in Pro Tools, but the volume meter shows the maximum level on the GUI. (see the last post in this thread)


I’m testing my plugin in a few DAWs. In Pro Tools it shows up, but only as (mono); there’s no additional entry that says (mono/stereo) like there is for many other plugins.

I saw this code from @HowardAntares in a post a few years ago. Is it still the best approach for solving this Pro Tools issue?
I’ve also read about people changing the “Plugin Channel Configurations” field in the Projucer instead. Or should both be done these days?

This just informs the DAW that my plugin can handle mono and stereo, right? It doesn’t actually change how my plugin’s processBlock() processes audio, does it?

bool isBusesLayoutSupported (const BusesLayout& layouts) const override
{
    if (layouts.getMainOutputChannelSet() == AudioChannelSet::mono())
    {
        // Mono-to-mono
        if (layouts.getMainInputChannelSet() == AudioChannelSet::mono())
            return true;
    }
    else if (layouts.getMainOutputChannelSet() == AudioChannelSet::stereo())
    {
        // Mono-to-stereo OR stereo-to-stereo
        if ((layouts.getMainInputChannelSet() == AudioChannelSet::mono()) ||
            (layouts.getMainInputChannelSet() == AudioChannelSet::stereo()))
            return true;
    }

    return false;
}

{1,1},{1,2},{2,2} in the Projucer’s Plugin Channel Configurations field?

If you’re using that function, then you should leave the Projucer field blank, not do both.

And yes, that code just tells the host whether or not it will accept a given configuration. You’ll need to query the actual layout to know what the host has set up for your plugin. We use prepareToPlay() as the place to query that, since that’s the best spot to (re)allocate resources and define internal routing.


Something like this in prepareToPlay()?
My plugin simply applies a distortion to the audio signal channel by channel; there’s no different handling between L and R.
I don’t suppose I really need to make any changes in my processBlock(), …?

prepareToPlay()

    auto currentLayout = getBusesLayout();
    auto mainInputLayout = currentLayout.getMainInputChannelSet();
    auto mainOutputLayout = currentLayout.getMainOutputChannelSet();

Not sure how you later use those variables, but in our prepareToPlay() we use those calls to set internal hasStereoIn and hasStereoOut flags, which processBlock then uses to decide whether to read from and write to a second channel. We don’t want to touch the second channel if the layout isn’t stereo. (A sketch of that pattern is below.)
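
For illustration, here’s a minimal sketch of that pattern (the class name MyDistortionProcessor, the flags hasStereoIn / hasStereoOut and the processChannel() helper are hypothetical names for this example, not anything from the posts above):

    // Hypothetical members: bool hasStereoIn = false, hasStereoOut = false;

    void MyDistortionProcessor::prepareToPlay (double /*sampleRate*/, int /*samplesPerBlock*/)
    {
        const auto layout = getBusesLayout();
        hasStereoIn  = (layout.getMainInputChannelSet()  == juce::AudioChannelSet::stereo());
        hasStereoOut = (layout.getMainOutputChannelSet() == juce::AudioChannelSet::stereo());
    }

    void MyDistortionProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
    {
        processChannel (buffer.getWritePointer (0), buffer.getNumSamples());     // channel 0 always exists

        if (hasStereoOut && buffer.getNumChannels() > 1)                         // only touch channel 1 if the layout has it
            processChannel (buffer.getWritePointer (1), buffer.getNumSamples());
    }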

Also, note that in AAX you need to assign an ID for each layout you support, via getAAXPluginIDForMainBusConfig(), and if you have a page table XML file, it will need matching entries for each layout you support as well.
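
For example, a hedged sketch of what a per-layout override could look like with the JUCE 7 API discussed here (the four-character codes below are placeholders, not real IDs):

    juce::int32 MyDistortionProcessor::getAAXPluginIDForMainBusConfig (const juce::AudioChannelSet& mainInput,
                                                                       const juce::AudioChannelSet& mainOutput,
                                                                       bool idForAudioSuite) const
    {
        if (mainInput == juce::AudioChannelSet::mono()   && mainOutput == juce::AudioChannelSet::mono())
            return 'dsm1';   // placeholder ID for mono -> mono
        if (mainInput == juce::AudioChannelSet::mono()   && mainOutput == juce::AudioChannelSet::stereo())
            return 'dsms';   // placeholder ID for mono -> stereo
        if (mainInput == juce::AudioChannelSet::stereo() && mainOutput == juce::AudioChannelSet::stereo())
            return 'dss2';   // placeholder ID for stereo -> stereo

        // Fall back to JUCE's default ID for anything else
        return AudioProcessor::getAAXPluginIDForMainBusConfig (mainInput, mainOutput, idForAudioSuite);
    }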

The changes made to isBusesLayoutSupported() mean that my plugin does now show up as (mono) and (mono/stereo). ✅
re: getAAXPluginIDForMainBusConfig()
I just saw this thread… is something else used in its place now? AAX plugins are silently broken

That must be in a later version than the one we’re using (7.0.5); it’s not deprecated in this version. Something to keep an eye on next time we update our JUCE submodule, I guess.

I inserted the (mono/stereo) version of my plugin in Pro Tools, and it looks OK when invoked. However, when I insert the (mono) version, the volume needle on the interface jumps to maximum by default! No audio is playing; there is no audio on the track.

I then inserted test audio, and it’s processed the same in the (mono) and (mono/stereo) versions; it’s just the volume meter on the GUI that misbehaves in the (mono) version.

Any idea where I could start exploring to remedy this?

Hi!
There’s no way around actively debugging it. Here are some suggestions:

  • Build a version that clears the entire buffer in processBlock instead of running your processing code. Observe whether the issue still occurs. If it doesn’t, you’re writing garbage into the buffer somewhere.
  • Use the developer version of Pro Tools, make a debug build of your plugin, and attach to PT with a breakpoint in processBlock. Instantiate the (mono) version of the plugin. Investigate the path your code takes through processBlock and look at the data you write into the buffer. Any chance there’s a denormal in it? -INF in any sample? (A small checking helper is sketched after this list.)
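
For the second point, here’s a minimal debug-only helper you could call from processBlock to scan the buffer for suspicious values (the name checkBufferForBadValues is made up for this sketch; it assumes the usual JUCE headers are already included):

    #include <cmath>

    static void checkBufferForBadValues (const juce::AudioBuffer<float>& buffer)
    {
        for (int ch = 0; ch < buffer.getNumChannels(); ++ch)
        {
            const float* data = buffer.getReadPointer (ch);

            for (int i = 0; i < buffer.getNumSamples(); ++i)
            {
                jassert (std::isfinite (data[i]));                            // catches NaN and +/-INF
                jassert (data[i] == 0.0f || std::abs (data[i]) > 1.0e-30f);   // crude denormal check
            }
        }
    }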

To test it, I cleared the buffers coming in, right at the start of processBlock():

    for (int channel = 0; channel < buffer.getNumChannels(); ++channel)
        buffer.clear (channel, 0, buffer.getNumSamples());

But it still spiked my volume needle, so I was wondering why it would do that when the buffers contain only zeros.
Then, trying something different, at the end of processBlock(), where I calculate the volume level, I set it to zero:

    double rmsLevelL = 0.0;
    double rmsLevelR = 0.0;

And then the needle was where it should be.

This is perplexing to me. Why would my needle be maxed out if the buffers are filled with zeros, and why does the (mono/stereo) version work as it should but not the (mono) one? Same build, same code.

Is there something about routing, buffers and buses that I’m not understanding?

Can you show more of your processBlock()?
Is there any other code besides the clearing?

And are you implementing the outputMeter parameter and not setting it by any chance?

So to clarify, is it the meter of the channel strip or the one in the mixer view next to the plugin?


The volume meter is my own, on the GUI; it’s an analog-style one, RMS-calculated.

    rmsLevelLeft.skip(buffer.getNumSamples());
    rmsLevelRight.skip(buffer.getNumSamples());
    const auto rmsLevelL = calculateRMSLevel(buffer, 0) * sqrt(2.0f);
    const auto rmsLevelR = calculateRMSLevel(buffer, 1) * sqrt(2.0f);
    const auto volume_valueL = juce::Decibels::gainToDecibels(rmsLevelL);
    const auto volume_valueR = juce::Decibels::gainToDecibels(rmsLevelR);
    rmsLevelLeft.getCurrentValue() > volume_valueL ? rmsLevelLeft.setTargetValue(volume_valueL) : rmsLevelLeft.setCurrentAndTargetValue(volume_valueL);
    rmsLevelRight.getCurrentValue() > volume_valueR ? rmsLevelRight.setTargetValue(volume_valueR) : rmsLevelRight.setCurrentAndTargetValue(volume_valueR);

I’m getting a failure here; does this shed any light?

JUCE Assertion failure in juce_AudioSampleBuffer.h:255
A breakpoint instruction (__debugbreak() statement or a similar call) was executed in ProTools.exe.

    const Type* getReadPointer (int channelNumber) const noexcept
    {
        jassert (isPositiveAndBelow (channelNumber, numChannels));
        return channels[channelNumber];
    }

Line 255 is the jassert (isPositiveAndBelow (channelNumber, numChannels)) line.

edit: I think it’s trying to access a channel in the buffer that doesn’t exist… I’m investigating that.

    for (auto i = totalNumInputChannels; i < totalNumOutputChannels; ++i)
        buffer.clear (i, 0, buffer.getNumSamples());

(mono):        totalNumInputChannels = 1, totalNumOutputChannels = 1   // fails
(mono/stereo): totalNumInputChannels = 1, totalNumOutputChannels = 2   // works

The assert says channelNumber needs to be positive and less than numChannels. Look up the stack to figure out why it isn’t.


SOLUTION:
It was my volume calculations: they assumed there was more than one channel, so calculateRMSLevel(buffer, 1) was trying to access a non-existent channel.
This fix worked:

    if (buffer.getNumChannels() > 0)
        rmsLevelL = calculateRMSLevel (buffer, 0) * sqrt (2.0f);

    if (buffer.getNumChannels() > 1)
        rmsLevelR = calculateRMSLevel (buffer, 1) * sqrt (2.0f);
    else
        rmsLevelR = rmsLevelL;   // If there's only one channel, use the same value for both L and R
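
For anyone finding this later, here’s a minimal sketch of the whole channel-count-safe metering step, written as a hypothetical updateMeters() helper called at the end of processBlock(). It assumes rmsLevelLeft / rmsLevelRight are juce::LinearSmoothedValue<float> members and that calculateRMSLevel() wraps juce::AudioBuffer<float>::getRMSLevel(), as in the snippets above:

    void MyDistortionProcessor::updateMeters (const juce::AudioBuffer<float>& buffer)
    {
        rmsLevelLeft.skip  (buffer.getNumSamples());
        rmsLevelRight.skip (buffer.getNumSamples());

        float rmsL = 0.0f, rmsR = 0.0f;

        if (buffer.getNumChannels() > 0)
            rmsL = calculateRMSLevel (buffer, 0) * std::sqrt (2.0f);

        // Only read channel 1 if it actually exists; otherwise mirror the left value.
        rmsR = buffer.getNumChannels() > 1 ? calculateRMSLevel (buffer, 1) * std::sqrt (2.0f)
                                           : rmsL;

        const auto dbL = juce::Decibels::gainToDecibels (rmsL);
        const auto dbR = juce::Decibels::gainToDecibels (rmsR);

        // Meter ballistics: jump immediately when the level rises, fall smoothly when it drops.
        if (rmsLevelLeft.getCurrentValue() > dbL)
            rmsLevelLeft.setTargetValue (dbL);
        else
            rmsLevelLeft.setCurrentAndTargetValue (dbL);

        if (rmsLevelRight.getCurrentValue() > dbR)
            rmsLevelRight.setTargetValue (dbR);
        else
            rmsLevelRight.setCurrentAndTargetValue (dbR);
    }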