AU parameter automation issue - ToggleButton does not write

The setup: Audio plug-in with parameters managed by APVTS. GUI controls include Sliders and ToggleButtons, connected with SliderAttachment and ButtonAttachment classes respectively. Basically follows the practices outlined in the APVTS tutorial.

The issue: Automation is not recorded for the ToggleButtons when clicking them in the GUI. This problem happens only when the plug-in is used in the AudioUnit format, and then only in Ableton (9 & 10) and Logic. It works fine in Reaper.

Details from Ableton testing: When clicking a ToggleButton in the GUI, the automation lane will switch to display the correct parameter - just nothing gets recorded. If just a single point of automation is drawn in with the mouse (in that lane), it will then properly record further GUI gestures as automation (for that lane).

Following up on ideas in a couple of possibly related threads, I verified that:

  1. When clicking a ToggleButton, the AudioProcessorParameter methods beginChangeGesture, setValueNotifyingHost, and endChangeGesture are being called (in that order).

  2. The parameter properties that the DAW might care about are set correctly, e.g. some sample DBG output:
    getNumSteps = 2
    isDiscrete = yes
    isBoolean = yes

Slider automation works fine everywhere.

???

I think that’s purely an Ableton issue. For some reason, it doesn’t record unless there is already data in the automation lane. I don’t think the plugin can do anything about this.

It seems to record Slider automation fine into an empty automation lane, however. This issue is only happening with parameters bound to ToggleButton components.

Hmmm, actually, it may happen with the 3rd type of attachment class as well, ComboBoxAttachment. My plug-in doesn’t use any of those, but in testing the issue with another JUCE-made plug-in (not naming names!), I am seeing the same issue with their combo boxes. However, since it’s not my plug-in, I don’t actually know if it was coded using Attachment classes at all.

I just tested on the Spacecraft desktop AU, which is currently in beta.

I can confirm that in Ableton 10 neither the combo boxes nor the (custom*) toggle buttons record as automation in the AU. The automation is recorded as expected when using the VST in Ableton, however.

Haha, yes, Reaper for some reason seems to be indestructible when it comes to automation. I often find situations where everything works in Reaper and then breaks in other DAWs (Ableton, I’m looking at you!), which is why I try to avoid testing in Reaper these days.

*Note that I’m not using attachment classes for the combo boxes, and I created my own custom class for the toggle buttons (don’t ask). I’ve been relying on manually calling beginChangeGesture, setValueNotifyingHost and endChangeGesture. This seems to record properly in VST in all cases, but not in AU for some DAWs. I was thinking this could be fixed by changing my code to use attachments, but this post is making me question that idea. Shouldn’t it just work with the change gestures alone?

All other normal attachment-based sliders and custom sliders (again, don’t ask) record ok with automation in all hosts for AU and VST as far as I can see.

Thanks for looking into this, @MarkWatt

Agreed about Reaper’s robustness…great quality for a DAW, but not so useful as a testing environment for that same reason.

Are you able to test in Logic Pro? I got a report that this is happening with AUs in Logic as well. I don’t have that installed here so haven’t been able to test it myself yet.

I’ve just hit this exact problem, and I’m testing in Logic Pro. I’ve found that the toggle state of a button is stored and retrieved successfully when I save a preset, but no automation is written in Latch mode when I toggle a button (or, for that matter, change the parameter directly.)

If I manually draw in the automation in Logic, the buttons DO respond to it.

My toggle buttons record their automation fine in Logic. Not sure why you’re not seeing that happen. Are your parameters set as discrete and boolean?

Hello @lordchilli and welcome to the forum. Thanks for weighing in about this issue. In your plug-in, are you using APVTS for parameter management, along with the Attachment Classes to link GUI components to your parameters?

Yes, same here in Ableton, and that is also the report I got from a beta tester about Logic.

Hi Leigh - yes, I’m using APVTS. I’ve tried creating a completely pared-down bit of code with just a button and a slider, and I get exactly the same problem. In fact, using beginGesture / setValueNotifyingHost / endGesture doesn’t record automation either; the only thing I’m having any luck with is SliderAttachment.

Here’s my test code:

#pragma once

#include "../JuceLibraryCode/JuceHeader.h"

#include “PluginProcessor.h”

//==============================================================================

/**

*/
typedef AudioProcessorValueTreeState::SliderAttachment SliderAttachment;
typedef AudioProcessorValueTreeState::ButtonAttachment ButtonAttachment;

class AutomationTestAudioProcessorEditor : public AudioProcessorEditor, public Button::Listener
{
public:
AutomationTestAudioProcessorEditor (AutomationTestAudioProcessor&);
~AutomationTestAudioProcessorEditor();

//==============================================================================
void paint (Graphics&) override;
void resized() override;

void buttonClicked(Button* aButton) override;

TextButton myButton;
std::unique_ptr<ButtonAttachment> myButtonAttachment;

Slider mySlider;
std::unique_ptr<SliderAttachment> mySliderAttachment;

private:
// This reference is provided as a quick way for your editor to
// access the processor object that created it.
AutomationTestAudioProcessor& processor;

JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (AutomationTestAudioProcessorEditor)

};

#include “PluginProcessor.h”
#include “PluginEditor.h”

//==============================================================================
AutomationTestAudioProcessorEditor::AutomationTestAudioProcessorEditor (AutomationTestAudioProcessor& p)
: AudioProcessorEditor (&p), processor (p)
{
// Make sure that before the constructor has finished, you’ve set the
// editor’s size to whatever you need it to be.
setSize (400, 300);

getLookAndFeel().setColour(TextButton::buttonOnColourId,Colours::darkgreen);
getLookAndFeel().setColour(TextButton::buttonColourId,Colours::transparentBlack);


addAndMakeVisible(myButton);
myButton.setButtonText("Do thing");
myButton.setClickingTogglesState(true);
myButton.addListener(this);
myButtonAttachment.reset(new ButtonAttachment (processor.parameters, "MyButtonStatus", myButton));

addAndMakeVisible(mySlider);
mySliderAttachment.reset(new SliderAttachment (processor.parameters, "MyValue", mySlider));

}

AutomationTestAudioProcessorEditor::~AutomationTestAudioProcessorEditor()
{
}

//==============================================================================
void AutomationTestAudioProcessorEditor::paint (Graphics& g)
{
// (Our component is opaque, so we must completely fill the background with a solid colour)
g.fillAll (getLookAndFeel().findColour (ResizableWindow::backgroundColourId));

g.setColour (Colours::white);
g.setFont (15.0f);
g.drawFittedText ("Hello World!", getLocalBounds(), Justification::centred, 1);

}

void AutomationTestAudioProcessorEditor::resized()
{
// This is generally where you’ll want to lay out the positions of any
// subcomponents in your editor…

auto r = getLocalBounds();

r.removeFromTop(10);

myButton.setBounds(r.removeFromTop(40));

mySlider.setBounds(r.removeFromTop(40));

}

void AutomationTestAudioProcessorEditor::buttonClicked(Button* aButton)
{
processor.randomiseValue();
}

#pragma once

#include "../JuceLibraryCode/JuceHeader.h"

//==============================================================================
/**
*/
class AutomationTestAudioProcessor : public AudioProcessor
{
public:
//==============================================================================
AutomationTestAudioProcessor();
~AutomationTestAudioProcessor();

//==============================================================================
void prepareToPlay (double sampleRate, int samplesPerBlock) override;
void releaseResources() override;

#ifndef JucePlugin_PreferredChannelConfigurations
bool isBusesLayoutSupported (const BusesLayout& layouts) const override;
#endif

void processBlock (AudioBuffer<float>&, MidiBuffer&) override;

//==============================================================================
AudioProcessorEditor* createEditor() override;
bool hasEditor() const override;

//==============================================================================
const String getName() const override;

bool acceptsMidi() const override;
bool producesMidi() const override;
bool isMidiEffect() const override;
double getTailLengthSeconds() const override;

//==============================================================================
int getNumPrograms() override;
int getCurrentProgram() override;
void setCurrentProgram (int index) override;
const String getProgramName (int index) override;
void changeProgramName (int index, const String& newName) override;

//==============================================================================
void getStateInformation (MemoryBlock& destData) override;
void setStateInformation (const void* data, int sizeInBytes) override;

AudioProcessorValueTreeState parameters;

void randomiseValue();

Random random;

private:
//==============================================================================
JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (AutomationTestAudioProcessor)
};

//==============================================================================
AutomationTestAudioProcessor::AutomationTestAudioProcessor()
#ifndef JucePlugin_PreferredChannelConfigurations
: AudioProcessor (BusesProperties()
#if ! JucePlugin_IsMidiEffect
#if ! JucePlugin_IsSynth
.withInput ("Input", AudioChannelSet::stereo(), true)
#endif
.withOutput ("Output", AudioChannelSet::stereo(), true)
#endif
)
#endif
, parameters (*this, nullptr, Identifier ("AutomationTest"),
{
std::make_unique<AudioParameterInt> ("MyValue",
"MyValue",
0,
100,
0),
std::make_unique<AudioParameterBool> ("MyButtonStatus",
"MyButtonStatus",
false)
})

{
}

void AutomationTestAudioProcessor::randomiseValue()
{
auto parameter = parameters.getParameter ("MyValue");
parameter->beginChangeGesture();
parameter->setValueNotifyingHost(random.nextFloat());
parameter->endChangeGesture();
}

AutomationTestAudioProcessor::~AutomationTestAudioProcessor()
{
}

//==============================================================================
const String AutomationTestAudioProcessor::getName() const
{
return JucePlugin_Name;
}

bool AutomationTestAudioProcessor::acceptsMidi() const
{
#if JucePlugin_WantsMidiInput
return true;
#else
return false;
#endif
}

bool AutomationTestAudioProcessor::producesMidi() const
{
#if JucePlugin_ProducesMidiOutput
return true;
#else
return false;
#endif
}

bool AutomationTestAudioProcessor::isMidiEffect() const
{
#if JucePlugin_IsMidiEffect
return true;
#else
return false;
#endif
}

double AutomationTestAudioProcessor::getTailLengthSeconds() const
{
return 0.0;
}

int AutomationTestAudioProcessor::getNumPrograms()
{
return 1; // NB: some hosts don’t cope very well if you tell them there are 0 programs,
// so this should be at least 1, even if you’re not really implementing programs.
}

int AutomationTestAudioProcessor::getCurrentProgram()
{
return 0;
}

void AutomationTestAudioProcessor::setCurrentProgram (int index)
{
}

const String AutomationTestAudioProcessor::getProgramName (int index)
{
return {};
}

void AutomationTestAudioProcessor::changeProgramName (int index, const String& newName)
{
}

//==============================================================================
void AutomationTestAudioProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
{
// Use this method as the place to do any pre-playback
// initialisation that you need…
}

void AutomationTestAudioProcessor::releaseResources()
{
// When playback stops, you can use this as an opportunity to free up any
// spare memory, etc.
}

#ifndef JucePlugin_PreferredChannelConfigurations
bool AutomationTestAudioProcessor::isBusesLayoutSupported (const BusesLayout& layouts) const
{
#if JucePlugin_IsMidiEffect
ignoreUnused (layouts);
return true;
#else
// This is the place where you check if the layout is supported.
// In this template code we only support mono or stereo.
if (layouts.getMainOutputChannelSet() != AudioChannelSet::mono()
&& layouts.getMainOutputChannelSet() != AudioChannelSet::stereo())
return false;

// This checks if the input layout matches the output layout

#if ! JucePlugin_IsSynth
if (layouts.getMainOutputChannelSet() != layouts.getMainInputChannelSet())
return false;
#endif

return true;

#endif
}
#endif

void AutomationTestAudioProcessor::processBlock (AudioBuffer<float>& buffer, MidiBuffer& midiMessages)
{
ScopedNoDenormals noDenormals;
auto totalNumInputChannels = getTotalNumInputChannels();
auto totalNumOutputChannels = getTotalNumOutputChannels();

// In case we have more outputs than inputs, this code clears any output
// channels that didn't contain input data, (because these aren't
// guaranteed to be empty - they may contain garbage).
// This is here to avoid people getting screaming feedback
// when they first compile a plugin, but obviously you don't need to keep
// this code if your algorithm always overwrites all the output channels.
for (auto i = totalNumInputChannels; i < totalNumOutputChannels; ++i)
    buffer.clear (i, 0, buffer.getNumSamples());

// This is the place where you'd normally do the guts of your plugin's
// audio processing...
// Make sure to reset the state if your inner loop is processing
// the samples and the outer loop is handling the channels.
// Alternatively, you can process the samples with the channels
// interleaved by keeping the same state.
for (int channel = 0; channel < totalNumInputChannels; ++channel)
{
    auto* channelData = buffer.getWritePointer (channel);

    // ..do something to the data...
}

}

//==============================================================================
bool AutomationTestAudioProcessor::hasEditor() const
{
return true; // (change this to false if you choose to not supply an editor)
}

AudioProcessorEditor* AutomationTestAudioProcessor::createEditor()
{
return new AutomationTestAudioProcessorEditor (*this);
}

//==============================================================================
void AutomationTestAudioProcessor::getStateInformation (MemoryBlock& destData)
{
// You should use this method to store your parameters in the memory block.
// You could do that either as raw data, or use the XML or ValueTree classes
// as intermediaries to make it easy to save and load complex data.
}

void AutomationTestAudioProcessor::setStateInformation (const void* data, int sizeInBytes)
{
// You should use this method to restore your parameters from this memory block,
// whose contents will have been created by the getStateInformation() call.
}

//==============================================================================
// This creates new instances of the plugin…
AudioProcessor* JUCE_CALLTYPE createPluginFilter()
{
return new AutomationTestAudioProcessor();
}

Yes, as noted above.

Because I am using APVTS for parameter management, those ToggleButton-linked parameters are declared in the APVTS ParameterLayout as AudioParameterBool. So isDiscrete and isBoolean should be set up automagically, but I checked anyway to be sure.

As a follow-up on my previous post: I’m using Logic 10.4.5. I’m not experiencing the problem in Ableton Live (version 10.1).

Weird, right? I added DBG output to watch those AudioProcessorParameter methods being called, and I see them called in order for both Sliders and for ToggleButtons… but only Slider automation input is recorded by the DAW.

One guess I had was that there’s an issue with the timing of the beginGesture / setValueNotifyingHost / endGesture calls. With a Slider, those 3 calls are spaced out at least a little bit (beginGesture on mouseDown, setValueNotifyingHost while dragging, then endGesture on mouseUp).

But with a ToggleButton, all 3 are called in rapid succession on mouseUp. (I get the UI reasons for this.) So I’m wondering if there’s some asynchronous call being made inside the DAW to prepare a track for recording automation, and then it’s not ready in time to handle the subsequent setValueNotifyingHost and endGesture calls.

That’s obviously speculation on my part, but I’m just taking a stab at why we’re seeing different behavior between the beginGesture -> setValueNotifyingHost -> endGesture sequence when triggered by a Slider vs a ToggleButton.

I was testing on Ableton 9, but my beta tester was testing Touch mode automation in Live 10.1.2b2. He reported the problem with the AU format, but that there was no problem with VST3.

Are you checking AU vs VST3 in your Ableton 10.1 environment?

That’s a sound theory… I’ll see if I can find a way to stagger the calls with a timer, although it won’t help with the existing ButtonAttachment implementation.

I’ve only tested on AU so far.

Update: I tried staggering the beginGesture -> setValueNotifyingHost -> endGesture sequence with a timer, but it made no difference.

I also tried calling setValue on a test slider (with an attachment), which didn’t get recorded either - though since setValue doesn’t notify the host, maybe that’s expected.

Bummer. Thanks for trying that though.

Update: while I didn’t see a problem in Logic (yet), in Studio One, three of my ToggleButtons cause all automation to cease as soon as they change state. Two others do not do this, and I haven’t figured out any reason for the difference yet. I’m thinking of dropping those ButtonAttachments and using a Listener to get updates and sending updates on my buttonClicked() callback instead.

I tried it on a computer running Logic 10.4.4 this morning and had no issues. I think it’s a 10.4.5 thing…

I am running into this exact same thing.

I can also confirm that this is working as intended in Logic 10.4.4 but is broken in Logic 10.4.5…

That’s a pain! I will keep looking for a solution in my project. Please let me know if you find something that works for you and I’ll do the same