Juce, iOS and unstable touch screen response

I’m making a drum App that has touch pads on the screen. These work perfectly in Windows but are laggy and unstable on the iPad, making the App unsuitable for live playing, unlike similar apps not made with Juce.

When running the App in the debugger, Xcode often displays this message:

<_UISystemGestureGateGestureRecognizer: 0x10111c500>: Gesture: Failed to receive system gesture state notification before next touch

I’ve tried different approaches, and even replaced my pad components with the stock Juce keyboard component, but the result doesn’t change. I also went into the iPad’s settings and disabled all gestures, to no avail.

My code is made up of two classes: one draws a single pad, and the other lays out multiple pads on the screen, something like this:


struct MyDrumPad : public juce::Component
{
	MyDrumPad(const String& name) : Component(name)
	{
		// give a shape to this path
		path.addPolygon ( ... );
	}
	
	void paint(Graphics& g) override
	{
		// Draw the pad
		g.setColour( /* set the color */ );
		g.fillPath(path);
		// ...
	}
	
	void resized() override
	{
		// Give the correct size to the path
		path.applyTransform( ... );
	}

	juce::Path path;
	int note = -1;
};

struct MyPadBoard : public juce::Component, public juce::MidiKeyboardState
{
	MyPadBoard()
	{
		// Create the pad array...
		StringArray padNames = { "Kick", "Snare", "etc...", ... };
		for (int i = 0; i < 24; ++i)
		{
			auto* pad = new MyDrumPad(padNames[i]);
			addAndMakeVisible(pad);
			pad->note = 36 + i;
			pad->addMouseListener(this, false); // forward each pad's mouse events to this class as well
			pads.add(pad);
		}
	}
	
	void resized() override
	{
		// Arrange the pads in a certain order...
		
		for (auto* pad : pads)
		{
			pad->setBounds( ... );
			// ...
		}
	}
	
	void mouseDown(const MouseEvent& event) override
	{
		// This is where I trigger the note-on event whenever a pad is touched/clicked
		
		if (auto* pad = dynamic_cast<MyDrumPad*>(event.eventComponent))
		    noteOn(1, pad->note, 1.f); // noteOn is a method of MidiKeyboardState
	}
	
	
	OwnedArray<MyDrumPad> pads;
};

Now that I have my custom keyboard (pad board), I can instantiate it from my caller class, typically the PluginEditor, and register the editor as its MidiKeyboardState::Listener:

PadBoard.reset(new MyPadBoard());
addAndMakeVisible(PadBoard.get());
PadBoard->addListener(this);

Then the listener callbacks process the Midi note events and forward them to the processor class:

void handleNoteOn(MidiKeyboardState* source, int midiChannel, int midiNoteNumber, float velocity) override
{
    MidiMessage m(0x90 | (midiChannel - 1), midiNoteNumber, (int)(velocity * 127.f), 0);
    audioProcessor.addKeyboardMessage(m);
}
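
The same message could also be built with the MidiMessage::noteOn() factory, which takes care of the status byte and the 0-127 velocity scaling itself. This is just an equivalent formulation, not a fix:

void handleNoteOn(MidiKeyboardState* source, int midiChannel, int midiNoteNumber, float velocity) override
{
    // Builds a note-on for the given 1-based channel; the 0.0-1.0 velocity is scaled to 0-127 internally.
    auto m = MidiMessage::noteOn(midiChannel, midiNoteNumber, velocity);
    audioProcessor.addKeyboardMessage(m);
}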

I also tried a different approach using lambdas, but I just got the same results: snappy and precise on Windows, unstable on iPad.

This is with Juce 6.1.6.
Tried with two different iPads running iPadOS 15.7.
Not tried yet on macOS.

Anything I should know that I’m missing? Thanks in advance for your time.

Hi there, I am also working on an app where the touch screen response is important, but I don’t have an iPad to test on. Does this just happen on the iPad? Have you tried on an iPhone? Also, is there a difference in release mode or on the simulator?

I understand that the UISystemGestureGateGestureRecognizers are used to avoid unintended touches at the top and bottom of the screen. Perhaps the problem is in some way related to this. Have you seen this thread?

What’s actually happening inside addKeyboardMessage()? If you’re just storing messages and sending them all at once at the beginning of the next audio callback, the timing might suffer considerably: e.g. with a block size of 2048 samples at a sampling rate of 44.1 kHz, events will effectively be quantised to roughly 46 ms intervals.
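
If that is the case, one way to keep the timing tighter is juce::MidiMessageCollector: it timestamps each message as it arrives on the message thread and then places the queued events at roughly the right sample offsets within the next block. A rough sketch, not your code (the name uiCollector is mine):

juce::MidiMessageCollector uiCollector;

// In prepareToPlay():
//     uiCollector.reset (sampleRate);

// On the message thread, when a pad is hit:
//     auto m = juce::MidiMessage::noteOn (1, noteNumber, velocity);
//     m.setTimeStamp (juce::Time::getMillisecondCounterHiRes() * 0.001); // arrival time in seconds
//     uiCollector.addMessageToQueue (m);

// In processBlock():
//     uiCollector.removeNextBlockOfMessages (midiMessages, buffer.getNumSamples());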

Do you see timing issues in the JUCE demos, e.g. in the AudioPluginDemo? If not, you could take a look at how the MIDI messages are communicated to the processor there, and check for differences with your own implementation.

I don’t have an iPhone and I don’t build for the simulator; I always test my Apps on a real iPad.
No difference between debug and release mode.
The pads in my App are distant from the screen edges, and BTW I have disabled edge gestures.
Yes, I have seen that thread but haven’t tried to apply that patch. I don’t want to modify the Juce code base.


This is the function:

// On-screen keyboard
MidiBuffer KeyboardMidiIN;
void addKeyboardMessage(MidiMessage& msg)
{
    if (msg.isNoteOnOrOff()) msg.setNoteNumber(msg.getNoteNumber() - OctaveShift);
    KeyboardMidiIN.addEvent(msg, 0);
}

Events are just added to a MidiBuffer, which is then added to the midiMessages coming into the processBlock function:

void MyPluginAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages)
{
	// Receive events from the touch keyboard
	midiMessages.addEvents(KeyboardMidiIN, 0, buffer.getNumSamples(), 0);
	KeyboardMidiIN.clear();

	// ...
	
}

I can’t test the AudioPluginDemo because it crashes on the iPad as soon as I launch it, and the stack trace shows only assembly code that I can’t understand… it’s something about the message thread. It works in a simulator, but that doesn’t help because I want to be sure that the touch response is correct on a real device.

Ok, I have modified my code to work exactly like the AudioPluginDemo:

  1. my PadBoard class no longer inherits from MidiKeyboardState; instead it holds a reference to a MidiKeyboardState object that is a member of the plugin processor;

  2. rather than copying the messages through a second buffer, I now call KeyboardState.processNextMidiBuffer(midiMessages, 0, buffer.getNumSamples(), true); directly inside the processBlock function, as sketched below.
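
The relevant part of the processor now looks roughly like this (member names simplified):

// juce::MidiKeyboardState KeyboardState;   // owned by the processor; the editor's PadBoard gets a reference to it

void MyPluginAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages)
{
    // Inject any note-ons/offs the pads triggered from the message thread
    // since the last callback into this block's MIDI.
    KeyboardState.processNextMidiBuffer (midiMessages, 0, buffer.getNumSamples(), true);

    // ...
}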

Same result: touches are still laggy, and you can’t play a beat on the screen. Of course the rest of the audio engine doesn’t suffer from this lag; it’s perfect if you play it from a Midi keyboard or from a pre-recorded track in a DAW.

And the debugger in Xcode is still showing these messages:

<_UISystemGestureGateGestureRecognizer: 0x10131be80>: Gesture: Failed to receive system gesture state notification before next touch

A few things to note here:

  • iOS devices sample touch events at 60 Hz, so you will end up with a jitter of 0-16.6 ms. There is nothing that can be done about that baseline.
  • Touch events are received on the message thread, so anything that clogs up the message thread (e.g. painting, I/O, etc.) will add further latency.

So try removing any animations etc. and see if the problem persists.

Interesting point. I do indeed have a few animations: the pads light up when they’re hit and slowly fade out, and there’s another element, similar to a vu-meter, that constantly repaints itself. The whole UI is not repainted; I don’t use a timer to repaint all widgets, I only repaint them when a parameter changes value.
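
The pad fade is roughly like this (a simplified sketch of my setup; the colour and fade rate are placeholders, not the real values):

struct FadingPad : public juce::Component, private juce::Timer
{
    FadingPad() { startTimerHz (30); }

    // Called from the mouseDown / noteOn handling.
    void trigger() { brightness = 1.0f; }

    void paint (juce::Graphics& g) override
    {
        g.setColour (juce::Colours::orange.withAlpha (0.3f + 0.7f * brightness));
        g.fillRoundedRectangle (getLocalBounds().toFloat(), 6.0f);
    }

private:
    void timerCallback() override
    {
        if (brightness > 0.0f)
        {
            brightness = juce::jmax (0.0f, brightness - 0.08f);
            repaint();   // only this pad is repainted, never the whole UI
        }
    }

    float brightness = 0.0f;
};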

Disabling all animations was the first thing I tried… pretty easy indeed: I just commented out the startTimerHz(30); call in the class constructor. Unfortunately, that didn’t change anything.

Is the lighting up of the pads when they are played also laggy?

I’d say no, but it’s hard to tell, as visual perception of timing isn’t as sharp as auditory perception.

I think I have resolved the issue, and the cause could be related to what luzifer is saying here.

The UI in my plugin was using 67 DropShadower objects to add shadows to its components (mostly knobs and sliders). These elements are not redrawn unless a parameter changes value from Midi or from host automation; I use the ActionBroadcaster / ActionListener mechanism to trigger UI element redraws only when really needed. For this reason, I was assuming that, once the UI had been drawn, the shadows would not need to be redrawn for no reason. Apparently I was wrong… there must be something buried in the Juce code that performs heavy operations even when mouse clicks happen outside the components that own a DropShadower, so each time I touch the screen to play a drum pad, all the shadows do something that slows down the message thread.
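
For context, each shadow was set up more or less like this (a simplified sketch; the colour, radius and offset here are placeholders, not my real values):

// One DropShadower per control, kept as a member alongside the component it shadows.
juce::DropShadower knobShadower { juce::DropShadow (juce::Colours::black.withAlpha (0.5f), 8, { 0, 2 }) };

// In the editor constructor:
//     knobShadower.setOwner (&myKnob);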

So, why was this not an issue under Windows? Because Windows, at least on my computers, is freaking lightning fast! It must not be the same on the iPad, at least graphics-wise.

Removing all shadows solved the issue. Now my visual drum pads respond well, snappier than before, even at 44100 Hz with a 256-sample buffer, and even better with 128- or 64-sample buffers.
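
If I want some shadows back, a cheaper route I might try (just a sketch, not what’s in the plugin right now) is to paint a static juce::DropShadow inside each component’s paint(), so there’s no extra shadower component reacting to other repaints:

void paint (juce::Graphics& g) override
{
    // The shadow is rendered only when this component itself repaints; there is
    // no separate shadower reacting to sibling repaints or touches.
    juce::DropShadow shadow (juce::Colours::black.withAlpha (0.4f), 6, { 0, 2 });
    shadow.drawForPath (g, path);

    g.setColour (juce::Colours::darkgrey);   // placeholder pad colour
    g.fillPath (path);
}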

Lesson learned: Juce + iPad = slow graphics, don’t push it over the limit!


Glad you got it solved!

There’s a lot of discussion about replacing the inefficient DropShadower with StackBlur on this thread. There are some good examples there which should render the shadow issue moot (pun intended!).