How to do iOS-Style Multitouch handling?

Hi everyone,

Here’s what I’m trying to do: I want to move a few images around on the screen with multitouch enabled, so that on platforms that support it, the user can drag multiple objects at the same time. Also, I can’t move all of the dragging-related code into a custom image class, because some things need to be handled centrally.

In iOS the different touches would be identified by pointers to UITouch objects, which are consistent across touch/mouse down, drag, and touch/mouse up.

As I understood from the JUCE multitouch example, the equivalent of the UITouch pointers is basically a reference to a MouseInputSource object (please correct me if I got that wrong), and I’ve read on this forum that it’s OK to keep a pointer to it (again, please correct me).

However, while implementing my multitouch handling, I realized that even on a device with a single mouse input, the MouseInputSource pointer in the event passed to mouseDown is different from the one in the event passed to mouseDrag.

Is this a bug or a feature? And, more importantly, how do I identify touches correctly across the different types of events that can occur?

Fritz


No, I don’t think you should be keeping a reference to the MouseInputSource. However, the return value of MouseInputSource::getIndex() will not change between event callbacks.
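In case it helps, here’s a minimal sketch of what that could look like. Assumptions: a hypothetical `MultiTouchCanvas` parent component owns the image components, the children don’t intercept mouse clicks (e.g. via `setInterceptsMouseClicks (false, false)`) so all events are handled centrally in the parent, and the `std::map` of drag states is just one way of doing the bookkeeping.

```cpp
#include <JuceHeader.h>
#include <map>

// Parent component that drags its child components, keyed by
// MouseInputSource::getIndex(), which stays stable for a given
// finger/mouse across mouseDown, mouseDrag and mouseUp.
class MultiTouchCanvas : public juce::Component
{
public:
    void mouseDown (const juce::MouseEvent& e) override
    {
        // Remember which child (if any) this touch started on,
        // along with the child's position at the start of the drag.
        if (auto* child = getComponentAt (e.getPosition()))
            if (child != this)
                activeDrags[e.source.getIndex()] = { child, child->getPosition() };
    }

    void mouseDrag (const juce::MouseEvent& e) override
    {
        // Look up the drag belonging to this particular touch.
        auto it = activeDrags.find (e.source.getIndex());

        if (it != activeDrags.end())
            it->second.component->setTopLeftPosition (it->second.startPos
                                                        + e.getOffsetFromDragStart());
    }

    void mouseUp (const juce::MouseEvent& e) override
    {
        // The touch has ended, so forget its drag state.
        activeDrags.erase (e.source.getIndex());
    }

private:
    struct DragState
    {
        juce::Component* component = nullptr;
        juce::Point<int> startPos;
    };

    std::map<int, DragState> activeDrags; // one entry per active touch
};
```

Since the index stays the same for a given finger from touch-down to touch-up, keying the drag state on it is enough to tell concurrent drags apart; each finger only ever finds and moves the object it originally landed on.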


Just got around to trying this on a multitouch device. Works great, thanks a lot!