Just threw together some classes for basic gesture recognition.
It’s pretty basic, using a simple moving-box boundary-crossing technique: a small box tracks the cursor, and each time the cursor crosses an edge of the box a direction is recorded and the box re-centres. A GestureDetector can listen to mouse messages (either globally or on specific components), and will send a GestureEvent to registered GestureListeners when a gesture is completed.
A GestureEvent consists of an array of Directions and a ModifierKeys object. These can be tested against a particular sequence, and an action performed on a match.
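The moving-box idea and the sequence test can be sketched roughly like this. The names here are my own (the actual classes are GestureDetector / GestureEvent etc.), and it assumes screen coordinates with y growing downwards:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

enum class Direction { Up, Down, Left, Right };

// Moving-box quantiser (hypothetical name): a square box follows the cursor,
// and each time the cursor leaves the box we record the crossing direction
// and re-centre the box on the cursor. Jitter inside the box is ignored.
class DirectionQuantiser
{
public:
    explicit DirectionQuantiser (float boxSize) : halfSize (boxSize * 0.5f) {}

    void start (float x, float y) { cx = x; cy = y; directions.clear(); }

    void move (float x, float y)
    {
        float dx = x - cx, dy = y - cy;

        if (std::abs (dx) <= halfSize && std::abs (dy) <= halfSize)
            return; // still inside the box

        Direction d = std::abs (dx) > std::abs (dy)
                        ? (dx > 0 ? Direction::Right : Direction::Left)
                        : (dy > 0 ? Direction::Down  : Direction::Up);

        // Only record a crossing when the direction changes, so a long
        // straight drag produces a single entry rather than many.
        if (directions.empty() || directions.back() != d)
            directions.push_back (d);

        cx = x; cy = y; // re-centre the box on the cursor
    }

    const std::vector<Direction>& getDirections() const { return directions; }

private:
    float halfSize, cx = 0, cy = 0;
    std::vector<Direction> directions;
};
```

Once the drag is finished, testing the event is just comparing the recorded sequence against a target, e.g. `q.getDirections() == std::vector<Direction> { Direction::Right, Direction::Down }`.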
Stuff to do:
currently it uses any mouse button to trigger a gesture - it should be either the right mouse button or an assignable one.
resizing triggers gestures too - changing it to the right mouse button would probably fix this anyway.
create a link with ApplicationCommandManager using a GestureListener, allowing gestures to automatically trigger known commands.
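The last TODO could look something like the sketch below: a table mapping completed direction sequences to command IDs, with a callback standing in for the ApplicationCommandManager (in real JUCE code the callback body would presumably call `commandManager.invokeDirectly (id, true)`). All names here are hypothetical, not part of the posted classes:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <vector>

enum class Direction { Up, Down, Left, Right };
using CommandID = int;

// Hypothetical bridge between a GestureListener and a command manager:
// registered gestures map to command IDs, and a completed gesture invokes
// the matching command via the supplied callback.
class GestureCommandMap
{
public:
    void add (std::vector<Direction> gesture, CommandID id)
    {
        commands[std::move (gesture)] = id;
    }

    // Returns true and invokes the callback if the sequence is mapped.
    bool gesturePerformed (const std::vector<Direction>& dirs,
                           const std::function<void (CommandID)>& invoke) const
    {
        auto it = commands.find (dirs);
        if (it == commands.end())
            return false;

        invoke (it->second); // e.g. commandManager.invokeDirectly (id, true)
        return true;
    }

private:
    std::map<std::vector<Direction>, CommandID> commands;
};
```

This keeps gesture recognition and command dispatch decoupled: the detector only produces direction sequences, and the mapping decides what they mean.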
Feel free to do what you want with this. I just wanted to get it to the point where it could fairly reliably turn a mouse movement into a series of directions you can test against.