I’ve noticed a few limitations in the mouse handling lately, which are preventing me from implementing certain interface features.
If a drag gesture is started with one mouse button, I can't detect a change of state in any other mouse button until the first has been released. Nor do I receive wheel events during the drag.
I’ve implemented a navigation strip that lets the user scroll horizontally with the left button, or stretch/zoom horizontally (anchored at the left edge) with the right button. These features are fine in isolation, but I’d also like to introduce zooming from vertical motion when the right button is held whilst already scrolling with the left button.
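For context, the drag handler is roughly along these lines (a simplified sketch, not my actual code; the member functions `scrollTo`, `zoomBy` and the two factor helpers are made-up names for illustration):

```cpp
// Simplified sketch of the intended mouseDrag logic (JUCE Component callback).
void NavigationStrip::mouseDrag (const juce::MouseEvent& e)
{
    if (e.mods.isLeftButtonDown())
    {
        scrollTo (dragStartOffset - e.getDistanceFromDragStartX());

        // Desired behaviour: if the right button is pressed mid-drag,
        // start zooming from the vertical motion as well. In practice,
        // e.mods never reports the right button here until the left
        // button has been released.
        if (e.mods.isRightButtonDown())
            zoomBy (verticalZoomFactor (e.getDistanceFromDragStartY()));
    }
    else if (e.mods.isRightButtonDown())
    {
        zoomBy (horizontalZoomFactor (e.getDistanceFromDragStartX()));
    }
}
```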
This currently appears to be impossible, at least using the events provided in the mouse callbacks. I thought I might work around it by querying the mouse source directly (Desktop::getInstance().getMainMouseSource().getCurrentModifiers()), but it reports the same modifiers as the event does.
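This is the workaround I tried inside the drag callback (a minimal sketch of the polling approach, assuming it runs during a left-button drag):

```cpp
// Query the mouse source directly rather than relying on e.mods.
const auto mods = juce::Desktop::getInstance()
                      .getMainMouseSource()
                      .getCurrentModifiers();

// Still reports only the left button while the drag is in progress,
// identical to what the MouseEvent's own modifiers say.
const bool rightButtonHeld = mods.isRightButtonDown();
```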
Is there anything I can do?
It works fine with keyboard modifiers (e.g. if I press Ctrl during a drag, it is picked up), so I can’t really see a reason why the other mouse buttons shouldn’t behave the same way. Unless it’s an OS limitation, but I can’t remember encountering such a thing in the past.