Making popup menus touch enabled

I have a Dell convertible laptop that has a built-in touchpad plus a 10-point touch-enabled screen that can be folded around to turn it into a tablet.

We are trying to get our software fully touch enabled, but the PopupMenu class is not touch friendly. There are two main issues. First, it currently uses the mouse's location (not where you touched) as the default menu location. I can work around that by specifying the popup menu location explicitly instead of using the default, of course. The bigger issue is that popup menus cannot be selected or closed by touch, only with the mouse. The timerCallback() method explicitly uses the current mouse location, and it gets called from any mouseMove(), mouseDown(), mouseDrag() or mouseUp().
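For the first issue, the workaround mentioned above can be sketched roughly like this, using JUCE's PopupMenu::Options to anchor the menu at an explicit screen position rather than the default (mouse-derived) one. The menu items and the handler are placeholders; the point is only `withTargetScreenArea()`:

```cpp
// Sketch: show a popup menu at the screen position of the event that
// triggered it (e.g. a touch), instead of the default mouse location.
// Assumes JUCE; item IDs and labels here are just illustrative.
void showMenuAtEventPosition (const juce::MouseEvent& e)
{
    juce::PopupMenu menu;
    menu.addItem (1, "First option");
    menu.addItem (2, "Second option");

    auto p = e.getScreenPosition();   // screen coords of the touch/click

    menu.showMenuAsync (juce::PopupMenu::Options()
                            .withTargetScreenArea ({ p.x, p.y, 1, 1 }),
                        [] (int result)
                        {
                            // result == 0 means the menu was dismissed
                        });
}
```

This only fixes where the menu appears; it does nothing for the selection/dismissal problem described next.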

I’ve been working on this for a while now and have made some progress, but I haven’t completely solved it yet. Touch inputs are handled and translated into the corresponding mouse methods, but inside timerCallback(), when I loop through the array of mouse sources, their location data often never changes regardless of where I touch on the screen.

Has anyone solved this yet? If not, what is the best way to make the PopupMenu class fully touch enabled, so that I can use my laptop in tablet form with no mouse at all, select menu options by touch, and touch outside a menu to close it?


I don’t have a touchscreen Windows box, but menus seem to more or less work on my Android touch-screen. Obviously they’re not really designed for touch, but I don’t see any of the problems you’re describing. Perhaps the real problem is in the Windows touch input handling rather than in the menus?

The issue is that the laptop has both a real mouse and touch input. The timerCallback() method calls Desktop::getMousePosition(), which does correctly return the position of the mouse, but not where I touched. So I need to determine the screen coordinates of the last touch and use those instead of the mouse cursor’s position.
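One way the "last touch instead of mouse cursor" lookup could be sketched is to scan the Desktop's input sources and prefer a touch source when one exists. This is only a sketch against JUCE's MouseInputSource API; whether the touch sources actually report up-to-date positions is exactly the open question in this thread:

```cpp
// Sketch: prefer the screen position of a touch source over the
// mouse pointer. Assumes JUCE; falls back to the main mouse source
// when no touch source is present.
juce::Point<float> getLastInputPosition()
{
    auto& desktop = juce::Desktop::getInstance();

    for (auto& source : desktop.getMouseSources())
        if (source.isTouch())
            return source.getScreenPosition();

    // no touch sources: fall back to the real mouse pointer
    return desktop.getMainMouseSource().getScreenPosition();
}
```

A caveat: on a mixed mouse/touch system this always prefers touch, which may be the wrong bias; picking whichever source produced the most recent event would be smarter but needs per-source timestamps.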

Hmm. Perhaps getMousePosition() should be smarter, and could return the last position of any of the touches, rather than just the mouse pointer itself…

I thought about doing that, or updating the mouse’s position when there is a screen touch, but in a multi-touch/mouse mixed system it may be good to keep them separate.

Here’s a little more information about this issue.

If you open a menu using the mouse first, then touch input mostly works, but with one issue: you have to touch the menu option you want twice. The first touch doesn’t select the item, but the second does. If you open a menu with a touch input, nothing works at all. Tracing this, I can see that no WM_TOUCH messages are ever sent to the menu window; they are only sent when the menu was initially opened via a mouse click.

This problem actually exists elsewhere, too. I noticed that if I use the mouse to do something that opens any dialog, I have to touch the OK/Cancel buttons (or any other GUI component) twice to get them to work. Just as in the menu case, the first touch does nothing. And likewise, if I use touch to open a dialog, I can’t use touch to do anything in the dialog, since it also never receives any WM_TOUCH messages.
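The "no WM_TOUCH messages" symptom is consistent with the newly created menu/dialog windows never being registered for touch input: on Win32, WM_TOUCH is only delivered to windows that have been passed to RegisterTouchWindow(), and unregistered windows get synthesised mouse events instead. A sketch of the registration call (where exactly the framework would need to make it, inside its window-creation code, is my assumption, not something confirmed in this thread):

```cpp
// Sketch: ask Windows to deliver raw WM_TOUCH messages to a window.
// Without this, touches on the window arrive only as emulated mouse
// input, which would match the behaviour described above.
#include <windows.h>

void enableTouchFor (HWND hwnd)
{
    if (! RegisterTouchWindow (hwnd, 0))
    {
        // Registration failed; this window will keep receiving
        // touch input as emulated mouse messages.
    }
}
```

If that's the cause, each popup-menu and dialog window would need this call after creation, not just the main application window.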

Do you have any thoughts on how to solve this?

It’d probably be quite easy for me to fix if I had a Windows touch-screen to test on… I’ll probably put one on my Xmas present list, but in the meantime I’m not sure what to suggest, other than looking at getMousePosition() as I suggested, and maybe making it take all the input sources into account.

Just want to say that this is still an issue on Windows 10; a CallOutBox also can’t be dismissed by touch.
