E.mods.isRightButtonDown() never set after long-press on touch screen


On Windows, a long press on a touch screen becomes a right click, so when I do this on the desktop on my Wacom Cintiq touch, I get the pop-up menu I'd normally get when right-clicking with the mouse.

But in a JUCE component, neither the mouseDown nor mouseUp callbacks is ever invoked with e.mods.isRightButtonDown() returning true.

I assume this is a bug, no?

If not, how should I get a right click this way with a touch screen on Windows? Note I am using the JUCE tip from last week, running Windows 10.

Note that when right-clicking with the mouse, or, interestingly, when long-pressing with the Wacom pen, it works as expected.


No, it's not a JUCE "bug", in the sense that nothing is going wrong. You may want to support that behaviour in your app, or you may not. For example, you could be writing a game where you have to hold your finger on something to control it, and it'd be a disaster if we added some JUCE code that suddenly decided that after a second the tap turns into some other kind of event.

So really it's up to you to write the logic that does this. If Wacom's pen driver does it for you, that's because pens are more specific devices than plain touches, so it's OK to enforce such a behaviour on their users.
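For a single component, that do-it-yourself logic can be sketched as a small press-tracking helper: record when and where the press started, and on release decide whether it was held long and still enough to count as a right-click. The class below is a framework-agnostic sketch (the class name `LongPressDetector` and the threshold values are my own illustrations, not JUCE API); in a real JUCE component you would feed it from mouseDown/mouseUp and typically use a juce::Timer rather than waiting for the release.

```cpp
#include <cmath>
#include <cstdint>

// Hypothetical helper: decides whether a press/release pair should be
// treated as an emulated right-click. Thresholds are illustrative only.
class LongPressDetector
{
public:
    LongPressDetector (int64_t holdMillis = 800, double maxDriftPx = 8.0)
        : holdThresholdMs (holdMillis), maxDrift (maxDriftPx) {}

    // Call from mouseDown: remember when and where the press started.
    void pressStarted (int64_t timeMs, double x, double y)
    {
        startTime = timeMs; startX = x; startY = y; active = true;
    }

    // Call from mouseUp: returns true if this release completes a long
    // press (held past the threshold without drifting too far).
    bool pressEnded (int64_t timeMs, double x, double y)
    {
        if (! active)
            return false;

        active = false;
        const double dx = x - startX, dy = y - startY;
        const bool heldLongEnough = (timeMs - startTime) >= holdThresholdMs;
        const bool stayedPut = std::sqrt (dx * dx + dy * dy) <= maxDrift;
        return heldLongEnough && stayedPut;
    }

private:
    int64_t holdThresholdMs, startTime = 0;
    double maxDrift, startX = 0, startY = 0;
    bool active = false;
};
```

A component using this would show its right-click popup menu from mouseUp whenever `pressEnded()` returns true, instead of relying on `e.mods.isRightButtonDown()`.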

Thanks for the reply. I understand your point: this should absolutely not be built into JUCE if it weren't a Windows-level feature! But it seems to me that it is:

This happens on both my touch-screen computers; I've now also tested on my Lenovo Yoga 14, running Windows 8 (the Wacom machine runs 10). The Lenovo is your average cheapo Windows touch screen, and not at all a specific device like the Wacom screen.

When I press and hold with a finger in my JUCE program, the transparent rectangle indicating a long press appears, meaning it is Windows/driver-level functionality. And after seeing that rectangle appear, a user would assume that the corresponding right-click would follow.

In all other software (e.g. Windows Explorer, Firefox, Notepad++, WinRAR, etc.), this works: after the rectangle around the finger appears, indicating a long press, a right-click is triggered.

If it were as you say, and it were up to the app developer to support this or not, then the rectangle indicating the long press shouldn't be appearing either, no? At the moment it appears everywhere, JUCE included, but then in JUCE the expected behaviour, the right-click, fails to happen.

If all the other software I've tested behaves this way, then I would say that JUCE not doing so is inconsistent, and, may I insist, somewhere a right-click event from Windows is getting lost...


It's not a low-level Windows thing, because if it were, then Windows itself would send the appropriate mouse events to trigger a right click. Most likely it's something that's built into some Windows controls, at a level above that.

If we were to add it to JUCE, it'd really need to be done as a gesture-detection thing, which would require a mouse-cancel event that we could send to the component you were clicking on at the moment the touch turns into a right-click. It's basically the same problem as detecting swipes and other gestures, and we don't quite have the underlying architecture for it in place yet.

Ok, too bad, that leaves no good option really...

If I simulate this in my program, I also need to figure out how to check whether it is enabled in Windows (it can be turned off in the settings panel), if I can access that information at all from within JUCE...

The number of laptops with touch screens is growing, and right-click is by no means an esoteric feature! This issue means one cannot design a Windows desktop program that uses right-click, or, if it does, it will need to come with a warning that right-click will not work on any of the many touch-screen laptops out there.

Anyway, I'll see what I do for my program, but I wanted to point out the above: being able to right-click on one of the many touch-screen laptops out there is not a fringe use-case...


Yes, thanks for raising the issue.

Not sure exactly what the best plan would be. Like I said, I guess that native Windows apps probably handle it via some kind of logic in the C# classes, so we'd have to emulate the behaviour at some level, either in JUCE or in your app.

my 2 cents


Hi, yes, I know, but that's on the end-user side... One shouldn't really ask one's users to disable a feature in Windows to use one's software!

If anything, press-and-hold is quite a useful feature, I'd rather have it on :)

My point was rather the opposite.

It probably should be enabled for it to work.

I have revisited this issue, since I want to use the long-press gesture to get a right-click pop-up menu in my software.

(for cut/copy/paste etc…)

Otristan pointed out a setting on Windows to enable right-click from long press, but that one is enabled in Windows by default, and disabling/re-enabling it makes no difference to how JUCE behaves.

I revive this thread because I also saw another post here about something very similar being requested also for Android: How to reach TextEditor context menus in Android? Long press or right click not working

And a post collecting issues about Juce and touch screens: Some observations from using Juce on touch screens

…Which this also very much is.

As far as I understand, right-click emulation via long press with touch/pen is a low-level feature, allowing non-touch-aware applications to function properly. The problems arise when your application is made with touch support, and you have to respond properly to all possible messages in all possible scenarios where you may have a touchscreen, or a pen, or a mouse, or any combination of them.

The latest version of the Chrome browser is not able to respond to emulated right-clicks via long press with the pen on my tablet, but it works properly when long-pressing with a finger. In previous versions of JUCE, right-click emulation worked properly with the pen and didn't work with touch. The current version of JUCE works with neither touch nor pen. Notepad++ works with both.

I didn’t check the sources, but maybe Notepad++ is built without touch support. This leaves all the emulation to the driver level. Is there a way to turn off touch/pen support in JUCE on Windows? We could test the differences in our applications and decide whether to build them with touch support or not…

Another topic relevant to this one:

@Jules @splisp

I realized I could do an easy test to see how legacy windows software deals with reacting to mouse events from touch.

I downloaded Notepad++ version 1 from 2003 and ran on Windows 10 with the Wacom Cintiq.

I also installed Acdsee 2.42 from 1997, http://www.oldversion.com/windows/download/acdsee-2-42

Right click works fine in both, both with pen and finger long press!

So it must be that these events are generated by the low-level Windows API in Windows 10, and that there’s no C# involved anywhere whatsoever; in 1997, C# had not even been conceived of, as far as I remember :slight_smile:

Sure, they’re not multitouch enabled, but does the API disallow receiving plain-old right-click events if the software is multi-touch enabled? I imagine not…

If your code handles touch and pen events, then those events take precedence over the emulated mouse messages from the Windows system, so your application has to re-implement behaviours like right-click emulation and the like, and do it in a fashion that is consistent with the behaviour of other applications.

As long as your application needs no explicit pen/touch behaviour, it’s better not to reply to those messages, and to act as if you don’t know there is a pen or a touch screen. The system will emulate all the mouse messages for you.
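When you do let the system emulate mouse messages, there is a documented Windows convention for telling that a given mouse message was synthesised from pen or touch: while handling the message, GetMessageExtraInfo() returns a value whose upper 24 bits equal the signature 0xFF515700 (named MI_WP_SIGNATURE in Microsoft's docs), with one low bit distinguishing touch from pen. The helpers below sketch that bit test without windows.h so they stay self-contained; in a real message loop you would pass in the value GetMessageExtraInfo() returned.

```cpp
#include <cstdint>

// Documented Microsoft convention: mouse messages synthesised from pen or
// touch carry an "extra info" value whose top 24 bits are this signature.
constexpr uint32_t MI_WP_SIGNATURE = 0xFF515700u;
constexpr uint32_t SIGNATURE_MASK  = 0xFFFFFF00u;
constexpr uint32_t TOUCH_FLAG      = 0x80u;  // set for touch, clear for pen

// True if this mouse message was emulated from a pen or touch event.
constexpr bool isFromPenOrTouch (uint32_t extraInfo)
{
    return (extraInfo & SIGNATURE_MASK) == MI_WP_SIGNATURE;
}

// True if it came specifically from touch (only meaningful when the
// signature check above passes).
constexpr bool isFromTouch (uint32_t extraInfo)
{
    return isFromPenOrTouch (extraInfo) && (extraInfo & TOUCH_FLAG) != 0;
}
```

A framework could use this test inside its WM_RBUTTONDOWN handling to know that a right-button event is really a long-press emulation, and treat it differently if it wants to.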

From the point of view of a GUI framework like JUCE, the best thing is to let an application decide whether it wants to implement pen/touch-specific functionality or not, by way of some compile-time flag that excludes the support for it. Then, if an application has requested pen/touch handling, it must implement right-click emulation from long press by itself - if appropriate, because a touch-aware app might use long press for some specific behaviour instead, who knows! Maybe the longer you keep pressing, the darker the item you’re touching becomes!

Still, even with pen/touch support active, plain widgets like menus and buttons should work as expected without the application developer needing to fix them. Here, it would help if the developers of the framework constantly tested the widgets on a pen device :wink:

Thank you for the explanation, it makes sense, as you see I hadn’t given it the necessary thought :slight_smile:

But I also see your point about the default widgets in a library being good targets for such default behaviours, even when multi-touch enabled, as in JUCE’s case. A user can always override them if they want behaviours such as the ones you describe!

I’d like to elaborate some more: the default widgets are in charge of catching pen/touch events, when the support is enabled, and of applying emulation of standard mouse events. JUCE has no native widgets, so this is the way to go. For a standard text field on a Windows tablet, even with pen support enabled, the expected behaviour is still to show a popup context menu with cut/copy/paste/select-all commands when long-pressing on it. If your framework has no native widgets, the framework widgets have to implement this functionality from scratch: start a timer when the click begins, and show the context menu when the timer expires without the pen having been lifted. Again: no native widgets means a different look-and-feel, a different timing for the popup context menu, and a different way to fill up the long-press time (the standard way is to draw a rectangle or a circle; JUCE will implement its own style as part of its look-and-feel).
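The timer-driven flow just described (arm on press, pop the menu when the timer expires with the pointer still down, cancel on lift or drag-away) can be modelled as a tiny state machine. This is a framework-neutral sketch under my own naming, not JUCE API; a JUCE widget would drive it from mouseDown/mouseUp/mouseDrag plus a juce::Timer callback.

```cpp
// Sketch of the press-and-hold flow a framework widget would implement:
// on press, arm the gesture; if the long-press timer fires while still
// pressed, show the menu; lifting or dragging away first cancels it.
class LongPressMenuTrigger
{
public:
    enum class State { idle, armed, menuShown };

    // Pointer went down: arm the gesture and (in a real widget) start
    // the long-press timer.
    void pressStarted()    { state = State::armed; }

    // The long-press timer expired. Returns true exactly once, at the
    // moment the context menu should be popped up.
    bool timerExpired()
    {
        if (state != State::armed)
            return false;
        state = State::menuShown;
        return true;
    }

    // Lifting or dragging away before the timer fires cancels the gesture.
    void pressCancelled()  { if (state == State::armed) state = State::idle; }

    // Pointer went up after the menu was shown (or otherwise): reset.
    void pressEnded()      { state = State::idle; }

    State currentState() const { return state; }

private:
    State state = State::idle;
};
```

The same object can also drive the visual feedback during the wait (the rectangle or circle mentioned above), by drawing it whenever the state is `armed`.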

I see no easy workaround for this. When pointer support is enabled in your application, you can’t expect to receive full mouse-emulation events. The basic widgets provided with JUCE have to implement this functionality (hopefully with well-tested code, and not something rolled out after quick testing on a borrowed Surface). User Components have to do the same: depending on your Component, you have to decide what to make of touch and pen messages, and expect no emulation of mouse events.

Hope this helps! :slight_smile: :slight_smile: :slight_smile: