I’ve just added a Component::mouseMagnify method to handle pinch-to-zoom gestures… It works on the mac, and I’ve written a win32 implementation too, but I don’t have any kind of input device for Windows with which I can test my code… Anyone able to check it out, and maybe trace into the WM_GESTURE message to see what’s up if it fails?
Looks interesting. I’ve been using my own `Pinch` class to do this. I don’t have a Windows gesture device to test with, but I’ll give it a go on mobiles when I get the chance.
It won’t do anything on mobiles at the moment - it just responds to the gesture events sent by the OS, i.e. touchpad gestures (and maybe touch-screen pinching on Win8 if such a thing exists)
I don’t know if this is the best place for this post but here goes.
If I touch a file in FileBrowserComponent ‘B’ that is covering part of component ‘A’ and start to drag, sometimes mouseDown, mouseDrag, and mouseUp events are sent to component ‘A’.
The FileBrowserComponent has been made a dragAndDrop source.
This only happens when the FileBrowserComponent is covering the ‘A’ component. It does not matter if ‘A’ is a target.
This doesn’t happen all the time, but most of the time. I haven’t been able to find why, but I’ll keep looking.
BTW, is there a direct way to make FileBrowserComponent a dragAndDrop source? I had to alter the juce source to do this. Did I miss something?
On Windows 8 Touch Screen.
Yeah, I’ve also been finding interesting ways to cause havoc with Windows8 touches. Will be digging into that more deeply very soon, as well as adding some new gesture-recognition classes.
Probably not, it’s not something I did. Might be worth adding though.
Component::mouseMagnify is not being called on Windows 8 touch screen. Also it should have a "bool isHorizontal" parameter.
Also why is there not a mouseSwipe, mouseTwoFingerTap, etc.?
These are very useful features with touch screens.
Mapping touch events directly to mouseEvents is not sufficient. Mapping to mouseEvents can be the default but not the solution.
PLEASE THINK THIS OUT, because it is the future whether we like it or not.
I have updated to a recent juce version and was hoping this was working. I implemented most of the gestures in a previous juce version, but now have to go back and implement it again.
What do I have to do to get an answer? I watched four posts come after this post that were answered.
Jules, if this is too difficult to implement, or there is not enough time, I will do it, again. Many months ago you said that you were implementing the touch gestures through listeners, when I suggested doing it through virtual functions. The touch listeners have not been implemented. I have waited and can't wait anymore. I hope to ship a new product by the end of the year and I need the main touch gestures.
I understand if there is not enough time or it is too difficult, but please give me an answer so I can do what I have to do. I realize the difficulty in coding so it works with both a mouse and touch.
So if I implement this again, can it be put into the source code, so I don't have to re-enter the changes each time there is a juce update? I will make the changes as close to your source code as possible.
Searching the forum reveals plenty of people want touch implemented in juce. It would be a good complement, especially when creating a new interface.
Sorry, but if you really want my attention, then aggressive posts that tell me what to do in capital letters will have the opposite effect.
Gestures are important. No shit! I need to implement this stuff myself for other projects that I'm working on, too. But my priorities and deadlines may not align with yours, and like any other feature, I'll do it when I can. If you can't wait, then do whatever works for your app.
PLEASE THINK THIS OUT was not telling you. It was begging you. I needed an answer because I am under a deadline. As I stated before, I waited for months for this to be implemented. If you're too busy, I understand (we all are), but months back you said you were working on this and were going to implement it using listeners, so I stopped implementing touch based on your statement. I still think virtual methods are a cleaner approach for the end user. Just like mouseDown, there could be a touchDown, touchSwipe, touchMagnify, etc. based on touch events, not mouse events.
I'll give you a reason for not mapping touch events to mouseEvents. Say you had a component where you could drag a selection area over objects to select them. So you click and hold the mouse down while dragging. Now if gestures are mapped to mouse events and you tried a swipe, you would be doing a quick select and not a swipe. They need to be separated. Yes, it is a pain, and I am struggling to get it working in a way that works for mice and touches, but I think it needs to be separated this way. iPads and iPhones don't need to resolve the conflict between a mouse and a touch, but touch-screen computers do.
Hopefully I can get it working in a way that benefits everyone. I can't decide whether to use WM_TOUCH events or WM_GESTURE events. Both have their benefits. And also, my program is cross-platform, so touch needs to work the same on both Windows and Mac. What a pain to implement! That is why I was hoping you would implement it.
Not sure whether you know this, but WM_GESTURE doesn't actually work if you have multi-touch input enabled - the OS will either send your app touches, or gestures, but not both. That's why my plan is to do a platform-independent version. And you know you can find out whether a mouse event comes from a finger or a mouse by looking at the MouseInputSource? So if you want only mice to do a lasso-selection and fingers to scroll, that'd be easy to do.
Yes, I know that you have to choose WM_TOUCH or WM_GESTURE. I have already trapped mouseDown events when trying to use WM_TOUCH. I don't know why Windows sends mouse events when touch is enabled.
My problem is I am greedy, I want both swipe and lasso with touch. If the user touches the screen and drags over an area I want the lasso effect. If it is a quick swipe, then I want swipe. I have seen apps do both.
After working with both methods I find WM_GESTURE easier, but I agree that for juce and platform independence WM_TOUCH would be better. The problem is converting touches into gestures. I have WM_GESTURE working for everything except implementing lasso and swipe together. I have distinguished them by the time between the first touch and the first actual gesture call, but I don't have it working all the way. There are still some delays and position problems. I'll let you know when I have a working solution.
Does anyone know if the mouseMagnify callback works on windows 8? The callback is not getting called for me when pinching...
I have a windows 8 laptop w/ touchscreen. Gestures are enabled...
Just wondering if there are any plans to add pinch and zoom functionality for mobile soon?
On mobile it’s a bit different, as it’s not an OS callback like for touchpads. You can implement it pretty easily yourself, and TBH most situations on mobile would call for a custom implementation anyway, as you’d probably want to be able to grab and drag as well as zoom, and its behaviour will depend on what your app is doing.
Ok. But are you able to point me in the right direction to getting started? Like, for example, should I assume that JUCE will allow me to handle two concurrent mouseDrag events? (one for thumb, the other for finger)
Yes, of course, that’s how multitouch works.