BR: Windows touch and gestures don't seem to work

I’m interested in getting pinch-to-zoom support working in my apps on Windows.

Using my Thinkpad P50, I know for a fact that the trackpad sends a pinch-to-zoom event, because it’s detected as such in (most?) web browsers, and even MSVC picks it up to change the zoom level. So I’m assuming this means I have an input source that’s either touch- or gesture-compatible (sorry, still learning the difference and I’m not sure which is which!).

The issue is this: when debugging my apps, I don’t receive either type of Win32 event (WM_TOUCH or WM_GESTURE). I do see that touch input is registered (canUseMultiTouch is true and registerTouchWindow returns true). And according to MSDN, a window handle can be registered for touch or not: by default, gestures are enabled, and when touch is enabled (via registerTouchWindow) gestures are disabled. So I tried not calling it at all, to see if I’d get gestures instead, and… nope!

As I’m grasping at straws troubleshooting this, I tried sticking a call to GetMessageExtraInfo into the mouse/pointer-down handlers, and it always comes back as 0.

Anybody have any ideas here?