In my application, when a certain window receives a WM_RBUTTONDOWN message (handled via ON_WM_RBUTTONDOWN()), I create a CMenu, populate it with some items, and display it with TrackPopupMenu(). The menu needs no other interaction with Windows messages to be created. It defaults to accepting left clicks to select items, and I can see those messages coming in when I use the mouse.
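For reference, the creation code is roughly this shape (a minimal sketch; the class name, item IDs, and strings are placeholders, not my real ones):

// Sketch of the existing flow, assuming a CView-derived class.
// ID_DO_THING / ID_DO_OTHER are placeholder command IDs.
void CMyView::OnRButtonDown(UINT nFlags, CPoint point)
{
    CMenu menu;
    menu.CreatePopupMenu();
    menu.AppendMenu(MF_STRING, ID_DO_THING, _T("Do thing"));
    menu.AppendMenu(MF_SEPARATOR);
    menu.AppendMenu(MF_STRING, ID_DO_OTHER, _T("Do other thing"));

    ClientToScreen(&point);  // TrackPopupMenu wants screen coordinates
    menu.TrackPopupMenu(TPM_LEFTALIGN | TPM_LEFTBUTTON,
                        point.x, point.y, this);

    CView::OnRButtonDown(nFlags, point);
}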
I'm trying to allow use of this context menu on a touch screen. The parent window can receive WM_GESTURENOTIFY messages (for other functionality), and everywhere else in my app, such as other windows and dialogs, touch gestures work fine: Spy++ shows the gesture messages and a WM_LBUTTONDOWN, which gives normal behaviour across the app. I CAN touch-select menu items when this menu is opened with a physical mouse right click, with the touch input coming through as a WM_LBUTTONDOWN.
I've tried creating and displaying this menu by calling that same creation code again from a touch message, or just sending the window a WM_RBUTTONDOWN message manually after a touch, with the same flags. The menu is created fine and works as normal with a mouse, and as far as the app is concerned nothing is different. However, the CMenu is not receiving any touch messages at all: I get the touch-style cursor showing where I'm tapping, but nothing is being piped through to the menu.
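By "sending the message manually" I mean something like the following sketch, assuming ptTouch is the tap location in client coordinates (which is what WM_RBUTTONDOWN expects in its lParam):

// Sketch: manufacture the right click that normally opens the menu.
// OpenMenuFromTouch is a made-up helper name, not a real MFC method.
void CMyView::OpenMenuFromTouch(CPoint ptTouch)
{
    SendMessage(WM_RBUTTONDOWN, MK_RBUTTON,
                MAKELPARAM(ptTouch.x, ptTouch.y));
}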
I've also tried switching from gestures to registering for raw touch (RegisterTouchWindow) while this interaction happens, and ensuring the original gesture info handle is closed with CloseGestureInfoHandle in case it was blocking input for whatever reason.
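That attempt looked roughly like this (a sketch; the handler names are made up, and WM_GESTURE is handled via a raw ON_MESSAGE entry in the message map):

// Sketch only: switch the window to raw touch input. After this call the
// window receives WM_TOUCH instead of gesture messages.
void CMyView::SwitchToRawTouch()
{
    ::RegisterTouchWindow(GetSafeHwnd(), 0);
}

// Raw WM_GESTURE handler, wired with ON_MESSAGE(WM_GESTURE, &CMyView::OnGestureMsg).
// The lParam is an HGESTUREINFO; closing it ensures a stale handle
// isn't left around blocking further input.
LRESULT CMyView::OnGestureMsg(WPARAM /*wParam*/, LPARAM lParam)
{
    ::CloseGestureInfoHandle(reinterpret_cast<HGESTUREINFO>(lParam));
    return 0;
}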
My assumption is that Windows does something additional behind the scenes, beyond what my app can see, to route these messages through to a menu, so I'm a bit stuck for a solution.
I was able to get around this issue by enabling the tablet press-and-hold gesture (it's normally disabled). Windows treats press-and-hold as a right click, so the context menu opens through the normal system path and is properly interactable, rather than my having to send the right-click message myself. This works on a desktop with a touch screen and on a Windows tablet.
Adding
ULONG CMyView::GetGestureStatus(CPoint /*ptTouch*/) { return 0; }
was what enabled this to work.
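For anyone hitting the same wall: CWnd::GetGestureStatus is MFC's hook for WM_TABLET_QUERYSYSTEMGESTURESTATUS, and its default implementation returns TABLET_DISABLE_PRESSANDHOLD, which is why press-and-hold never turned into a right click in my window. Returning 0 re-enables it. A minimal sketch of where the override lives (class name assumed from the snippet above):

// Any CWnd-derived class works; CMyView is an assumption.
class CMyView : public CView
{
    // ... message map, handlers, etc. ...
protected:
    // MFC's default returns TABLET_DISABLE_PRESSANDHOLD; returning 0 lets
    // Windows translate a touch press-and-hold into a right click, which
    // opens the popup menu through the same path as a physical mouse click.
    virtual ULONG GetGestureStatus(CPoint /*ptTouch*/) override { return 0; }
};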