How does mouse event synthesis work on touchscreen?
Peter Hutterer
peter.hutterer at who-t.net
Fri Nov 20 01:48:23 UTC 2020
On Thu, Nov 19, 2020 at 05:11:36AM +0100, Grósz Dániel wrote:
> I'm not sure if this is the right place to ask, or where to look this up.
> Please give me a rundown (or links) on how mouse event emulation works on
> touchscreens.
In general:
- the kernel gives you x/y coordinates for each touch, together with
single-touch x/y coordinates. There is no left/right button; you get BTN_TOUCH
and sometimes pressure/size. Where devices support pressure/size,
BTN_TOUCH may just be a hardcoded 'if (pressure > value) send BTN_TOUCH'.
This is the case for both touchscreens and touchpads; they look
effectively identical except for a single bit that tells us something is a
touchscreen (INPUT_PROP_DIRECT) - see the sketch below.
- for touchscreens, libinput takes those events, does barely anything with them
and forwards them on to the next level
- the xf86-input-libinput driver takes those libinput touch events and
passes them to the X server (xf86PostTouchEvent)
- the X server does pointer emulation for legacy clients for the "first"
touch. Touch events are processed as touch events in most of the server,
but where a touch is pointer-emulating *and* the receiving client doesn't
know about touch events, the server will emulate the matching pointer
event instead.
You can follow the rabbit hole by starting in ProcessTouchEvent; that's
the first function to be called for handling touch events during the
server's main processing loop (as opposed to *collecting* touch events in
its input thread, see GetTouchEvent).
The actual decision on whether something is a touch or a pointer event is
made in DeliverTouchEvent.
What's visible on the protocol and how the server behaves are detailed in
the XI2proto.txt specification, btw.
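As an illustration of that INPUT_PROP_DIRECT bit: a minimal standalone
sketch against plain evdev (nothing to do with server or libinput
internals, and the default device path is just an example) that queries
the property bits via the EVIOCGPROP ioctl:

/* sketch: is this evdev node a direct-touch device (touchscreen)
 * or an indirect one (touchpad)? compile with: cc -o prop prop.c */
#include <fcntl.h>
#include <linux/input.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    const char *path = argc > 1 ? argv[1] : "/dev/input/event0";
    unsigned char props[(INPUT_PROP_MAX + 7) / 8];
    int fd = open(path, O_RDONLY);

    if (fd < 0) {
        perror("open");
        return 1;
    }

    memset(props, 0, sizeof(props));
    if (ioctl(fd, EVIOCGPROP(sizeof(props)), props) < 0) {
        perror("EVIOCGPROP");
        close(fd);
        return 1;
    }

    /* INPUT_PROP_DIRECT set -> touchscreen, unset -> e.g. touchpad */
    if (props[INPUT_PROP_DIRECT / 8] & (1 << (INPUT_PROP_DIRECT % 8)))
        printf("%s: direct touch device (touchscreen)\n", path);
    else
        printf("%s: indirect touch device (e.g. touchpad)\n", path);

    close(fd);
    return 0;
}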
>
> Motivation: I'd like to implement mouse event emulation for touchscreens, such
> as
> - long tap for right click
> - two-finger tap for middle click
> - two-finger drag for scrolling
> - tap-and-drag for dragging
> - perhaps some gesture for relative rather than absolute pointer movement
> (like a touchpad) for when more precise cursor movements are desired.
A side note here: if a touchscreen isn't precise enough to interact with a
specific UI, that's something you can really only fix by adjusting the UI.
Things like relative pointer movement on a touchscreen are more effort than
they're worth.
> I could imagine several ways it could work:
> - X.org always sends only touch events to applications, and it's up to the
> widget toolkit (such as Qt) to synthesize mouse events if the application
> doesn't explicitly handle touch. (This doesn't seem to be the case, as even
> ancient apps react to touch input. However, on Qt 5, QMouseEvent::source()
> returns Qt::MouseEventSynthesizedByQt on mouse events corresponding to
> touchscreen touches.)
This is true but *only* if the application announces touch event support by
registering for XI2.2 touch events. In that case it will get the touch event
but not the pointer event and the rest is up to the toolkit. The touch that
*should* emulate the pointer has a flag set so the toolkit knows which one
to pay attention to. [note: from an X server's POV there is no difference
between toolkit and application]
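To make "registering for XI2.2 touch events" concrete, here's a rough
client-side sketch (error handling trimmed, window setup simplified;
compile with -lX11 -lXi). The XIQueryVersion call announcing 2.2 plus the
XISelectEvents touch mask are what flip the server from pointer emulation
to touch delivery for this client, and XITouchEmulatingPointer is the flag
mentioned above:

#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int opcode, evbase, errbase;
    int major = 2, minor = 2; /* announce XI2.2 support */
    unsigned char mask[XIMaskLen(XI_LASTEVENT)] = {0};
    XIEventMask evmask;
    Window win;

    if (!dpy ||
        !XQueryExtension(dpy, "XInputExtension",
                         &opcode, &evbase, &errbase) ||
        XIQueryVersion(dpy, &major, &minor) != Success || minor < 2)
        return 1;

    win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                              0, 0, 200, 200, 0, 0, 0);
    XMapWindow(dpy, win);

    /* all three touch event types must be selected together */
    XISetMask(mask, XI_TouchBegin);
    XISetMask(mask, XI_TouchUpdate);
    XISetMask(mask, XI_TouchEnd);
    evmask.deviceid = XIAllMasterDevices;
    evmask.mask_len = sizeof(mask);
    evmask.mask = mask;
    XISelectEvents(dpy, win, &evmask, 1);

    while (1) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.xcookie.type == GenericEvent &&
            ev.xcookie.extension == opcode &&
            XGetEventData(dpy, &ev.xcookie)) {
            XIDeviceEvent *de = ev.xcookie.data;
            /* the touch the server would have used for pointer
             * emulation, had we not registered for touch events */
            if (de->flags & XITouchEmulatingPointer)
                printf("pointer-emulating touch, evtype %d\n",
                       de->evtype);
            XFreeEventData(dpy, &ev.xcookie);
        }
    }
}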
> - X.org always sends both touch events and emulated mouse left button events.
> It's up to the application or toolkit to figure out if these belong to the
> same user action, and only handle one of them.
> - X.org sends a touch event. The application responds whether it handles it.
> If it doesn't, then X.org sends an emulated mouse event.
These two are not correct.
> - An application tells X.org in advance whether it is touchscreen-aware or
> not. If it is, then X.org only sends touch events, otherwise it only sends
> mouse events.
Correct, a client that registers for XI2.2 touch events will get those;
otherwise it gets some derivative of the pointer event (there are 3 of those:
XI2, XI1.x and core pointer events).
> - X.org synthesizes mouse events for some touchscreen inputs, and in that case
> it only sends them as mouse events. When there is no corresponding mouse
> event, it sends touch events. (This doesn't seem to be the case. Even
> multiple-finger touches work as left clicks in applications that don't have
> specific touchscreen support, while single taps and drags are not always
> handled the same as mouse left-button drags.)
This is not correct.
>
> My questions:
> - Which of these (if any) is correct?
See above
> - If mouse event emulation happens somewhere in X.org, where does it happen?
> In the touchscreen driver?
In the server itself.
> - Is what I want feasible at all? I'm afraid that preventing conflicts with
> synthesized left button events is only theoretically feasible (without
> modifying the widget toolkits) if the synthesis always happens in X.org.
It's theoretically possible, but somewhere between tricky and "oh dear, what
have I done". The main problem is that touch points come with certain
promises (like a "begin, update, end" sequence), so you cannot switch a
touch point to being a different interaction halfway through.
The evdev driver used to do hold-to-click emulation by discarding the touch
events in the driver and sending pointer events instead. Which is a bit
strange too, because to do this correctly you need to set up every
touchscreen to have LMR buttons. Mind you, that implementation predates the
existence of true multitouch support.
> - What's the best way to go about it? Modify the touchscreen driver (such as
> libinput or evdev)? Interpret touch events coming from the relevant device in
> /dev/input, and create fake mouse events with uinput? Capture touch events
> using an X client, and fake mouse events using XTEST?
You can't reliably capture those touch events unless you're the window
manager.
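For completeness, since you mention uinput: injecting fake button events
from userspace looks roughly like the sketch below (a throwaway virtual
device, with a made-up name, that clicks BTN_RIGHT once; needs write
access to /dev/uinput). Note that this gives you a *second* device, it
doesn't suppress the real touch events from the touchscreen, so by itself
it doesn't solve the conflict you're worried about:

/* sketch only: inject a single BTN_RIGHT click through uinput */
#include <fcntl.h>
#include <linux/uinput.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

static void emit(int fd, int type, int code, int value)
{
    struct input_event ie;

    memset(&ie, 0, sizeof(ie));
    ie.type = type;
    ie.code = code;
    ie.value = value;
    if (write(fd, &ie, sizeof(ie)) < 0)
        perror("write");
}

int main(void)
{
    struct uinput_setup usetup;
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);

    if (fd < 0)
        return 1;

    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_RIGHT);

    memset(&usetup, 0, sizeof(usetup));
    usetup.id.bustype = BUS_VIRTUAL;
    strcpy(usetup.name, "fake right-click device"); /* made-up name */
    ioctl(fd, UI_DEV_SETUP, &usetup);
    ioctl(fd, UI_DEV_CREATE);

    sleep(1); /* give userspace time to pick up the new device */

    emit(fd, EV_KEY, BTN_RIGHT, 1);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, BTN_RIGHT, 0);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    sleep(1);
    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}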
Now, I understand what you want to do, but the only thing I can really
recommend here is "fix the application" (or the toolkit). You can get this
functionality into the server with lots of effort and difficulty, but if you
go down that way there'll be a brick wall eventually; most likely that,
after months of struggling to implement this, you won't find anyone with
the time to review those patches.
Cheers,
Peter