Enabling multitouch in input-evdev

Peter Hutterer peter.hutterer at who-t.net
Tue Jan 12 03:03:52 PST 2010


First of all, thanks for your efforts here. I really appreciate that you
pushed it that far and I'm sorry about the delay in my comments.

On Fri, Jan 08, 2010 at 12:55:21PM +0100, Benjamin Tissoires wrote:
> First, I tried to dynamically create/destroy such subdevices.
> However, this very often causes freezes in the X server due to some
> mallocs. So the idea to solve this problem is to statically allocate
> the subdevices.

Do you have any logs or other data on where these freezes occurred? They'd
be useful for others as well, since chances are someone is going to run into
this deadlock.

> This leaves the problem of signalling the start/end events of a
> track. I arbitrarily chose to implement it in the "Evdev Tracking
> ID" property: when the ID is -1, there is no touch event, and when
> it's > 0, there is an active track.
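
(For reference, a client could read that property along these lines. This is
only a sketch: the property name is taken from the description above, and
the 32-bit integer format is an assumption.)

/* Sketch: poll the proposed "Evdev Tracking ID" property, using the
 * -1 / > 0 convention described above. */
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/XInput.h>

long read_tracking_id(Display *dpy, XDevice *dev)
{
    Atom prop = XInternAtom(dpy, "Evdev Tracking ID", True);
    Atom type;
    int format;
    unsigned long nitems, bytes_after;
    unsigned char *data = NULL;
    long id = -1;

    if (prop != None &&
        XGetDeviceProperty(dpy, dev, prop, 0, 1, False, XA_INTEGER,
                           &type, &format, &nitems, &bytes_after,
                           &data) == Success &&
        format == 32 && nitems >= 1)
        id = *(long *)data;     /* -1: no touch; > 0: active track */

    if (data)
        XFree(data);
    return id;
}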
> 
> That leaves us with two problems: how do we handle gestures, and
> should we create master devices on the server side? The commonly
> accepted solution is to put it all in the client (or toolkit) part.
> That's why, for the demo, I wrote a small tool that handles the
> master devices, their grabbing, and their attachments. I did not
> have the time to implement gestures, but I think the subdevices give
> us enough information to do the job.


> Finally, let's talk about the internal detection and behavior of
> multitouch. The multitouch-enabled kernel evdev interface (since
> 2.6.30) has some more events (SYN_MT_REPORT, ABS_MT_TOUCH_MAJOR,
> ABS_MT_TOUCH_MINOR, ABS_MT_WIDTH_MAJOR, ABS_MT_WIDTH_MINOR,
> ABS_MT_ORIENTATION, ABS_MT_POSITION_X, ABS_MT_POSITION_Y,
> ABS_MT_TOOL_TYPE, ABS_MT_BLOB_ID and ABS_MT_TRACKING_ID). The only
> event necessarily given by a multitouch device (correct me if I'm
> wrong) is SYN_MT_REPORT. So I activate multitouch only when I
> receive this event.

Do you activate it when you receive the event, or based on the device's
capability bits? Checking for the bit at init time seems the better
approach. (I haven't looked at your patch yet.)
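
For illustration, checking the capability bits at device-init time could
look like this (a minimal sketch; the device node is an example and the
bit-test helpers are local to the sketch):

/* Sketch: detect multitouch from the evdev capability bits instead
 * of waiting for the first SYN_MT_REPORT to arrive. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/input.h>

#define BITS_PER_LONG   (sizeof(long) * 8)
#define NLONGS(x)       (((x) + BITS_PER_LONG - 1) / BITS_PER_LONG)
#define TEST_BIT(bit, array) \
    ((array[(bit) / BITS_PER_LONG] >> ((bit) % BITS_PER_LONG)) & 1)

int main(void)
{
    unsigned long absbits[NLONGS(ABS_MAX + 1)];
    int fd = open("/dev/input/event0", O_RDONLY);   /* example node */

    if (fd < 0)
        return 1;

    memset(absbits, 0, sizeof(absbits));
    if (ioctl(fd, EVIOCGBIT(EV_ABS, sizeof(absbits)), absbits) < 0) {
        close(fd);
        return 1;
    }

    if (TEST_BIT(ABS_MT_POSITION_X, absbits))
        printf("MT axes advertised: enable multitouch handling\n");
    else
        printf("no MT axes: plain pointer/touchscreen device\n");

    close(fd);
    return 0;
}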

> What I mean is the problem of detecting valuators on the Xorg side.
> From my understanding, valuators 0 and 1 are necessarily x and y.
> But the evdev driver bases its detection of axes on the order given
> by the kernel evdev interface. If a device does not deliver ABS_X
> and ABS_Y but only MT events, the first two axes will be
> ABS_MT_TOUCH_MAJOR and ABS_MT_TOUCH_MINOR (the dimensions of the
> touch), which is incompatible with Xorg. So I asked Stéphane Chatty
> to keep ABS_X and ABS_Y in the Stantum driver, and it seems this has
> been accepted by the other kernel developers.

The evdev driver is first and foremost a generic mouse driver, which means
that devices providing x and y are the most important ones to support,
regardless of whether those axes are relative or absolute. The need for x/y
is also implied by the core protocol. However, there's no reason you
couldn't fake up x/y and simply leave them as mute axes. IMO the kernel
should not report axes that don't exist.
The axis mapping in evdev is a convention, not a rule, so you could just
reshuffle the axis checking to cater for the multitouch situation. That goes
for the rest of the driver too: if something gets in the way of a new
situation, the driver can be rewritten to suit it.
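
For illustration, faking up x/y from the first contact of each frame might
look like the sketch below. It is written against the protocol-A (pre-slots)
event stream described earlier; the helper and its state handling are
hypothetical, not actual driver code.

/* Sketch: mirror the first contact's ABS_MT_POSITION_* into a faked
 * x/y pair for devices that report only MT axes. */
#include <stdio.h>
#include <linux/input.h>

static int cur_x, cur_y, contact_idx;

void handle_event(const struct input_event *ev)
{
    switch (ev->type) {
    case EV_ABS:
        if (contact_idx == 0) {         /* only the first contact */
            if (ev->code == ABS_MT_POSITION_X)
                cur_x = ev->value;
            else if (ev->code == ABS_MT_POSITION_Y)
                cur_y = ev->value;
        }
        break;
    case EV_SYN:
        if (ev->code == SYN_MT_REPORT) {
            contact_idx++;              /* next contact in frame */
        } else if (ev->code == SYN_REPORT) {
            contact_idx = 0;            /* frame complete */
            printf("faked x/y: %d/%d\n", cur_x, cur_y);
        }
        break;
    }
}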

> So now, with multitouch-enabled devices, we have both the
> touchscreen emulation (ABS_X, ABS_Y) and the multitouch part. This
> allows us (on the Xorg side) to choose between these two modes. By
> default, touchscreen emulation is enabled, i.e. the driver does not
> do anything more than before. If we (the client side) want to use
> multitouch, we can set the property "Evdev Multitouch" to the number
> of touches we want recognized. I limited it to 5 (MAX_VALUATORS_MT)
> in the patch, as the creation of master devices and subdevices
> otherwise produces too many devices (the server is limited to 40 in
> the current trunk). We can still revert to touchscreen emulation by
> setting "Evdev Multitouch" to 0.
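
(For reference, a client would flip that mode roughly as follows. A sketch
only: the property name and the 0/n semantics follow the description above,
while the 8-bit integer format and the device handling are assumptions.)

/* Sketch: set "Evdev Multitouch" to the number of desired touches;
 * 0 reverts to touchscreen emulation. */
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/XInput.h>

int set_multitouch(Display *dpy, XID device_id, unsigned char ntouches)
{
    Atom prop = XInternAtom(dpy, "Evdev Multitouch", True);
    XDevice *dev = XOpenDevice(dpy, device_id);

    if (prop == None || !dev)
        return -1;

    XChangeDeviceProperty(dpy, dev, prop, XA_INTEGER, 8,
                          PropModeReplace, &ntouches, 1);
    XCloseDevice(dpy, dev);
    XFlush(dpy);
    return 0;
}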
> 
> 
> I think that summarizes my patch; now let's move on to the
> discussions.
> 
> Here I quote Bradley T. Hughes:
> 
> (...)
> 
> Multiple cursors are definitely nice feedback for the user, but from
> an application and toolkit perspective, it's much nicer to receive
> all information in one event (which the Linux kernel does as well).
> I wonder if there is a way to get both. Applications and gesture
> recognizers are much easier to write if the event contains
> information about all touch points (instead of every app having to
> write a state machine to track them). It's more CPU and power
> efficient too. The number of (potential) process context switches
> needed to wake up an application for each touch point, vs. a single
> context switch (from Xorg to the app) for any kind of multi-touch
> event, is a strong argument here at Nokia :) Fewer context switches
> mean better battery life and lower feedback latency for the user.
> 
> For what it's worth, gesture recognition can be done in many
> different ways, and we have added support for gestures in Qt as well
> (built upon our QTouchEvent, which can deliver multiple touch points
> in each event).
> 
> I don't want to sound like I'm saying that "my way" is the Right
> Way. I just want to make the argument based on the experiences we
> had while adding multi-touch support to Qt. I hope you can
> understand them and are willing to investigate the possibility. I am
> willing to help as well... I have a Dell Latitude XT laptop with an
> N-Trig screen and can help out where needed. I've never done X
> driver development, though, so it may take me a little time to get
> up to speed.
> 
> (...)
> 
> I don't suppose you have considered trying to find a way of adding
> all the touch point information into a single event using multiple
> axes (this is certainly more efficient and easier to handle on the
> Qt side of things). Windows 7 and Mac OS X do this, as well as Qt.
> 
> (...)
> 
> <end of quotation>
> 
> So, first question: is my approach the right one? (It is not
> consistent with what Windows or Mac OS X do.)

Short answer - no. Long answer - sort-of.

Multitouch in X is currently limited by the lack of multitouch events in the
protocol. What you put into evdev is a way around that, providing
multitouch-like features through a multipointer system. As Bradley said, it
is likely better for the client side if everything is delivered in a single
event. Since X essentially exists to make GUI applications easier (this may
come as a surprise to many), I'd go with his stance.
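
Purely to illustrate what "everything in a single event" could mean, a
multitouch event might carry every active contact at once. The layout below
is entirely hypothetical and not part of any existing protocol:

#include <stdint.h>

#define MT_MAX_CONTACTS 5           /* mirrors MAX_VALUATORS_MT above */

struct mt_contact {
    int32_t tracking_id;            /* -1 when the entry is unused */
    int32_t x, y;                   /* ABS_MT_POSITION_* */
    int32_t touch_major;            /* contact size, if reported */
};

struct mt_event {
    uint32_t time;                  /* server timestamp */
    uint16_t deviceid;
    uint16_t ncontacts;             /* valid entries below */
    struct mt_contact contacts[MT_MAX_CONTACTS];
};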

However, this is the harder bit and would require changing the driver, parts
of the X server's input system, the protocol and the libraries. It'd be
about as wide-reaching as MPX, though I hope there is significantly less
rework needed in the input subsystem now.

Strictly speaking, your implementation is a converter from multitouch events
to multipointer events, hence also the need for the client daemon.

> To begin the answer, I can list the pros and cons of my way:
> Pros:
> * MPX-related, so anyone can simulate gestures with two mice...
> * More fun, as we can easily have one cursor per touch (we can do
> that with the other solution, but only by sending XTest events)
> * Easier to develop, as I already made the patch ;-)
> Cons:
> * More complicated for the toolkits, as they will behave differently
> across the different systems

Not really. As I said above, from the client side your patch implements
multipointer, and a client cannot distinguish whether a performed gesture
comes from a single multitouch device, from two touch devices, or from two
mice. Hence the same rules as for multi-pointer apply.
(AFAICT, neither Windows nor OS X has multipointer support, so this will be
system-specific anyway.)

> * More costly, as the client side has to reassemble the pieces to
> detect/handle gestures


> The second problem concerns trackpads:
> 
> How can we handle modern multitouch trackpads (Broadcom 5974,
> DiamondTouch)? We excluded synaptics trackpads, as they don't send
> ABS_MT events but special tool events (key codes 325, 330, 333 and
> 334, i.e. BTN_TOOL_FINGER, BTN_TOUCH, BTN_TOOL_DOUBLETAP and
> BTN_TOOL_TRIPLETAP).
> 
> From Stéphane's point of view, they should be handled transparently
> by my patched version of input-evdev, as they deliver much the same
> events as multitouch screens.
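
For illustration, telling the two reporting styles apart from the capability
bitmasks (read with EVIOCGBIT, as in the detection sketch earlier) might
look like this; the enum and the classification are illustrative only:

#include <linux/input.h>

#define BITS_PER_LONG   (sizeof(long) * 8)
#define TEST_BIT(bit, array) \
    ((array[(bit) / BITS_PER_LONG] >> ((bit) % BITS_PER_LONG)) & 1)

enum touch_style { TOUCH_NONE, TOUCH_MT_EVENTS, TOUCH_TOOL_COUNTING };

enum touch_style classify(const unsigned long *absbits,
                          const unsigned long *keybits)
{
    if (TEST_BIT(ABS_MT_POSITION_X, absbits))
        return TOUCH_MT_EVENTS;     /* bcm5974, Stantum, N-Trig, ... */
    if (TEST_BIT(BTN_TOOL_DOUBLETAP, keybits))
        return TOUCH_TOOL_COUNTING; /* synaptics-style finger count */
    return TOUCH_NONE;
}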
> 
> Finally, I was wondering what your position is concerning the use
> of XI properties to notify the client of the start/end of touch
> events.

Property events are neither designed nor particularly well suited for this.
Their data delivery is partially out-of-band with the input events, and
frequent property changes increase the load on both the server and the
client significantly. As with the multitouch data, this should ideally be
included in a new type of event.

Cheers,
  Peter

