gsoc 2013 idea - Customizable gestures

Michal Suchanek hramrach at gmail.com
Fri Apr 12 08:34:54 PDT 2013


On 12 April 2013 13:50, Alexander E. Patrakov <patrakov at gmail.com> wrote:
> 2013/4/12 Peter Hutterer <peter.hutterer at who-t.net>:
>> Hi guys,
>>
>> Unfortunately, the entry for gesture recognition in the synaptics driver
>> should not have been on the list. synaptics is the wrong place in the stack
>> to do gesture recognition. We support a minimal set of gestures and they
>> already give us more headache than benefit. Full gesture recognition in the
>> synaptics driver would be an unmaintainable nightmare. For that reason, even
>> if you could get it to work in a proof-of-concept, I would not merge the
>> result into the upstream driver.
>
> I can understand this position. However, this also poses a question:
> what counts as a gesture and what doesn't. E.g., on a clickpad, one
> can click in the bottom right part of the pad in order to get this
> recognized as a "right button click". Or, one can swipe along the
> right edge in order to scroll. Are these two examples gestures, or
> not?
>

They are synaptics-specific gestures. There is no reason why any other
absolute input device could not make such gestures available. I would
gladly turn off multitouch gestures and replace them with these more
usable synaptics gestures on my Wacom tablet.

It's true that gestures are usually understood in a relative sense, e.g.
a left-to-right two-finger swipe anywhere on the touch surface. But
absolute gestures performed on a particular part of the touch surface
are required to support devices with touch buttons (some iPad-like
tablets) and the legacy synaptics behaviour.

Thanks

Michal

