[RFC XI 2.1 - inputproto] Various fixes in response to Peter Hutterer's review
Chase Douglas
chase.douglas at canonical.com
Mon Nov 29 13:07:24 PST 2010
On 11/29/2010 02:08 AM, Peter Hutterer wrote:
> On Tue, Nov 23, 2010 at 09:27:53AM -0500, Chase Douglas wrote:
>> On 11/23/2010 01:07 AM, Peter Hutterer wrote:
>>> On Fri, Nov 19, 2010 at 01:52:39PM -0500, Chase Douglas wrote:
>>>> diff --git a/XI2proto.txt b/XI2proto.txt
>>>> index fd47643..6976683 100644
>>>> --- a/XI2proto.txt
>>>> +++ b/XI2proto.txt
>>>> @@ -197,11 +197,29 @@ stage is optional. Within this document, the term "touch sequence" is used to
>>>> describe the above chain of events. A client wishing to receive touch events
>>>> must register for all three touch events simultaneously.
>>>>
>>>> -Touch events are sent to all clients registered for them on any window in the
>>>> -window tree from the root window to the child window directly under the touch.
>>>> -
>>>> A touch event is not delivered according to the device hierarchy. All touch
>>>> -events are sent only through their originating slave devices.
>>>> +events are sent only through their originating slave devices. However,
>>>> +dependent touch devices will only emit touch events if they are attached to a
>>>> +master device. This is due to the touch delivery being dependent on the
>>>> +location of a cursor.
>>>
>>> it is still not clear why you don't want to route touch events through the
>>> master device (other than that leaving it in the SD alone is easier to
>>> implement). is there a list of pros and cons for both approaches?
>>>
>>> especially in the case of touch-based pointer emulation, when the master
>>> will send events caused by the slave device. so some events have to be
>>> routed either way.
>>
>> Direct touch devices will not send any touch events through master
>> pointing devices. If we send dependent touch events through the master
>> pointing devices, there will be a much larger difference in the event
>> handling for dependent and direct touch devices. We may see developers
>> write applications that handle MT on their laptop trackpads and not
>> realize they don't work on touchscreens.
>
> they'd realise soon enough when users complain about it. one of the biggest
> differences between dependent touch and direct touch devices is that
> dependent touch _augments_ an input method, while direct touch usually
> _replaces_ it. the latter, mainly because of the fat-finger problem, usually
> requires a redesign of the UI as a whole. so it's not necessarily a bad
> thing if MT developed for trackpads doesn't work on touchscreens. anyway,
> that's sidetracking a bit.
I'll agree that this could be a moot point given that things may need to
be handled differently between the two modes anyway. I still think it
would be counterintuitive, but this point isn't a showstopper.
>> If we send events through the master device, we have to handle DCEs
>> (DeviceChangedEvents) as well. Two separate dependent touch devices may
>> be attached to the same master device. I'm trying hard not to have to
>> deal with DCEs for MT devices :). Not only is it more surface area in
>> the protocol to implement, it presents more opportunity for
>> implementation or protocol bugs.
>>
>> Part of the purpose of master devices is to coalesce pointer motion from
>> multiple devices into one cursor on screen. The cursor on screen has the
>> same boundaries and behavior across all attached devices. There's no MT
>> analog to relative devices, so I'll leave those aside. Absolute devices
>> are transformed from device coordinates to screen coordinates. I don't
>> believe dependent touch devices should be mapped to screen coordinates;
>> if you want such behavior, make the device behave as a direct device. So
>> if dependent touch devices don't move the cursor by themselves, and they
>> have different properties such as resolution and limits, what do we gain
>> by sending them through the same master device?
>
> a few comments here:
> x/y is mapped to screen coordinates for direct devices but the
> original value is still available to clients. for dependent touch, you still
> need to provide the focus point (i.e. x/y of the cursor) in screen
> coordinates as well.
My implementation does this. For both device modes, the root and event
coordinates of the DeviceEvent are given in screen coordinates, while the
X and Y touch valuators are given in device coordinates. Direct touch
device root and event coordinates are derived from the X and Y touch
values; dependent touch device root and event coordinates are copied from
the attached master pointer position.

I believe this meets all needs; does it break anything if the event is
built this way?
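
Roughly, a client would read the two coordinate spaces like this. This is
a sketch only: it assumes touch events are delivered as XIDeviceEvent
structures via the current Xlib XI2 bindings, and that the X and Y touch
valuators land in valuators 0 and 1; the final 2.1 names may differ:

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

static void handle_touch(XIDeviceEvent *ev)
{
    /* root_x/root_y (and event_x/event_y) are always in screen
     * coordinates: derived from the touch point on direct devices,
     * copied from the master pointer on dependent devices. */
    double screen_x = ev->root_x;
    double screen_y = ev->root_y;

    /* The touch valuators carry the raw device-coordinate position in
     * both modes. Values are packed, so advance only past set bits. */
    double dev_x = 0.0, dev_y = 0.0;
    double *val = ev->valuators.values;
    if (XIMaskIsSet(ev->valuators.mask, 0))
        dev_x = *val++;
    if (XIMaskIsSet(ev->valuators.mask, 1))
        dev_y = *val++;

    /* Hit-test with screen_x/screen_y; feed dev_x/dev_y to gesture
     * recognition, which cares about device resolution and limits. */
    (void)screen_x; (void)screen_y; (void)dev_x; (void)dev_y;
}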
> master devices are multiplexers, you're right here. what they also
> provide is a defined pairing of pointer and keyboard devices. so use the
> example of a desktop with a built-in touchscreen. whether I use my finger to
> click somewhere or the mouse shouldn't matter, the pointer follows both as a
> cursor and thus controls keyboard foci as well.
> of course, in a multi-user setup, the need for a defined pairing is even
> higher. so we have to attach any device to an MD pointer anyway, at which
> point maintaining the hierarchy in the events not only provides consistency
> but also an ordering of how the events occurred if multiple slave devices are
> in use at the same time.
This seems to be an argument for having touch devices participate in the
device hierarchy. I have no issues with that. It doesn't require sending
touch events through the master pointing device, though.
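
Either way, the pairing stays visible to clients through the hierarchy
itself. A sketch using the existing XI 2.0 query (nothing here depends on
the new touch events) to find the slaves attached to a master pointer:

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

static void list_attached_slaves(Display *dpy, int master_ptr)
{
    int ndevices;
    XIDeviceInfo *info = XIQueryDevice(dpy, XIAllDevices, &ndevices);

    for (int i = 0; i < ndevices; i++) {
        /* Slave pointers record their master in 'attachment',
         * independently of which device their events are routed
         * through. */
        if (info[i].use == XISlavePointer &&
            info[i].attachment == master_ptr)
            printf("slave %d (%s) -> master %d\n",
                   info[i].deviceid, info[i].name, master_ptr);
    }

    XIFreeDeviceInfo(info);
}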
> also, you say that dependent touch devices don't move the cursor by
> themselves - that is not completely true, is it? it depends on the
> implementation: a touchpad still moves the cursor even if it supports MT, and
> it could instruct the server to emulate pointer events for part of the MT
> events.
When I referred to dependent touch devices here, I meant the literal
touch class of the device. You're right that the device may also have a
general valuator axis class for pointing, and that class will provide
single pointer emulation.

My point was that by not sending touch events through the MD, we erect a
clean barrier between pointer emulation and multitouch events.
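
In code, the barrier would look something like this: pointer events
(including any touch-based emulation) are selected on the MD, touch
events on the SD. A sketch only; XI_TouchBegin/XI_TouchUpdate/XI_TouchEnd
are my placeholders for the final event names, and the all-three-at-once
rule comes from the proposal text above:

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

static void select_events(Display *dpy, Window win,
                          int master_ptr, int touch_slave)
{
    unsigned char ptr_mask[XIMaskLen(XI_LASTEVENT)] = { 0 };
    unsigned char touch_mask[XIMaskLen(XI_LASTEVENT)] = { 0 };
    XIEventMask masks[2];

    /* Pointer events, including single-pointer emulation from a touch
     * device, arrive through the master device. */
    XISetMask(ptr_mask, XI_ButtonPress);
    XISetMask(ptr_mask, XI_ButtonRelease);
    XISetMask(ptr_mask, XI_Motion);
    masks[0].deviceid = master_ptr;
    masks[0].mask_len = sizeof(ptr_mask);
    masks[0].mask = ptr_mask;

    /* Touch events arrive only through the originating slave device,
     * and the three touch events must be selected simultaneously. */
    XISetMask(touch_mask, XI_TouchBegin);
    XISetMask(touch_mask, XI_TouchUpdate);
    XISetMask(touch_mask, XI_TouchEnd);
    masks[1].deviceid = touch_slave;
    masks[1].mask_len = sizeof(touch_mask);
    masks[1].mask = touch_mask;

    XISelectEvents(dpy, win, masks, 2);
}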
Thanks,
-- Chase