[PATCH xi2.1 inputproto] Many more updates to the XI 2.1 protocol

Peter Hutterer peter.hutterer at who-t.net
Mon Apr 11 18:51:23 PDT 2011


On Mon, Apr 11, 2011 at 05:40:56PM -0400, Chase Douglas wrote:
> On 04/08/2011 12:17 AM, Peter Hutterer wrote:
> > On Tue, Mar 22, 2011 at 03:07:34PM -0400, Chase Douglas wrote:
> >> On 03/18/2011 02:23 AM, Peter Hutterer wrote:
> >>> On Thu, Mar 10, 2011 at 03:47:41PM -0500, Chase Douglas wrote:
> >>> * it is not clear whether a passive grab can be established for owner-only.
> >>>   does a grab always require the ownership semantics or can there be the
> >>>   default TouchBegin semantics too? If so, how is this decided?
> >>
> >> The implementation so far requires that a grabber use unowned event
> >> semantics. 
> > 
> > does the _protocol_ require it?
> 
> I don't believe the protocol requires it. We can change the protocol to
> allow for it. I think I hit an issue with the implementation I had, but
> I don't think it was a protocol issue.
> 
> >> A client cannot grab for owner-only events. This is partly
> >> because it's easier and partly because I think touch grabbing clients
> >> either will need the functionality or will be able to handle it easily.
> >>
> >> For example, say you have a touch-aware window manager. Generally the WM
> >> only needs to focus a window that's been interacted with or move the
> >> window if you interact with the title bar. Both scenarios are usually
> >> state driven rather than path driven. The WM can raise the window and
> >> can keep track of just the last touch location to move the window when
> >> it becomes the owner.
> > 
> > counter-example: popup window. i don't care what happens before I get an
> > event that's really mine, so why would I listen to it.
> 
> Fair enough.
> 
> >>> * it seems impossible to be an observer that gets ownership. for a situation
> >>>   where you are below the accepting client, this means you're stuck. you can
> >>>   either observe (and never get ownership) or not observe (and lose those
> >>>   touch sequences the parent accepts)
> >>
> >> Can't the client have a passive grab and then reject ownership but
> >> continue receiving events as an observer?
> > 
> > for observing, you get all events. if you're not observing, if a client
> > above you accepts the grab, you get a TouchEnd so you never get to opt for
> > RejectObserve.
> 
> What do you think of adding GrabTypeTouchBeginAndObserve? If you receive
> ownership, it acts like a GrabTypeTouchBegin, and if a client above you
> accepts, it acts like a GrabTypeObserve.
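That proposed combination can be sketched as a toy model (Python, purely illustrative — the GrabType* names follow the draft spec, everything else here is hypothetical):

```python
# Sketch of the proposed GrabTypeTouchBeginAndObserve semantics.
# The draft defines only GrabTypeTouchBegin and GrabTypeObserve;
# the combined mode and this resolution logic are hypothetical.

TOUCH_BEGIN = "TouchBegin"
OBSERVE = "Observe"
TOUCH_BEGIN_AND_OBSERVE = "TouchBeginAndObserve"

def effective_mode(grab_type, client_owns, accepted_above):
    """Return which semantics the grab behaves with once ownership is decided."""
    if grab_type != TOUCH_BEGIN_AND_OBSERVE:
        return grab_type
    if client_owns:
        # We became the owner: act like a plain TouchBegin grab.
        return TOUCH_BEGIN
    if accepted_above:
        # A client above us accepted: keep watching as an observer
        # instead of receiving a terminating TouchEnd.
        return OBSERVE
    return TOUCH_BEGIN_AND_OBSERVE  # ownership still undecided

print(effective_mode(TOUCH_BEGIN_AND_OBSERVE, client_owns=True, accepted_above=False))
print(effective_mode(TOUCH_BEGIN_AND_OBSERVE, client_owns=False, accepted_above=True))
```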

[copying comment from further down]

> As for raw touch events, I don't think they are needed at all. The
> current raw events provide two values for each valuator axis:
> transformed values, and untransformed values, both in device
> coordinates. The valuator axes for XI 2.1 touch events are already
> provided in device coordinates. There also is no clipping nor
> acceleration performed on touch events. Thus, I believe raw touch events
> are unnecessary.

let's take a step back again: the problem observe events intend to cover is
that events are _always_ delivered to the client, regardless of grabs.
the three potential interactions with grabs are:
- observing events before the client owns the grab
- observing events while the client owns the grab
- observing events after the client has rejected the grab.

And combinations thereof. Point 2 is also served by normal events, but for
completeness' sake it's listed here.

The basic point of grabs is to deliver events exclusively to one client,
regardless of the location of the input event. For observing touch events,
grabs merely serve as triggers for when to receive events, as what others do
with the grab has little effect.

RawEvents already do exactly that - they send events to the client
regardless of whether the event is delivered to anyone else. They're not
delivered during grabs but that's one of the things I want to fix anyway.

So my thought towards RawEvents is that by adding three event types, we can get
the same effect as *Observe*, without messing up the grab semantics to a
point where even we struggle to get all of them into our brains.
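The distinction can be shown with a toy delivery model (Python, illustrative only — not server code): a grab delivers exclusively, while raw-style delivery goes to every selecting client regardless of grabs.

```python
# Toy model contrasting grab-exclusive delivery with raw-event-style
# delivery, which ignores grabs entirely. Purely illustrative.

def deliver(event, grabber, selecting_clients, raw_clients):
    """Return the set of clients that see the event."""
    recipients = set(raw_clients)             # raw events ignore grabs entirely
    if grabber is not None:
        recipients.add(grabber)               # a grab delivers exclusively
    else:
        recipients.update(selecting_clients)  # normal selection-based delivery
    return recipients

# With a grab active, only the grabber and raw listeners see the event.
print(sorted(deliver("TouchBegin", grabber="A",
                     selecting_clients={"B"}, raw_clients={"C"})))  # ['A', 'C']
```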

> >>> * the TouchObserve grab concept as-is doesn't quite fit. the idea originally
> >>>   came up to solve the problem of a grabber that wants to keep getting
> >>>   events after rejecting. That is what TouchRejectObserve does. The grab
> >>>   mode TouchObserve doesn't fit. if you can't become owner anyway, this is
> >>>   hardly different to RawEvents so we could use those.
> >>
> >> I'm hopeful that my last comment would suffice here too :)
> >>
> >>>   which brings me to
> >>> * no RawEvents for touches? would be a good time to fix the broken grab
> >>>   semantics for raw events too (deliver them even if a grab is present)
> >>
> >> We could add them now, or we could push them off to XI 2.2. I'm
> >> beginning to get worried about landing XI 2.1 in xserver 1.11. I haven't
> >> really thought about raw touch events either. I'd rather let the idea
> >> percolate and/or wait for a use case for them to be more clear.
> > 
> > planning for 2.2 is hard. I remember pushing off features for 2.1 and then
> > got sidetracked with a million other things. so I'm somewhat hesitant to
> > have features pushed off for 2.2 (as opposed to "planned for 2.2". how much
> > difference one word can make...)
> > I'm all too worried that once the first version is out everyone moves on to
> > whatever the next shiny thing on the horizon is.
> 
> Isn't that how development works though? If someone needs functionality,
> they'll push for it. If no one needs it, then maybe it doesn't need to
> go in. I have no need for anything beyond what has been worked on for XI
> 2.1, so I feel I'm in a poor position to implement anything else. I may
> get it wrong :).
 
development != API design. The inertia of the X protocol is quite bad.  Some
of the things the GTK guys have to do for XI2 support aren't pretty and they
only came up 2 years after the release.

It's usually easier to add things that are missing, but harder to change
things that shouldn't be.

> As for raw touch events, I don't think they are needed at all. The
> current raw events provide two values for each valuator axis:
> transformed values, and untransformed values, both in device
> coordinates. The valuator axes for XI 2.1 touch events are already
> provided in device coordinates. There also is no clipping nor
> acceleration performed on touch events. Thus, I believe raw touch events
> are unnecessary.
> 
> >>> * If SemiMultitouch isn't good enough to give you touch points, how is the
> >>>   pointer emulation decided?
> >>
> >> The X input module still generates the pointer events. How it does this
> >> is implementation specific. However, SemiMultitouch is limited to
> >> trackpads. Trackpads are also limited in that they only generate pointer
> >> motion events when only one touch is active.
> > 
> > SemiMultitouch is _currently_ limited to trackpads. I wouldn't put it past
> > hardware vendors to ignore this fine assumption and come out with some other
> > device that doesn't do multitouch properly.
> > but anyway, something like "for devices of type SemiMultitouch, pointer
> > emulation is implementation dependent" may be useful to clear up confusion.
> 
> Fine with me.
> 
> >>> * as pointed out in the other email, I'm still confused on whether the
> >>>   master device delivers touch events. this isn't mentioned in the spec but
> >>>   the last patchset I reviewed skipped touch events from the master device
> >>
> >> I'd just forget any XI 2.1 implementation patchsets you've seen so far
> >> :). The real stuff will look very different.
> >>
> >> The master device still delivers touch events from attached slave
> >> devices. Here's an example:
> >>
> >> * Slave device S is attached to master device M
> >> * Client A grabs touch events on the root window R on slave device S
> >> * Client B selects for all touch events (including unowned) on top level
> >> window W on master device M
> >>
> >> When the user touches on window W, touch events are sent with device id
> >> of S to client A. Touch events are also sent with device id of M to
> >> client B (though the slave id is set to the id of S).
> >>
> >> This allows for one client to select for all master devices and not
> >> receive events from a floating slave device.
> > 
> > ok, so we're using normal XI2 semantics. fine with me then, though Daniel
> > expressed some issues with that.
> 
> I think our initial trepidation has been resolved. I hope Daniel feels
> the same way :).
> 
> >>> * in the touch event delivery section, the sentence "No touches from an
> >>>   indirect device may begin while the
> >>>   device is floating, as it does not have an associated pointer position to
> >>>   focus events." This is incorrect. Run XIQueryPointer on the floating
> >>>   slave, you'll see it does have a pointer position. In fact, it even has a
> >>>   sprite, it's just not rendered.
> >>>   this is in-line with XI1 where clients are expected to manually
> >>>   render the cursor.
> >>
> >> What do you suggest as a resolution? Better wording so it matches what
> >> is described, or that we actually send touch events to the pointer
> >> position of the floating device?
> > 
> > the latter, unless there's a good reason not to. we already expect clients
> > to know what floating slaves are, so if they select for events from them we
> > can assume they know what they're doing.
> 
> Ok. I'm not sure I agree, but I don't mind much either way :).
> 
> >>> * pointer event handling for indirect devices: what was the motivation for
> >>>   withholding touch events when the pointer leaves the window again? 
> >>>   It isn't clear what "withheld" means, even if you have the ownership
> >>>   selection, you still don't get it? this whole concept feels a bit tacked
> >>>   on the side and breaks ownership assumptions (that touch events are
> >>>   delivered immediately if you can cope with ownership changes).
> >>
> >> It is hacky and tacked on the side. The problem it resolves is when a
> >> client has a touch selection on a window, is receiving touch events from
> >> an indirect device, and the cursor then moves outside the window. The
> >> pointer button may then be pressed, activating an implicit grab over a
> >> different window. Now, you've got pointer events going to the new window
> >> and touch events being sent to the original window.
> >>
> >> There's a parallel issue when you have touch events from an indirect
> >> device going to the first window and then a pointer grab is activated on
> >> a different window.
> >>
> >> I think the most reasonable solution is to have the client ignore touch
> >> events when the pointer "leaves" the selection/grab window. The two
> >> approaches I've come up with are:
> >>
> >> 1. Leave it up to the client and tell them to ignore touches when they
> >> receive a leave notification event.
> >> 2. Enforce it in the server by withholding events while the pointer is away.
> >>
> >> Option 2 is easier to implement on the client side and less likely to
> >> cause buggy behavior. This is what I described in the protocol. I'd be
> >> fine with option 1 though, too.
> > 
> > this leaves me with one problem: I can't perform a gesture to virtually lift
> > an item, move the cursor elsewhere and then drop it. Because anything I do
> > once the cursor moves outside the window will be thrown away ("withheld").
> > I don't quite see where the problem is (from a UI POV) to have touch events
> > go to two different windows provided the touch grab was activated first.
> 
> I suppose this is like option 1 without explicitly saying in the
> protocol document what the client "should" do. I don't think it will
> cause buggy behavior in most cases, I just worry about the outlying cases.
> 
> This is a tricky enough area that OS X is buggy when you try to do this.
> I think we'll be even better off, so perhaps it's just not worth losing
> any sleep over.
> 
> > if you really want to avoid this situation, disallow pointer grabs from
> > other clients while touch grabs are active. that may be a sensible solution.
> 
> It's not just touch grabs. It's also touch selections. Does a touch
> selection inhibit pointer grabs from activating? That would break the
> use case of dragging the cursor over to a different window and then
> clicking without lifting the finger from the touchpad.
> 
> >>> * it is unclear if a client can grab for pointer + touch, both actively and
> >>>   passively. this is an issue for dependent device, especially because touch
> >>>   events are withheld during pointer grabs.
> >>>   I think the event mask should be honoured during grabs if valid.
> >>
> >> I hope the above explanation provides enough detail for how I think this
> >> should work. Let me know if you still think there's an issue.
> > 
> > not quite, different question. my question was - what happens if I call
> > XIGrabDevice(XITouchBegin | XIButtonPress), i.e. the grab mask selects for
> > both pointer and touch events.
> 
> If we go with what you suggested, where touch events do not affect
> pointer events from a dependent device, then I would say that concept
> holds here too. The grabbing client would receive a grab for both the
> touch sequence and the pointer motion events.
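That behaviour amounts to treating the grab mask bitwise, per event class — sketched here as a toy model (Python; the bit values are illustrative, not the real protocol constants):

```python
# Toy model of a combined grab mask, as in
# XIGrabDevice(XITouchBegin | XIButtonPress). Bit values are made up.

XI_TOUCH_BEGIN  = 1 << 0
XI_BUTTON_PRESS = 1 << 1

def grab_delivers(grab_mask, event_kind):
    """Whether the grabbing client receives an event of this kind."""
    bit = {"touch": XI_TOUCH_BEGIN, "button": XI_BUTTON_PRESS}[event_kind]
    return bool(grab_mask & bit)

mask = XI_TOUCH_BEGIN | XI_BUTTON_PRESS
print(grab_delivers(mask, "touch"), grab_delivers(mask, "button"))  # True True
```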
> 
> >>> * a device that emulates pointer events must also initialize valuators
> >>>   though they may serve no other purpose. should there be a mapping field in
> >>>   the TouchAxisClass to say which valuator this one maps to in case of
> >>>   pointer emulation?
> >>>   for a client it may be important to know that the axis a device advertises
> >>>   is just there because of pointer emulation.
> >>>   same goes for the button class, you need one to send button events.
> >>
> >> Lets say a touch device provides pressure information. I think the
> >> pointer device should have a valuator for pressure information too, and
> >> when a touch is emulated the valuator data should be valid.
> >>
> >> This is not handled by the X server in any implementation I'm aware of.
> >> However, I see this as an implementation issue, not a protocol issue. If
> >> the valuator axis is listed by the device, then the client should assume
> >> the data exists and will be set appropriately. To do this will require
> >> the proper input module interface, but shouldn't be dependent on any
> >> protocol changes.
> > 
> > for a dependent device, if I select for both touch and pointer events, how
> > do I figure out which data is duplicated then?
> > if you want to go into the fine details - if pressure isn't emulated
> > because I'm a touch client and I thus don't get emulation, the pressure
> > valuator never sends events. I don't get any indication _why_ this is the
> > case though since I cannot tell whether this axis actually exists or is just
> > there for pointer emulation.
> 
> I'm sorry, I've tried to understand this comment, but I can't seem to
> grasp the intent :). Maybe I need it spelled out in a different way?

oh, sorry. I'll try again.
Assume a device with real x, real y, touch x, touch y, touch pressure. Pointer
emulation requires this device to have 6 axes:
- valuator x, y, pressure
- touch x, y, pressure

As a client, I don't know whether the valuator pressure axis is a real axis
or a pointer emulation-only axis. I register for touch _and_ pointer events
from this device. Because I selected for touch, no pointer emulation is
performed.
Thus, the data I can get from the device are valuator x, y and touch x, y,
pressure. Valuator pressure will never send events but I have no indication
why.   
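As a toy model of the problem (Python, illustrative — the axis list is not a real protocol structure):

```python
# The device advertises six axes; a client selecting for touch events
# never sees events on the valuator pressure axis, with no indication why.

axes = [
    ("x",        "valuator"),
    ("y",        "valuator"),
    ("pressure", "valuator"),   # exists only to back pointer emulation
    ("x",        "touch"),
    ("y",        "touch"),
    ("pressure", "touch"),
]

def active_axes(axes, touch_selected):
    """Axes that will actually deliver events to this client."""
    if touch_selected:
        # Selecting for touch suppresses pointer emulation, so only
        # the touch axes ever carry data.
        return [(n, k) for n, k in axes if k == "touch"]
    return [(n, k) for n, k in axes if k == "valuator"]

silent = set(axes) - set(active_axes(axes, touch_selected=True))
print(sorted(silent))   # the three valuator axes, indistinguishable from real ones
```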

does this make sense now?

> >>> * in the pointer emulation section the protocol states that "Both the
> >>>   emulated pointer events and their associated touch events will have the
> >>>   PointerEmulated flag set." That is after a paragraph that states that
> >>>   pointer and touch events are mutually exclusive. so you get either pointer
> >>>   or a touch event with that flag but you cannot get the other.
> >>>   that seems rather pointless. if you select for touch events, you won't get
> >>>   the emulated events anyway and if you select for pointer events you never
> >>>   see the touch events. so I'm wondering what that flag conveys other than
> >>>   the server saying "hey, I know more than you about this event" :)
> >>
> >> For indirect devices you may receive both pointer and motion events. It
> >> is only for direct devices where you have emulated pointer events that
> >> you will receive events for one type but not both.
> > 
> > so the gist of this seems to leave us with two outcomes. On direct devices
> > we cannot get both anyway, so let's ignore them. on dependent devices a
> > client may either
> > - register for touch and receive touch events with the flag set. that client
> >   will never see the pointer events with the flag set but may see pointer
> >   events without the flag.
> 
> I think you're asking if the client registers for both pointer and touch
> events. In this case, for the touch that generates pointer events the
> client will receive both a touch sequence and a pointer motion sequence.
> The PointerEmulated flag will be set for the events from both of these
> sequences.

this seems to be a misunderstanding then. So we do pointer emulation for
touch clients as well? My last memory of this was that we don't emulate for
touch clients.

Cheers,
  Peter

> > - register for pointer events only and see pointer events with _and_ without
> >   the flag, depending on whether emulation is active. 
> 
> If there's no emulation, there would be no pointer events at all. Thus,
> all events would have the PointerEmulated flag set.
> 
> (In the above, I'm assuming we're leaving aside any input from other
> devices attached to the server.)

