Multi{pointer,touch} & Userspace
Jim Gettys
jg at laptop.org
Wed Oct 29 08:20:53 PDT 2008
I worry about the use of UDP, at least for networked input devices.
Losing some of the event stream (e.g. up events) can be very confusing
to applications (not to mention device drivers). Wireless is unreliable;
UDP does not guarantee delivery. With TCP_NODELAY set on the socket,
TCP latency is just as low (and bandwidth comparable) when there is
no packet loss.
TCP's resumption of transmission after (occasional) packet loss is
tied to the observed RTT, IIRC; in a local environment, those
latencies are very low.
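For reference, disabling Nagle's algorithm is a one-line socket option;
a minimal Python sketch (just an illustration, not from any of the code
under discussion):

```python
import socket

# Sketch: with TCP_NODELAY set, small writes (e.g. individual input
# events) are sent immediately rather than coalesced by Nagle's
# algorithm, keeping per-event latency low on a loss-free local link.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# Verify the option took effect (nonzero means enabled).
print(sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) != 0)
sock.close()
```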
I've gotten quite paranoid on this topic, living as I do in an RF
"harsh" environment.
The OctoPocus work presented at UIST last week was very interesting,
where they presented a discoverable gesture recognizer. It prompts the
user with possible gestures and shows you what gestures are possible as
you follow a path. Such recognizers don't belong in the window system
IMHO, but the window system sure needs to support such recognizers.
I've also been wondering about the relative merits of an ad-hoc wire
format (such as you've defined) for events versus something like XML,
where parsers and extensibility are known. But it's time for me to
perform some experiments to understand the performance trade-offs,
rather than hand-waving; your existing code trumps my unproven
hypothesis, so I better write some code.
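To make the trade-off concrete, here is a toy Python sketch of the same
(hypothetical, made-up) touch event encoded both ways; the field names
and layout are purely illustrative, not the protocol from the TISCH
framework:

```python
import struct
import xml.etree.ElementTree as ET

# Hypothetical touch event: (contact id, x, y). Field names are
# invented for illustration only.
event = (7, 0.25, 0.75)

# Ad-hoc binary wire format: fixed layout, very compact (12 bytes
# here), but the parser and any versioning are hand-written.
binary = struct.pack("!Iff", *event)

# XML wire format: self-describing and easy to extend with new
# attributes, but several times larger and slower to parse.
xml_msg = ET.tostring(ET.Element("event", id="7", x="0.25", y="0.75"))

print(len(binary), len(xml_msg))
```

The size gap alone doesn't settle it, of course; the experiment worth
running is parse cost per event at realistic event rates.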
Say hi to Gudrun for me...
- Jim
On Wed, 2008-10-29 at 08:51 +0100, Florian Echtler wrote:
> Hello everybody,
>
> a while ago, after the MPX merge, there was a brief discussion about
> multipointer support for userspace. I'd like to revive this discussion,
> and I have some bits to restart it:
>
> - At http://tisch.sf.net/ , you can find the first beta release of our
> multitouch development framework. While this may not be of immediate
> interest to you, there's two important points which I'd like to
> mention:
>
> - There are two possible ways to connect MPX to this framework:
> a) as a backend, which delivers user input by way of pointer events
> (or maybe later blob events)
> b) as a frontend, which receives input data from the framework and
> uses that to control pointers
> Both ways are already implemented and will shortly be merged into
> our framework.
>
> - If I remember correctly, there was a short discussion about a
> generic gesture recognition engine. I had plans to build such a
> thing for over a year now, and what is called the "interpretation
> layer" in the paper & diagram is exactly that. I'd kindly ask you to
> a look at http://tisch.wiki.sourceforge.net/EventProtocol , where
> the inner workings of the recognizer are described briefly, and tell
> me what you think.
>
> - During his diploma thesis, a student of mine wrote an MPX patch for
> FreeGLUT. I just noticed that the Xorg git tree also contains a GLUT
> library... so should I try to port our patch to this GLUT version, or
> rather submit it to FreeGLUT? I believe that FreeGLUT is more widely
> used; so maybe both?
>
> Well, that's all from me for now; I'm looking forward to hearing
> your opinions.
>
> Thanks, Yours, Florian
>
> P.S. I know that the Xserver is technically also a part of userspace;
> the subject was rather meant in the abstract sense.
> _______________________________________________
> xorg mailing list
> xorg at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/xorg
--
Jim Gettys <jg at laptop.org>
One Laptop Per Child