On Sun, May 11, 2008 at 11:17 PM, Peter Hutterer <peter@cs.unisa.edu.au> wrote:
> On Sat, May 10, 2008 at 09:20:11AM +0900, Jordi Polo wrote:
> > I am interested in bringing multitouch to normal applications; obvious
> > examples are multitouch gestures for copy, paste, the famous zoom, etc.
> > As I am mostly a KDE guy, I planned to create new Qt events from
> > multitouch input, to be added alongside the current QKeyEvent,
> > QMouseEvent, etc.
> > AFAIK, MPX multi-mouse and multi-keyboard support cannot be used for
> > multitouch gestures (a gesture is what you do on an iPhone or a MacBook
> > Air for zooming, etc.). At least the YouTube demos I have seen so far do
> > not use it like that. Gestures basically link the cursors together, so
> > the user does not have separate cursors but one cursor doing multiple
> > things at the same time. I wonder whether gestures could be created on
> > top of MPX mouse events and forwarded to apps (X should also be the best
> > place for this).
>
> I thought about how to get multitouch support into the X server last
> year when I was experimenting with it.
>
> Basically, gestures do not belong in the X server. They are a
> contextually dependent interpretation of hardware events, i.e. the
> server doesn't know whether two cursors are to be interpreted as a
> two-finger scrolling gesture or whether it is just two people doing
> something in close vicinity.
>
> IMHO, the only way to get the gestures you want is to put them into a
> client-side library. Only there can you get the contextual information
> to interpret the gestures correctly.

I agree that gestures belong in a client-side library.
Someone on the kde-core list pointed me to your presentation at Linux-conf
last year. I wrote the first email before knowing that you had already done
some development with blob events; at that time I thought a multitouch
panel would appear as several cursors in MPX.
Having multitouch-specific support simplifies things, and that is what I
need. To clarify further: I will not create gestures from multiple-cursor
devices but from single multitouch devices.
The basic idea is to take the events from X, convert them to QEvents, hand
them to a library that recognizes gestures, and send the gestures, plus any
events that were not recognized as part of a gesture, on to KDE apps.
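
To make this concrete, here is a minimal sketch of what I have in mind on
the Qt side. The GestureEvent class, its Kind enum and deliverZoom() are
made up for illustration; QEvent::registerEventType() and
QCoreApplication::postEvent() are the real Qt pieces it relies on:

// Hypothetical custom event carrying a recognized gesture.
#include <QEvent>
#include <QPointF>
#include <QCoreApplication>

class GestureEvent : public QEvent
{
public:
    enum Kind { Zoom, Rotate, Swipe };

    // Event type registered once, in the range above QEvent::User.
    static const QEvent::Type EventType;

    GestureEvent(Kind kind, const QPointF &center, qreal value)
        : QEvent(EventType), m_kind(kind), m_center(center), m_value(value) {}

    Kind kind() const { return m_kind; }
    QPointF center() const { return m_center; }
    qreal value() const { return m_value; }   // e.g. zoom factor or angle

private:
    Kind m_kind;
    QPointF m_center;
    qreal m_value;
};

const QEvent::Type GestureEvent::EventType =
    static_cast<QEvent::Type>(QEvent::registerEventType());

// The gesture library would post events like this to the target widget;
// raw touch events not consumed by a gesture would be forwarded unchanged.
void deliverZoom(QObject *target, const QPointF &center, qreal factor)
{
    QCoreApplication::postEvent(target,
        new GestureEvent(GestureEvent::Zoom, center, factor));
}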

> > Basically I was planning to use a multitouch library (touchlib) that
> > already delivers events using the TUIO protocol, and to create QEvents
> > based on those.
> > If it can be done with MPX in a more standard way, we could try that
> > instead.
>
> MPX will get multitouch-screen support, but not in this upcoming
> version. Mind you, this is not the same as multi-touch gesture support.

Then I need the version after next :D
The current plan is, then, to merge multi-device support in 1.5 and
multitouch in 1.6? What devices, if any, are already working with MPX
multitouch?

If so, I may create a TUIO-to-MPX blob event bridge library and build
everything on MPX blob events, on the assumption that the future 1.6
multitouch support will look similar to that.
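
Roughly, the receiving side of such a bridge could look like the sketch
below. It just listens for TUIO/OSC packets on the default port 3333 with a
QUdpSocket; decodeTuioCursors() and forwardAsBlob() are empty placeholders,
since I have not written the OSC parsing yet and the blob event injection
side depends on the not-yet-public MPX API:

// Rough sketch of the TUIO receiving side of the bridge.
// TUIO is delivered as OSC bundles over UDP, by default on port 3333.
#include <QUdpSocket>
#include <QByteArray>
#include <QList>

struct TuioCursor            // one touch point reported by touchlib
{
    int sessionId;
    float x, y;              // normalized coordinates in [0, 1]
};

// Placeholder: parse the OSC bundle and extract /tuio/2Dcur "set" messages.
QList<TuioCursor> decodeTuioCursors(const QByteArray &datagram)
{
    Q_UNUSED(datagram);
    return QList<TuioCursor>();   // TODO: real OSC parsing goes here
}

// Placeholder: this is where a cursor would be turned into an MPX blob
// event (or, for now, into a custom QEvent as in the earlier sketch).
void forwardAsBlob(const TuioCursor &cursor)
{
    Q_UNUSED(cursor);
}

void runTuioBridge()
{
    QUdpSocket socket;
    socket.bind(3333);                    // default TUIO port
    for (;;) {
        socket.waitForReadyRead(-1);      // block until a packet arrives
        while (socket.hasPendingDatagrams()) {
            QByteArray datagram(int(socket.pendingDatagramSize()), 0);
            socket.readDatagram(datagram.data(), datagram.size());
            foreach (const TuioCursor &c, decodeTuioCursors(datagram))
                forwardAsBlob(c);
        }
    }
}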

--
Jordi Polo Carres
NLP laboratory - NAIST
http://www.bahasara.org