GSoC Proposal the second

janikjaskolski at aol.com janikjaskolski at aol.com
Fri Mar 25 15:32:16 PDT 2011


Hello Michal, 


first of all, thanks a lot for your feedback. 

I must admit, though, that I get the feeling you either did not fully understand my intention
or have not read some of the latest research results on multi-sensory input evaluation?!
Maybe I just did not express myself well enough.

> For multitouch tablets this is already well covered by multitouch gestures.

I should have made it clear that I am not working on multi-touch approaches; I apologize for that.
(Even though multi-touch would benefit as well ^^)

> For single-touch devices some mechanism could be handy. Multiple clicks/knocks aren't very useful. 
> The distinction between click and doubleclick (multiclick) is usually reasonably clear but it is easy to 
> confuse different multiclicks (2, 3, more) due to user error or input analysis error.

Actually, by correlating the touch events with sound recognition, confusion is reduced to almost
nothing, since touching the screen on its own does not trigger anything but normal clicks. Only
if further information is at hand (an analyzed audio feed) would different events be triggered.
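
To make the correlation idea concrete, here is a minimal sketch in C (not code from my prototype;
the 30 ms window is just an assumed placeholder): a touch event is only promoted to a "special"
event if the audio analysis reported a spike close enough in time to the touch.

    #include <stdbool.h>
    #include <stdint.h>

    #define CORRELATION_WINDOW_MS 30   /* assumed placeholder, not a tuned value */

    /* Timestamp (ms) of the most recent spike reported by the audio
     * analysis; 0 means no spike is pending. */
    static uint64_t last_spike_ms;

    /* Decide whether a touch at touch_ms coincides with an audio spike. */
    static bool touch_is_special(uint64_t touch_ms)
    {
        uint64_t diff;

        if (last_spike_ms == 0)
            return false;                     /* no audio info: plain click */

        diff = touch_ms > last_spike_ms ? touch_ms - last_spike_ms
                                        : last_spike_ms - touch_ms;
        return diff <= CORRELATION_WINDOW_MS; /* correlated: special event */
    }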

> However, if you can detect the user moving their finger towards or away from the microphone over a 
> surface already present on the device or added for the purpose you could get additional input that 
> could be used for scrolling or zooming.

This would actually be very challenging. An audio analysis detailed enough to do that would
most likely introduce delays that are not acceptable for input events. If it is possible, though,
it would present some interesting possibilities; you are right about that.

> The problem with adding other clicks is that touchdown is the only way to move the pointer but 
> also a click already. In absence of multitouch adding a right-click is challenging. 
> Moving the pointer already causes left-click. Additional button or other input can technically 
> produce a right-click but you would not get any coordinates associated with it so it's not very useful.

I'm not sure I understand the problem you see here. evdev has a function,
EvdevProcessButtonEvent, which queries a number of filters to decide whether some special event should be
triggered. Only if those filters return false is a normal button event queued. Thus I have all the information
about coordinates / click position that I would need (the MB3 emulation works exactly the way I intend to
attack the problem).
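
For illustration, a rough sketch of how such a filter could slot into the button processing path,
in the same style as the existing emulation filters. EvdevSoundFilterEvent() and
sound_spike_pending() are names I made up for this sketch, not existing evdev functions, and the
queueing helper is only my approximation of the driver's internals; the fragment is meant to live
inside the evdev driver itself, so it relies on the driver's own headers.

    /* Hypothetical sound-correlated filter, modeled on the way the MB3
     * emulation hooks into EvdevProcessButtonEvent: if the filter
     * consumes the event, no normal click is posted. */
    static int
    EvdevSoundFilterEvent(InputInfoPtr pInfo, int button, int value)
    {
        /* For brevity this sketch only reacts to the button press and
         * posts a complete press/release pair; a real filter would also
         * have to swallow the matching release event. */
        if (value != 1)
            return 0;

        /* No correlated audio spike: let the normal click through. */
        if (!sound_spike_pending())
            return 0;

        /* Pointer coordinates are already known from the touch itself,
         * so a different button (e.g. button 3) can be posted at the
         * current position instead of the plain click. */
        EvdevQueueButtonClicks(pInfo, 3, 1);
        return 1;                              /* event consumed */
    }

    /* ...and in EvdevProcessButtonEvent(), before the normal event is
     * queued:
     *
     *     if (EvdevSoundFilterEvent(pInfo, button, value))
     *         return;
     */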

> Apple chose to emulate right click with long click (button down and hold without moving the pointer) 
> or using a modifier (eg. to generate a right click you hold down another button and then left click)
> which are probably the only reasonable options.
> The other option is to redesign the interface to not use a right-click (and middle click) at all.
> This is not so difficult with touch interfaces as using controls spread across the screen on various toolbars is cheap. 

I own an iPad, and the integrated controls feel (to me) crippling at best. If the alternative to further control
elements is cluttering the interface with buttons and/or hiding functionality behind arbitrary surfaces
one must either read up on or touch by accident to find out what they do, I would rather do research on
different methods of input. The new iPad comes with an integrated microphone :D ....

> I don't think scratching the screen is distinguishable from just moving the pointer. Under less than 
> ideal conditions using the stylus results in various odd sounds without any intent, and background 
> noise would likely interfere with distinguishing less prominent sound variations.

I see why this may appear to be true, but if you set the properties of your microphone properly, you
could actually scream your lungs out and the spike detector wouldn't even hiccup,
while scratching the surface of the laptop with the same settings generates perfect spikes.
(I tested that myself a while ago.)

I even did a stress test, tapping my screen while driving past a construction site where jackhammers
were in use (with an open window). My algorithm clearly detected every tap I made, while the noise
hardly registered at all. It all comes down to finding the right setting for the microphone gain.

Again, what distinguishes moving the pointer from scratching is the additional information from the sound analysis.
Otherwise you would be right; there would be no reliable way to detect a difference.
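
To make the "spike detector" part concrete, here is a minimal sketch (not my actual implementation;
window size and threshold are placeholder values): with the gain turned down, only the sharp,
high-amplitude transient of a tap or scratch on the chassis crosses the threshold, while airborne
noise stays below it.

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdlib.h>

    #define WINDOW_SAMPLES  256     /* roughly 5.8 ms at 44.1 kHz */
    #define SPIKE_THRESHOLD 28000   /* near full scale for 16-bit samples */

    /* Return true if a window of 16-bit PCM samples contains a spike,
     * i.e. its peak amplitude exceeds the threshold.  With a low enough
     * microphone gain, background noise stays below the threshold while
     * the mechanical transient of a tap/scratch on the case exceeds it. */
    static bool window_has_spike(const int16_t *samples, size_t n)
    {
        int peak = 0;

        for (size_t i = 0; i < n; i++) {
            int a = abs(samples[i]);
            if (a > peak)
                peak = a;
        }
        return peak >= SPIKE_THRESHOLD;
    }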

If you are very interested in this subject, the Natural User Interface Group has some really awesome
approaches they are currently working on.


I hope I did a better job expressing myself this time. 

Best regards, 

Janik

P.S. I hope the HTML errors were due to the fact that I copied parts of my first email from an
.odt file. I hope no errors occur this time.


