Sciweavers

Search: Interacting with the Computer Using Gaze Gestures — 435 results, page 16 of 87
CHI 2007 (ACM)
BodySpace: inferring body pose for natural control of a music player
We describe the BodySpace system, which uses inertial sensing and pattern recognition to allow the gestural control of a music player by placing the device at different parts of t...
Steven Strachan, Roderick Murray-Smith, M. Sile O'...
HCI 2009
An Open Source Framework for Real-Time, Incremental, Static and Dynamic Hand Gesture Learning and Recognition
Real-time, static and dynamic hand gesture learning and recognition make it possible to have computers recognize hand gestures naturally. This creates endless possibilities in the...
Todd C. Alexander, Hassan S. Ahmed, Georgios C. An...
CHI 2011 (ACM)
PaperPhone: understanding the use of bend gestures in mobile devices with flexible electronic paper displays
Flexible displays potentially allow for interaction styles that resemble those used with paper documents. Bending the display, e.g., to page forward, shows particular promise as an ...
Byron Lahey, Audrey Girouard, Winslow Burleson, Ro...
CHI 2005 (ACM)
A study on the use of semaphoric gestures to support secondary task interactions
We present results of a study that considers (a) gestures outside the context of a specific implementation and (b) their use in supporting secondary, rather than primary tasks in ...
Maria Karam, Monica M. C. Schraefel
CHI 2007 (ACM)
GUIDe: gaze-enhanced UI design
The GUIDe (Gaze-enhanced User Interface Design) project in the HCI Group at Stanford University explores how gaze information can be effectively used as an augmented input in addi...
Manu Kumar, Terry Winograd