We describe the BodySpace system, which uses inertial sensing and pattern recognition to allow the gestural control of a music player by placing the device at different parts of t...
Steven Strachan, Roderick Murray-Smith, M. Sile O'...
Real-time, static and dynamic hand gesture learning and recognition make it possible to have computers recognize hand gestures naturally. This creates endless possibilities in the...
Todd C. Alexander, Hassan S. Ahmed, Georgios C. An...
Flexible displays potentially allow for interaction styles that resemble those used in paper documents. Bending the display, e.g., to page forward, shows particular promise as an ...
We present results of a study that considers (a) gestures outside the context of a specific implementation and (b) their use in supporting secondary, rather than primary tasks in ...
The GUIDe (Gaze-enhanced User Interface Design) project in the HCI Group at Stanford University explores how gaze information can be effectively used as an augmented input in addi...
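As a rough illustration of the inertial-sensing approach mentioned in the BodySpace entry above, the following is a minimal, hypothetical sketch (not the paper's implementation): it assumes per-location accelerometer centroids gathered in a calibration phase and uses a nearest-centroid match to map device placement on the body to a music-player command. All location names, centroid values, and commands are illustrative assumptions.

```python
# Hypothetical sketch of placement-based gesture control from a 3-axis
# accelerometer; not the BodySpace system itself.
import numpy as np

# Assumed mean gravity vectors (in g) for each body location, e.g. recorded
# during a short calibration phase while the device is held at that location.
CENTROIDS = {
    "ear":   np.array([0.1, 0.9, 0.3]),
    "hip":   np.array([0.0, 0.0, 1.0]),
    "wrist": np.array([0.8, 0.2, 0.5]),
}

# Illustrative mapping from inferred placement to a player command.
COMMANDS = {"ear": "volume", "hip": "play/pause", "wrist": "next track"}

def classify_placement(sample: np.ndarray) -> str:
    """Return the body location whose centroid is nearest to the reading."""
    distances = {loc: np.linalg.norm(sample - c) for loc, c in CENTROIDS.items()}
    return min(distances, key=distances.get)

# Example: a reading dominated by the z axis falls closest to the "hip" centroid.
reading = np.array([0.05, 0.02, 0.97])
location = classify_placement(reading)
print(location, "->", COMMANDS[location])  # hip -> play/pause
```

A real system would smooth the sensor stream and combine orientation with dynamic features before classifying, but the sketch conveys the basic pattern-recognition step the abstract refers to.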