Sciweavers

587 search results
» A Gesture Interface for Human-Robot-Interaction

CHI 2005 (ACM)
Experimental analysis of mode switching techniques in pen-based user interfaces
Inking and gesturing are two central tasks in pen-based user interfaces. Switching between modes for entry of uninterpreted ink and entry of gestures is required by many pen-based...
Yang Li, Ken Hinckley, Zhiwei Guan, James A. Landa...

ICCV 2005 (IEEE)
Tracking Body Parts of Multiple People for Multi-person Multimodal Interface
Although large displays could allow several users to work together and to move freely in a room, their associated interfaces are limited to contact devices that must generally be s...
Sébastien Carbini, Jean-Emmanuel Viallet, O...

CHI 2008 (ACM)
Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces
We explore the feasibility of muscle-computer interfaces (muCIs): an interaction methodology that directly senses and decodes human muscular activity rather than relying on physic...
T. Scott Saponas, Desney S. Tan, Dan Morris, Ravin...

VCIP 2000
MIME: a gesture-driven computer interface
MIME (Mime Is Manual Expression) is a computationally efficient computer vision system for recognizing hand gestures. The system is intended to replace the mouse interface on a st...
Daniel Heckenberg, Brian Lovell

GW 2005 (Springer)
Deixis: How to Determine Demonstrated Objects Using a Pointing Cone
We present a collaborative approach towards a detailed understanding of the usage of pointing gestures accompanying referring expressions. This effort is undertaken in t...
Alfred Kranstedt, Andy Lücking, Thies Pfeiffe...