Sciweavers

587 search results - page 69 / 118
» A Gesture Interface for Human-Robot-Interaction

CHI 2004, ACM, 16 years 6 days ago
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay

ICMI 2004, Springer, 15 years 5 months ago
Multimodal interface platform for geographical information systems (GeoMIP) in crisis management
A novel interface system for accessing geospatial data (GeoMIP) has been developed that realizes a user-centered multimodal speech/gesture interface for addressing some of the cri...
Pyush Agrawal, Ingmar Rauschert, Keerati Inochanon...

CAINE 2010, 14 years 10 months ago
Scrybe: A Tablet Interface for Virtual Environments
Virtual reality (VR) technology has the potential to provide unique perspectives of data that are not possible with standard desktop hardware. The tracking devices often found wit...
Roger V. Hoang, Joshua Hegie, Frederick C. Harris ...

IWC 2007, 14 years 11 months ago
Eye movements as indices for the utility of life-like interface agents: A pilot study
We motivate an approach to evaluating the utility of life-like interface agents that is based on human eye movements rather than questionnaires. An eye tracker is employed to obta...
Helmut Prendinger, Chunling Ma, Mitsuru Ishizuka

ASSETS 2010, ACM, 14 years 10 months ago
Towards accessible touch interfaces
Touch screen mobile devices hold the promise of endless leisure, communication, and productivity opportunities for motor-impaired people. Indeed, users with residual capacities in ...
Tiago João Guerreiro, Hugo Nicolau, Joaquim...