Sciweavers

971 search results for "Observing users in multimodal interaction" - page 65 of 195
CHI 1999, ACM
Inferring Intent in Eye-Based Interfaces: Tracing Eye Movements with Process Models
While current eye-based interfaces offer enormous potential for efficient human-computer interaction, they also manifest the difficulty of inferring intent from user eye movements...
Dario D. Salvucci
ISWC 2002, IEEE
Personalized Augmented Reality Touring of Archaeological Sites with Wearable and Mobile Computers
This paper presents ARCHEOGUIDE, a novel system offering augmented reality tours in archaeological sites. The system is based on wearable and mobile computers, networking technol...
Vassilios Vlahakis, John Karigiannis, Manolis Tsot...
ICMI 2010, Springer
Toward natural interaction in the real world: real-time gesture recognition
Using a new hand tracking technology capable of tracking 3D hand postures in real-time, we developed a recognition system for continuous natural gestures. By natural gestures, we ...
Ying Yin, Randall Davis
CHI 2008, ACM
Reality-based interaction: a framework for post-WIMP interfaces
We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Re...
Robert J. K. Jacob, Audrey Girouard, Leanne M. Hir...
ICMI 2004, Springer
A framework for evaluating multimodal integration by humans and a role for embodied conversational agents
One of the implicit assumptions of multi-modal interfaces is that human-computer interaction is significantly facilitated by providing multiple input and output modalities. Surpri...
Dominic W. Massaro