Sciweavers

521 search results - page 44 / 105
» Affective multimodal human-computer interaction
CHI
2008
ACM
Implementing eye-based user-aware e-learning
We propose an e-learning scenario where eye tracking is exploited to get valuable data about user behavior. What we look at -- as well as how we do that -- can in fact be used to ...
Marco Porta
CHI
2004
ACM
Connecting bridges across the digital divide
Connecting people across the Digital Divide is as much a social effort as a technological one. We are developing a community-centered approach to learn how interaction techniques ...
William D. Tucker
CHI
1999
ACM
Inferring Intent in Eye-Based Interfaces: Tracing Eye Movements with Process Models
While current eye-based interfaces offer enormous potential for efficient human-computer interaction, they also manifest the difficulty of inferring intent from user eye movements...
Dario D. Salvucci
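A minimal illustrative sketch of the general idea named in this entry, tracing observed eye movements against candidate process models to infer intent. The intent labels, fixation regions, and similarity scoring below are invented for illustration and are not Salvucci's actual tracing algorithm.

```python
# Sketch: score a fixation trace against simple "process models"
# (expected fixation orders per intent) and pick the best-matching intent.
from difflib import SequenceMatcher

# Hypothetical process models: intent -> expected sequence of fixated regions.
PROCESS_MODELS = {
    "read_menu":   ["menu", "menu", "item", "item"],
    "check_total": ["cart", "total", "total"],
    "search":      ["searchbox", "results", "results", "item"],
}

def infer_intent(fixations):
    """Return the intent whose expected fixation order best matches the trace."""
    def score(expected):
        return SequenceMatcher(None, expected, fixations).ratio()
    return max(PROCESS_MODELS, key=lambda intent: score(PROCESS_MODELS[intent]))

if __name__ == "__main__":
    observed = ["menu", "item", "item", "total"]
    print(infer_intent(observed))  # best-matching intent for the observed trace
```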
CHI
1994
ACM
Speech dialogue with facial displays
Human face-to-face conversation is an ideal model for human-computer dialogue. One of the major features of face-to-face communication is its multiplicity of communication channel...
Akikazu Takeuchi, Katashi Nagao
CHI
2004
ACM
Sensing GamePad: electrostatic potential sensing for enhancing entertainment oriented interactions
This paper introduces a novel way to enhance input devices to sense a user's foot motion. By measuring the electrostatic potential of a user, this device can sense the user's ...
Jun Rekimoto, Hua Wang
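A rough sketch of the sensing idea this entry describes, detecting foot movement from changes in a sampled electrostatic potential signal. The sampling interface, threshold, and signal values here are assumptions for illustration, not the hardware or method from the paper.

```python
# Sketch: flag candidate foot movements where the sampled electrostatic
# potential changes sharply between consecutive readings.
def detect_steps(samples, threshold=0.5):
    """Return indices where the potential jumps by more than `threshold`,
    treated here as candidate foot movements."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > threshold]

if __name__ == "__main__":
    # Fake signal: flat, a spike when a foot lifts, then flat again.
    signal = [0.1, 0.1, 0.12, 0.9, 0.88, 0.15, 0.1]
    print(detect_steps(signal))  # -> [3, 5]
```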