Sciweavers

Search results for "Observing users in multimodal interaction"
ICCS 2004, Springer
Collaborative Integration of Speech and 3D Gesture for Map-Based Applications
QuickSet [6] is a multimodal system that gives users the capability to create and control map-based collaborative interactive simulations by supporting the simultaneous input from ...
Andrea Corradini
OZCHI 2006, ACM
Learning from interactive museum installations about interaction design for public settings
This paper reports on the evaluation of a digitally augmented exhibition on the history of modern media. We discuss visitors' interaction with installations and corresponding int...
Eva Hornecker, Matthias Stifter
CHI 2008, ACM
Implementing eye-based user-aware e-learning
We propose an e-learning scenario where eye tracking is exploited to get valuable data about user behavior. What we look at -- as well as how we do that -- can in fact be used to ...
Marco Porta
CHI 2004, ACM
Connecting bridges across the digital divide
Connecting people across the Digital Divide is as much a social effort as a technological one. We are developing a community-centered approach to learn how interaction techniques ...
William D. Tucker
ETRA 2008, ACM
Integrated speech and gaze control for realistic desktop environments
There are various situations nowadays in which people need to interact with a personal computer without the possibility of using traditional pointing devices, such as a keyboar...
Emiliano Castellina, Fulvio Corno, Paolo Pellegrin...