Sciweavers

971 search results for "Observing users in multimodal interaction" (page 104 of 195)
ICMI 2007 (Springer)
Evaluation of haptically augmented touchscreen GUI elements under cognitive load
Adding expressive haptic feedback to mobile devices has great potential to improve their usability, particularly in multitasking situations where one’s visual attention is requi...
Rock Leung, Karon E. MacLean, Martin Bue Bertelsen...
OZCHI 2006 (ACM)
LookPoint: an evaluation of eye input for hands-free switching of input devices between multiple computers
We present LookPoint, a system that uses eye input for switching input between multiple computing devices. LookPoint uses an eye tracker to detect which screen the user is looking...
Connor Dickie, Jamie Hart, Roel Vertegaal, Alex Ei...
HT 2005 (ACM)
A tactile web browser for the visually disabled
The growing amount of information available through the World Wide Web makes universal access increasingly important and supports visually disabled people in their everyday lives. ...
Martin Rotard, Sven Knödler, Thomas Ertl
CW 2003 (IEEE)
Disappearing Computers, Social Actors and Embodied Agents
User interfaces that allow multimodal interaction already exist. Many research and prototype systems have introduced embodied agents, assuming that they allow a more n...
Anton Nijholt
HAID 2008 (Springer)
Evaluation of Continuous Direction Encoding with Tactile Belts
Tactile displays consisting of tactors located around the user’s waist are a proven means for displaying directions in the horizontal plane. These displays use the body location ...
Martin Pielot, Niels Henze, Wilko Heuten, Susanne ...