Sciweavers

971 search results - page 52 / 195
» Observing users in multimodal interaction
CHI
2010
ACM
Knowing where and when to look in a time-critical multimodal dual task
Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor ...
Anthony J. Hornof, Yunfeng Zhang, Tim Halverson
HCI
2009
Towards Cognitive-Aware Multimodal Presentation: The Modality Effects in High-Load HCI
In this study, we argue that multimodal presentations should be created in a cognitive-aware manner, especially in a high-load HCI situation where the user task challenges the full...
Yujia Cao, Mariët Theune, Anton Nijholt
HAPTICS
2005
IEEE
Multi-Modal Perceptualization of Volumetric Data and Its Application to Molecular Docking
In this paper, we present a multi-modal data perceptualization system used to analyze the benefits of augmenting a volume docking problem with other perceptual cues, particularly...
Ross Maciejewski, Seungmoon Choi, David S. Ebert, ...
VIP
2003
Face and Body Gesture Recognition for a Vision-Based Multimodal Analyzer
For computers to interact intelligently with human users, they should be able to recognize emotions by analyzing the human’s affective state, physiology and behavior. I...
Hatice Gunes, Massimo Piccardi, Tony Jan
IUI
2003
ACM
Affective multi-modal interfaces: the case of McGurk effect
This study is motivated by the increased need to understand human response to video links, 3G telephony and avatars. We focus on the response of participants to audiovisual presentati...
Azra N. Ali, Philip H. Marsden