Sciweavers

A multimodal learning interface for sketch, speak and point creation of a schedule chart
ICMI 2004, Springer
We present a video demonstration of an agent-based test bed application for ongoing research into multi-user, multimodal, computer-assisted meetings. The system tracks a two perso...
Edward C. Kaiser, David Demirdjian, Alexander Grue...
Articulatory features for robust visual speech recognition
ICMI 2004, Springer
Visual information has been shown to improve the performance of speech recognition systems in noisy acoustic environments. However, most audio-visual speech recognizers rely on a ...
Kate Saenko, Trevor Darrell, James R. Glass
Visual touchpad: a two-handed gestural input device
ICMI 2004, Springer
This paper presents the Visual Touchpad, a low-cost vision-based input device that allows for fluid two-handed interactions with desktop PCs, laptops, public kiosks, or large wall...
Shahzad Malik, Joseph Laszlo
Analysis of emotion recognition using facial expressions, speech and multimodal information
ICMI 2004, Springer
The interaction between human beings and computers will be more natural if computers are able to perceive and respond to human non-verbal communication such as emotions. Although ...
Carlos Busso, Zhigang Deng, Serdar Yildirim, Murta...
Bimodal HCI-related affect recognition
ICMI 2004, Springer
Perhaps the most fundamental application of affective computing would be Human-Computer Interaction (HCI) in which the computer is able to detect and track the user’s affective ...
Zhihong Zeng, Jilin Tu, Ming Liu, Tong Zhang, Nich...