Multimodal affect recognition in learning environments

We propose a multi-sensor affect recognition system and evaluate it on the challenging task of classifying interest (or disinterest) in children trying to solve an educational puzzle on the computer. The multimodal sensory information from the learner's facial expressions and postural shifts is combined with information about the learner's activity on the computer. We propose a unified approach, based on a mixture of Gaussian Processes, for achieving sensor fusion under the problematic conditions of missing channels and noisy labels. This approach generates separate class labels corresponding to each individual modality. The final classification is based upon a hidden random variable, which probabilistically combines the sensors. The multimodal Gaussian Process approach achieves an accuracy of over 86%, significantly outperforming classification using the individual modalities as well as several other combination schemes.
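The fusion scheme described in the abstract can be sketched as follows. This is an illustrative approximation, not the authors' exact model: it trains one Gaussian Process classifier per modality on synthetic stand-in features, then mixes the per-modality posteriors through fixed uniform weights, where the paper instead infers the combination through a hidden random variable. All feature and variable names here are hypothetical.

```python
# Hedged sketch of mixture-of-GP sensor fusion (not the paper's exact model).
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for the three channels named in the abstract:
# facial features, posture features, and on-screen activity features.
n = 80
y = rng.integers(0, 2, size=n)  # 1 = interested, 0 = uninterested
face = y[:, None] + 0.6 * rng.standard_normal((n, 2))
posture = y[:, None] + 0.8 * rng.standard_normal((n, 2))
activity = y[:, None] + 1.0 * rng.standard_normal((n, 2))

modalities = {"face": face, "posture": posture, "activity": activity}

# One GP classifier per modality, giving per-channel class labels.
experts = {name: GaussianProcessClassifier().fit(X, y)
           for name, X in modalities.items()}

# Mixing weights standing in for the hidden combination variable;
# fixed and uniform here, whereas the paper learns them probabilistically.
weights = {name: 1.0 / len(experts) for name in experts}

def fuse(inputs):
    """Weighted mixture of per-modality posteriors P(y = interested | x_p)."""
    p = np.zeros(n)
    for name, X in inputs.items():
        p += weights[name] * experts[name].predict_proba(X)[:, 1]
    return p

fused = fuse(modalities)
acc = float(np.mean((fused > 0.5) == y))
```

A practical advantage of this decomposition is graceful handling of missing channels: if a sensor drops out, its term can simply be omitted from the mixture and the remaining weights renormalized.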
Type Conference
Year 2005
Where ACM Multimedia (MM)
Authors Ashish Kapoor, Rosalind W. Picard