ACII 2005, Springer

Gesture-Based Affective Computing on Motion Capture Data

Abstract. This paper presents research that uses full-body skeletal movements, captured with video-based sensor technology developed by Vicon Motion Systems, to train a machine to identify different human emotions. The Vicon system uses a series of six cameras to track lightweight markers placed at various points on the body in 3D space, digitizing movement into x, y, and z displacement data. Gestural data were collected from five subjects depicting four emotions: sadness, joy, anger, and fear. Experimental results with different machine learning techniques show that automatic classification accuracy on these data ranges from 84% to 92%, depending on how it is calculated. To put these automatic classification results into perspective, a user study on human perception of the same data was conducted, yielding an average classification accuracy of 93%.
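The abstract describes classifying emotions from x, y, and z marker-displacement data. As a minimal illustrative sketch (not the paper's actual method or features), one might summarize each gesture trajectory with simple statistics such as mean speed and spatial extent, then classify with a nearest-centroid rule; all names and data below are hypothetical:

```python
import math

def features(traj):
    """Summarize a trajectory (list of (x, y, z) marker positions over time)
    with two illustrative statistics: mean frame-to-frame speed and the
    maximum displacement from the starting position."""
    speeds = [math.dist(a, b) for a, b in zip(traj, traj[1:])]
    extent = max(math.dist(p, traj[0]) for p in traj)
    return (sum(speeds) / len(speeds), extent)

def train_centroids(labelled):
    """labelled: dict mapping emotion label -> list of trajectories.
    Returns the mean feature vector (centroid) per emotion."""
    cents = {}
    for label, trajs in labelled.items():
        feats = [features(t) for t in trajs]
        cents[label] = tuple(sum(f[i] for f in feats) / len(feats)
                             for i in range(2))
    return cents

def classify(traj, cents):
    """Assign the emotion whose centroid is nearest in feature space."""
    f = features(traj)
    return min(cents, key=lambda label: math.dist(f, cents[label]))
```

A large, fast synthetic gesture would land near a "joy"-like centroid trained on fast, expansive trajectories, while a small, slow one would land near a "sadness"-like centroid; the paper itself evaluates several machine learning techniques rather than this toy rule.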
Type: Conference
Year: 2005
Where: ACII
Authors: Asha Kapur, Ajay Kapur, Naznin Virji-Babul, George Tzanetakis, Peter F. Driessen