Real-time classification of evoked emotions using facial feature tracking and physiological responses

We present automated, real-time models built with machine learning algorithms that use videotapes of subjects' faces in conjunction with physiological measurements to predict rated emotion (trained coders' second-by-second assessments of sadness or amusement). Input consisted of videotapes of 41 subjects watching emotionally evocative films, along with measures of their cardiovascular activity, somatic activity, and electrodermal responding. We built algorithms based on points extracted from the subjects' faces as well as on their physiological responses. Strengths of the current approach are (1) we assess the real behavior of subjects watching emotional videos rather than actors making facial poses, (2) the training data allow us to predict both the emotion type (amusement versus sadness) and the intensity level of each emotion, and (3) we provide a direct comparison between person-specific, gender-specific, and general models. Results demonstrated good fits for the mode...
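As a rough illustration of the pipeline the abstract describes (tracked facial points concatenated with physiological channels and fed to a supervised classifier that predicts second-by-second ratings), here is a minimal sketch using scikit-learn. The feature dimensions, label coding, synthetic data, and choice of a random forest are assumptions for illustration only, not the authors' implementation.

```python
# Minimal, illustrative sketch (assumptions, not the paper's method):
# classify per-second emotion ratings from facial-landmark + physiological features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

n_seconds = 2000          # hypothetical number of rated one-second frames
n_face_coords = 2 * 22    # hypothetical (x, y) coordinates of tracked facial points
n_physio = 4              # e.g., cardiovascular, somatic, electrodermal channels

# Synthetic stand-ins for the video-derived and sensor-derived measurements
face_features = rng.normal(size=(n_seconds, n_face_coords))
physio_features = rng.normal(size=(n_seconds, n_physio))
X = np.hstack([face_features, physio_features])

# Illustrative label coding: 0 = neutral, 1 = amusement, 2 = sadness
y = rng.integers(0, 3, size=n_seconds)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The same setup can be refit on one subject's data (person-specific), on one gender's data (gender-specific), or on all subjects (general) to mirror the comparison the abstract mentions.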
Type: Journal
Year: 2008
Where: IJMMS
Authors: Jeremy N. Bailenson, Emmanuel D. Pontikakis, Iris B. Mauss, James J. Gross, Maria E. Jabon, Cendri A. C. Hutcherson, Clifford Nass, Oliver John