
IEMOCAP: interactive emotional dyadic motion capture database

Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communication. To facilitate such investigations, this paper describes a new corpus named the "interactive emotional dyadic motion capture database" (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory (SAIL) at the University of Southern California (USC). This database was recorded from ten actors in dyadic sessions with markers on the face, head, and hands, which provide detailed information about their facial expression and hand movements during scripted and spontaneous spoken communication scenarios. The actors performed selected emotional scripts and also improvised hypothetical scenarios designed to elicit specific types of emotions (happiness, anger, sadness, frustration and neutral state). The corpus contains approximately twelve hours of data. The detailed motion capture informati...
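As a rough illustration of how a corpus of this kind might be consumed programmatically, the sketch below parses a per-session annotation file that maps each spoken turn to one of the elicited emotion categories (happiness, anger, sadness, frustration, neutral). The file layout, field order, and regular expression here are assumptions made for illustration only, not the corpus's documented release format.

```python
import re
from pathlib import Path

# Assumed (hypothetical) line layout for one annotation file:
#   [start - end]  TURN_ID  LABEL
# e.g.  [6.29 - 8.23]  Ses01F_impro01_F000  neu
LINE_RE = re.compile(
    r"^\[(?P<start>[\d.]+) - (?P<end>[\d.]+)\]\s+"
    r"(?P<turn>\S+)\s+(?P<label>\w+)"
)

def load_labels(path):
    """Collect (turn_id, label, start_sec, end_sec) tuples from one file."""
    records = []
    for line in Path(path).read_text().splitlines():
        m = LINE_RE.match(line)
        if m:  # skip headers and any lines that do not match the layout
            records.append(
                (m["turn"], m["label"], float(m["start"]), float(m["end"]))
            )
    return records
```

With turn-level tuples like these, utterances can be grouped by label or aligned against the corresponding audio and motion-capture streams for downstream analysis.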
Type Journal
Year 2008
Where Language Resources and Evaluation (LRE)
Authors Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth Narayanan