Sciweavers

14 search results - page 1 / 3
» Recognizing emotions in dialogues with acoustic and lexical ...
ACL
2004
Predicting Student Emotions in Computer-Human Tutoring Dialogues
We examine the utility of speech and lexical features for predicting student emotions in computer-human spoken tutoring dialogues. We first annotate student turns for negative, neu...
Diane J. Litman, Katherine Forbes-Riley
ICMCS
2005
IEEE
Visual/Acoustic Emotion Recognition
Recognizing and understanding a person’s emotion is known as one of the most important issues in human-computer interaction. In this paper, we present a multimodal system tha...
Cheng-Yao Chen, Yue-Kai Huang, Perry Cook
INTERSPEECH
2010
Acoustic feature analysis in speech emotion primitives estimation
We recently proposed a family of robust linear and nonlinear estimation techniques for recognizing the three emotion primitives
Dongrui Wu, Thomas D. Parsons, Shrikanth S. Naraya...
JMM2
2007
Lexical Structure for Dialogue Act Recognition
This paper deals with automatic dialogue act (DA) recognition in Czech. Dialogue acts are sentence-level labels that represent different states of a dialogue, such as questio...
Pavel Král, Christophe Cerisara, Jana Kleck...
IEAAIE
2004
Springer
Recognition of Emotional States in Spoken Dialogue with a Robot
For flexible interactions between a robot and humans, we address the automatic recognition of human emotions during the interaction, such as embarrassment, pleasure, and af...
Kazunori Komatani, Ryosuke Ito, Tatsuya Kawahara, ...