ICMCS 2006, IEEE

Combined Gesture-Speech Analysis and Speech Driven Gesture Synthesis

Multimodal speech and speaker modeling and recognition are widely accepted as vital aspects of state-of-the-art human-machine interaction systems. While correlations between speech and lip motion, as well as between speech and facial expressions, are widely studied, relatively little work has been done to investigate the correlations between speech and gesture. Detection and modeling of the head, hand, and arm gestures of a speaker have been studied extensively, and these gestures have been shown to carry linguistic information; a typical example is the head gesture accompanying "yes/no". In this study, the correlation between gestures and speech is investigated. In speech signal analysis, keyword spotting and prosodic accent event detection have been performed. In gesture analysis, hand positions and parameters of global head motion are used as features. The detection of gestures is based on discrete predesignated symbol sets, which are manually labeled during the training phase. The gesture-speech correl...
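To illustrate the kind of gesture-speech correlation analysis the abstract describes, the sketch below cross-correlates a frame-level speech feature stream (e.g., a prosodic energy or pitch contour) with a gesture feature stream (e.g., global head-motion magnitude) over a range of frame lags. This is a minimal, hypothetical example — the feature names, lag range, and synthetic data are assumptions for illustration, not the feature extraction or modeling pipeline used in the paper.

```python
import numpy as np

def cross_correlation_lags(speech_feat, gesture_feat, max_lag):
    """Normalized cross-correlation between a speech feature stream and a
    gesture feature stream for frame lags in [-max_lag, max_lag].
    A negative best lag here means the gesture stream trails the speech
    stream. Illustrative sketch only, not the paper's method."""
    # Z-normalize both streams so the dot product approximates Pearson r.
    s = (speech_feat - speech_feat.mean()) / (speech_feat.std() + 1e-9)
    g = (gesture_feat - gesture_feat.mean()) / (gesture_feat.std() + 1e-9)
    n = len(s)
    corrs = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = s[lag:], g[:n - lag]
        else:
            a, b = s[:n + lag], g[-lag:]
        corrs[lag] = float(np.dot(a, b) / len(a))
    return corrs

# Synthetic check: make the gesture stream a copy of the speech stream
# delayed by 3 frames; the correlation peak should land on that delay.
rng = np.random.default_rng(0)
speech = rng.standard_normal(500)
gesture = np.roll(speech, 3)          # gesture lags speech by 3 frames
corrs = cross_correlation_lags(speech, gesture, max_lag=10)
best_lag = max(corrs, key=corrs.get)  # -> -3 under this lag convention
```

In a real gesture-speech study the streams would come from actual feature extractors (prosodic accent events on the speech side, hand-position and head-motion parameters on the gesture side), and the lag of peak correlation would indicate how gesture timing relates to speech.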
Added: 11 Jun 2010
Updated: 11 Jun 2010
Type: Conference
Year: 2006
Where: ICMCS
Authors: Mehmet Emre Sargin, Oya Aran, Alexey Karpov, Ferda Ofli, Yelena Yasinnik, Stephen Wilson, Engin Erzin, Yücel Yemez, A. Murat Tekalp