Expressive Gesture Model for Humanoid Robot

Abstract. This paper presents an expressive gesture model that generates communicative gestures accompanying speech for the humanoid robot Nao. The research focuses mainly on the expressivity of robot gestures and their coordination with speech. To reach this objective, we have extended our existing virtual agent platform GRETA and adapted it to the robot. Gestural prototypes are described symbolically and stored in a gestural database, called a lexicon. Given a set of intentions and emotional states to communicate, the system selects corresponding gestures from the robot lexicon. The selected gestures are then planned to synchronize with speech and instantiated as robot joint values, taking into account gestural expressivity parameters such as temporal extension, spatial extension, fluidity, power, and repetitivity. In this paper, we provide a detailed overview of our proposed model.
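The pipeline the abstract describes (intention → lexicon lookup → expressivity-parameterized instantiation) can be sketched in miniature as below. This is a hypothetical illustration, not the paper's implementation: the lexicon entries, parameter names (`spatial_ext`, `temporal_ext`, `repetitions`), and scaling rules are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    """A symbolic gesture prototype (illustrative fields, not the paper's format)."""
    name: str
    amplitude: float   # nominal spatial amplitude, normalized to 0..1
    duration: float    # nominal stroke duration in seconds

# Toy gestural "lexicon": communicative intention -> gesture prototype.
LEXICON = {
    "greeting": Gesture("wave", amplitude=0.6, duration=1.0),
    "emphasis": Gesture("beat", amplitude=0.3, duration=0.4),
}

def instantiate(intention, spatial_ext=1.0, temporal_ext=1.0, repetitions=1):
    """Select a gesture for an intention and apply expressivity parameters.

    Spatial extension scales amplitude (clamped to the reachable range),
    temporal extension stretches the stroke, and repetitivity repeats it.
    """
    proto = LEXICON[intention]
    amplitude = min(1.0, proto.amplitude * spatial_ext)
    stroke_duration = proto.duration * temporal_ext
    return {
        "gesture": proto.name,
        "amplitude": amplitude,
        "duration": stroke_duration * repetitions,
        "strokes": repetitions,
    }

# An expansive, slightly faster double wave:
plan = instantiate("greeting", spatial_ext=1.2, temporal_ext=0.8, repetitions=2)
```

A real system would additionally align the stroke phase with speech timing and convert the resulting trajectory into Nao joint values; this sketch only shows how expressivity parameters modulate a selected prototype.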
Le Quoc Anh, Catherine Pelachaud
Added 12 Dec 2011
Updated 12 Dec 2011
Type Conference
Year 2011
Where ACII