Facilitating multiparty dialog with gaze, gesture, and speech

We study how synchronized gaze, gesture, and speech rendered by an embodied conversational agent can influence the flow of conversations in multiparty settings. We review a computational framework for turn taking that provides the foundation for tracking and communicating intentions to hold, release, or take control of the conversational floor. We then present details of the implementation of the approach in an embodied conversational agent and describe experiments with the system in a shared task setting. Finally, we discuss results showing how the verbal and non-verbal cues used by the avatar can shape the dynamics of multiparty conversation.
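The turn-taking framework summarized above tracks intentions to hold, release, or take the conversational floor and renders them as coordinated gaze, gesture, and speech. As a rough illustration only, the Python sketch below shows one way such floor-control intentions might be represented and selected; the class names, observation features, and decision rule are assumptions for illustration, not the implementation described by Bohus and Horvitz.

```python
from enum import Enum, auto
from dataclasses import dataclass

# Illustrative sketch: all names and the decision rule are assumptions,
# not the framework actually implemented in the paper.

class FloorAction(Enum):
    HOLD = auto()      # keep the floor (e.g., avert gaze, keep speaking)
    RELEASE = auto()   # yield the floor (e.g., gaze at addressee, fall silent)
    TAKE = auto()      # claim the floor (e.g., lean in, begin speaking)
    NONE = auto()      # no floor action this cycle

@dataclass
class Observation:
    agent_has_floor: bool    # does the avatar currently hold the floor?
    agent_wants_floor: bool  # does the dialog manager want the avatar to speak?
    someone_speaking: bool   # is any human participant currently speaking?

def choose_floor_action(obs: Observation) -> FloorAction:
    """Map the tracked conversational state to a floor-control intention."""
    if obs.agent_has_floor:
        return FloorAction.HOLD if obs.agent_wants_floor else FloorAction.RELEASE
    if obs.agent_wants_floor and not obs.someone_speaking:
        return FloorAction.TAKE
    return FloorAction.NONE

# Example: the avatar holds the floor but has nothing left to say,
# so it signals a release of the floor.
print(choose_floor_action(Observation(True, False, False)))  # FloorAction.RELEASE
```

In a system of this kind, each selected intention would then drive the avatar's multimodal output; for instance, a RELEASE might be rendered as gazing at the intended next speaker while falling silent.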
Type Conference
Year 2010
Where ICMI
Authors Dan Bohus, Eric Horvitz