HRI 2009, ACM

Visual attention in spoken human-robot interaction

Psycholinguistic studies of situated language processing have revealed that gaze in the visual environment is tightly coupled with both spoken language comprehension and production. It has also been established that interlocutors monitor the gaze of their partners, a phenomenon called "joint attention", as a further means of facilitating mutual understanding. We hypothesise that human-robot interaction will benefit when the robot's language-related gaze behaviour is similar to that of people, potentially providing the user with valuable non-verbal information concerning the robot's intended message or the robot's successful understanding. We report findings from two eye-tracking experiments demonstrating (1) that human gaze is modulated by both the robot's speech and gaze, and (2) that human comprehension of robot speech is improved when the robot's real-time gaze behaviour is similar to that of humans. Categories and Subject Descriptors I.2.9 [Artificial Intelligen...
Maria Staudte, Matthew W. Crocker
Added 19 May 2010
Updated 19 May 2010
Type Conference
Year 2009
Where HRI
Authors Maria Staudte, Matthew W. Crocker