Sciweavers

4 search results - page 1 / 1
» Visual attention in spoken human-robot interaction
HRI 2009, ACM
Visual attention in spoken human-robot interaction
Psycholinguistic studies of situated language processing have revealed that gaze in the visual environment is tightly coupled with both spoken language comprehension and production...
Maria Staudte, Matthew W. Crocker
IWANN 2009, Springer
Integrating Graph-Based Vision Perception to Spoken Conversation in Human-Robot Interaction
In this paper we present the integration of graph-based visual perception with spoken conversation in human-robot interaction. The proposed architecture has a dialogue manager as the...
Wendy Aguilar, Luis A. Pineda
ICMI 2010, Springer
Focusing computational visual attention in multi-modal human-robot interaction
Identifying verbally and non-verbally referred-to objects is an important aspect of human-robot interaction. Most importantly, it is essential to achieve a joint focus of attention...
Boris Schauerte, Gernot A. Fink
HRI 2006, ACM
Working with robots and objects: revisiting deictic reference for achieving spatial common ground
Robust joint visual attention is necessary for achieving a common frame of reference between humans and robots interacting multimodally in order to work together on real-world spatial...
Andrew G. Brooks, Cynthia Breazeal