Sciweavers

37 search results (page 6 of 8) for "Using proxemics to evaluate human-robot interaction"
HRI 2009 · ACM
Footing in human-robot conversations: how robots might shape participant roles using gaze cues
During conversations, speakers establish their and others’ participant roles (who participates in the conversation and in what capacity)—or “footing” as termed by Goffman...
Bilge Mutlu, Toshiyuki Shiwa, Takayuki Kanda, Hiro...
HRI 2010 · ACM
Lead me by the hand: evaluation of a direct physical interface for nursing assistant robots
When a user is in close proximity to a robot, physical contact becomes a potentially valuable channel for communication. People often use direct physical contact to guide a pers...
Tiffany L. Chen, Charles C. Kemp
HRI 2011 · ACM
Learning to interpret pointing gestures with a time-of-flight camera
Pointing gestures are a common and intuitive way to draw somebody’s attention to a certain object. While humans can easily interpret robot gestures, the perception of human beha...
David Droeschel, Jörg Stückler, Sven Beh...
HRI 2010 · ACM
Multimodal interaction with an autonomous forklift
We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with ...
Andrew Correa, Matthew R. Walter, Luke Fletcher, J...
CHI 2009 · ACM
A biologically inspired approach to learning multimodal commands and feedback for human-robot interaction
In this paper we describe a method that enables a robot to learn how a user gives it commands and feedback through speech, prosody, and touch. We propose a biologically inspired approac...
Anja Austermann, Seiji Yamada