Sciweavers

146 search results - page 11 / 30
Query: A Gesture Based Interface for Human-Robot Interaction
AROBOTS 2002
Multi-Modal Interaction of Human and Home Robot in the Context of Room Map Generation
In robotics, human-robot interaction has recently received considerable attention. In this paper, we describe a multi-modal system for generating a map of the environment...
Saeed Shiry Ghidary, Yasushi Nakata, Hiroshi Saito...
FGR 2004 (IEEE)
Real-Time Pointing Gesture Recognition for an Immersive Environment
We present an algorithm for the real-time detection and interpretation of pointing gestures performed with one or both arms. The pointing gestures are used as an intuitive tracki...
Roland Kehl, Luc J. Van Gool
HCI 2009
Interactive Demonstration of Pointing Gestures for Virtual Trainers
While interactive virtual humans are becoming widely used in education, training and the delivery of instructions, building the animations required for such interactive chara...
Yazhou Huang, Marcelo Kallmann
ICMI 2005 (Springer)
Inferring Body Pose Using Speech Content
Untethered multimodal interfaces are more attractive than tethered ones because they allow more natural and expressive interaction. Such interfaces usually require robust vision...
Sy Bor Wang, David Demirdjian
HRI 2009 (ACM)
CALLY: the cell-phone robot with affective expressions
This poster describes CALLY, a cell-phone robot with which we are exploring the roles of facial and gestural expressions of robotic products in human-computer interaction...
Ji-Dong Yim, Christopher D. Shaw