ECCV 2004, Springer

Real-Time Person Tracking and Pointing Gesture Recognition for Human-Robot Interaction

In this paper, we present our approach for visual tracking of the head, hands, and head orientation. Given the images provided by a calibrated stereo camera, color and disparity information are integrated into a multi-hypothesis tracking framework in order to find the 3D positions of the respective body parts. Based on the hands' motion, an HMM-based approach is applied to recognize pointing gestures. We show experimentally that gesture recognition performance can be improved significantly by using visually obtained information about head orientation as an additional feature. Our system aims at applications in the field of human-robot interaction, where it is important to perform run-on recognition in real time, to allow for the robot's egomotion, and not to rely on manual initialization.
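The recognizer described in the abstract is HMM-based: candidate gesture segments are scored against trained models and classified by likelihood. As a loose illustration only (not the authors' models, features, or parameters), the sketch below scores a toy 1-D "hand speed" sequence under two hand-picked Gaussian-emission HMMs, a "pointing" model with a slow-fast-slow state sequence and a "rest" model, using the forward algorithm in log space; all model parameters here are invented for the example.

```python
import numpy as np

def forward_log_likelihood(obs, log_pi, log_A, means, var):
    """Forward algorithm in log space for an HMM with 1-D Gaussian emissions.

    obs:    sequence of scalar observations
    log_pi: log initial-state probabilities, shape (S,)
    log_A:  log transition matrix, shape (S, S)
    means:  per-state emission means, shape (S,)
    var:    shared emission variance (scalar)
    """
    def log_emission(x):
        # log N(x; mean_s, var) for every state s
        return -0.5 * (np.log(2 * np.pi * var) + (x - means) ** 2 / var)

    log_alpha = log_pi + log_emission(obs[0])
    for x in obs[1:]:
        # alpha_t(j) = emission_j(x) * sum_i alpha_{t-1}(i) * A[i, j]
        log_alpha = log_emission(x) + np.logaddexp.reduce(
            log_alpha[:, None] + log_A, axis=0)
    return np.logaddexp.reduce(log_alpha)

# Toy 3-state left-to-right models over an invented 1-D hand-speed feature.
log_pi = np.log([0.98, 0.01, 0.01])
log_A = np.log([[0.60, 0.39, 0.01],
                [0.01, 0.60, 0.39],
                [0.01, 0.01, 0.98]])
pointing_means = np.array([0.1, 0.8, 0.1])   # slow - fast - slow stroke
rest_means = np.array([0.1, 0.1, 0.1])       # always slow
var = 0.05

# A hand-speed profile that rises and falls, as in a pointing stroke.
obs = np.array([0.1, 0.2, 0.7, 0.9, 0.7, 0.2, 0.1])
ll_point = forward_log_likelihood(obs, log_pi, log_A, pointing_means, var)
ll_rest = forward_log_likelihood(obs, log_pi, log_A, rest_means, var)
print("pointing" if ll_point > ll_rest else "rest")
```

In the paper's setting the observation vectors and trained model parameters differ; the point of the sketch is only the classify-by-likelihood decision rule, to which further features (such as head orientation) can be appended as extra observation dimensions.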
Kai Nickel, Rainer Stiefelhagen
Added 01 Jul 2010
Updated 01 Jul 2010