ICPR
2006
IEEE

Human-Robot Interaction by Whole Body Gesture Spotting and Recognition

An intelligent robot must interact naturally with humans, and visual interpretation of gestures can help accomplish natural Human-Robot Interaction (HRI). Previous HRI research has focused on issues such as hand gestures, sign language, and command gesture recognition. For HRI to operate naturally, however, automatic recognition of whole-body gestures is required. This is a challenging problem, because describing and modeling meaningful gesture patterns from whole-body motion is a complex task. This paper presents a new method for recognizing whole-body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationships between a dozen body parts in 3D. Each feature vector is then mapped to a codeword for the gesture HMMs. To spot key gestures accurately, a sophisticated method of designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relat...
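The abstract outlines a pipeline in which a 3D pose is encoded as angular features and then quantized to a codeword for discrete gesture HMMs. A minimal sketch of those two steps is shown below; the joint names, the choice of angle triples, and the codebook are illustrative assumptions, not the paper's actual feature set.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by 3D points a-b-c."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def pose_to_features(joints):
    """Encode a pose (dict of 3D joint positions) as a vector of joint
    angles. The two elbow angles here are a stand-in for the paper's
    richer whole-body feature set (assumption)."""
    triples = [("shoulder_r", "elbow_r", "wrist_r"),
               ("shoulder_l", "elbow_l", "wrist_l")]
    return np.array([joint_angle(joints[a], joints[b], joints[c])
                     for a, b, c in triples])

def quantize(feature, codebook):
    """Map a feature vector to the index of its nearest codebook
    vector -- the codeword fed to a discrete-observation HMM."""
    dists = np.linalg.norm(codebook - feature, axis=1)
    return int(np.argmin(dists))
```

In a real system the codebook would be learned (e.g. by k-means over training features), and the resulting codeword sequences would train one HMM per key gesture plus the garbage model used for spotting.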
A-Yeon Park, Hee-Deok Yang, Seong-Whan Lee
Type Conference
Year 2006
Where ICPR
Authors A-Yeon Park, Hee-Deok Yang, Seong-Whan Lee