IUI 2012, ACM

Continuous recognition of one-handed and two-handed gestures using 3D full-body motion tracking sensors

In this paper we present a new bimanual markerless gesture interface for 3D full-body motion tracking sensors, such as the Kinect. Our interface uses a probabilistic algorithm to incrementally predict users' intended one-handed and two-handed gestures while they are still being articulated. It supports scale- and translation-invariant recognition of arbitrarily defined gesture templates in real time. The interface supports two ways of gesturing commands in thin air to displays at a distance. First, users can use one-handed and two-handed gestures to directly issue commands. Second, users can use their non-dominant hand to modulate single-hand gestures. Our evaluation shows that the system recognizes one-handed and two-handed gestures with an accuracy of 92.7%–96.2%.

Author Keywords: Gesture recognition, motion tracking, wall-sized displays

ACM Classification Keywords: I.5.5 Pattern Recognition: Implementation (Interactive systems)

General Terms: Experimentation, Human Factors
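The abstract does not spell out the recognition algorithm, but the two properties it names, incremental prediction of a partially articulated gesture and scale- and translation-invariant template matching, can be illustrated with a minimal sketch. This is not the authors' method; it is a rough approximation under assumed design choices (centroid/bounding-box normalization, prefix matching against templates, and a softmax over negative path distances), with hypothetical helper names `normalize`, `path_distance`, and `predict`:

```python
import math

def normalize(points):
    """Translate a path to its centroid and scale by its bounding box,
    giving translation- and scale-invariant coordinates (assumed scheme)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / span, (y - cy) / span) for x, y in points]

def path_distance(a, b):
    """Mean pointwise Euclidean distance over the shared prefix."""
    n = min(len(a), len(b))
    return sum(math.dist(a[i], b[i]) for i in range(n)) / n

def predict(partial, templates):
    """Return a probability distribution over template names for a
    partially articulated gesture: each template is truncated to the
    length observed so far, both paths are normalized, and negative
    distances are converted to probabilities with a softmax."""
    obs = normalize(partial)
    scores = {name: -path_distance(obs, normalize(tmpl[:len(partial)]))
              for name, tmpl in templates.items()}
    m = max(scores.values())
    exps = {k: math.exp(v - m) for k, v in scores.items()}
    z = sum(exps.values())
    return {k: v / z for k, v in exps.items()}
```

Called once per new tracked hand position, `predict` yields a running belief over intended gestures, which is the kind of output an interface needs to act before articulation is complete; a real system would also resample paths to a fixed point count and handle two hands jointly.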
Type Conference paper
Year 2012
Where IUI
Authors Per Ola Kristensson, Thomas Nicholson, Aaron J. Quigley