ISWC 2003, IEEE

Using Multiple Sensors for Mobile Sign Language Recognition

We build upon a constrained, lab-based sign language recognition system with the goal of making it a mobile assistive technology. We examine the use of multiple sensors to disambiguate noisy data and improve recognition accuracy. Our experiment compares the results of training a small gesture vocabulary on noisy vision data, on accelerometer data, and on both data sets combined.
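The combination of sensor streams described above can be illustrated with a minimal feature-level fusion sketch. This is not the authors' pipeline; it only shows the general idea of concatenating synchronized per-frame vision and accelerometer feature vectors into one combined vector before training a recognizer. All names and feature dimensions here are hypothetical.

```python
def fuse_features(vision_frames, accel_frames):
    """Feature-level fusion: concatenate synchronized per-frame
    vision and accelerometer feature vectors.

    Both arguments are lists of equal length, one feature vector
    (list of floats) per frame; the result has one combined
    vector per frame."""
    if len(vision_frames) != len(accel_frames):
        raise ValueError("streams must be frame-aligned")
    return [v + a for v, a in zip(vision_frames, accel_frames)]


# Hypothetical per-frame features: 3-D vision, 3-D accelerometer.
vision = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]
accel = [[1.0, 0.0, 0.1], [0.9, 0.1, 0.2]]
combined = fuse_features(vision, accel)
print(len(combined), len(combined[0]))  # 2 frames, 6-D combined vectors
```

A classifier trained on `combined` sees both modalities at once, which is one simple way a noisy vision estimate can be disambiguated by the accelerometer signal for the same frame.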
Type: Conference
Year: 2003
Where: ISWC
Authors: Helene Brashear, Thad Starner, Paul Lukowicz, Holger Junker