Sciweavers

10 search results - page 2 / 2
» Recognition of Deictic Gestures for Wearable Computing
CHI 2003, ACM
Multimodal 'eyes-free' interaction techniques for wearable devices
Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on na...
Stephen A. Brewster, Joanna Lumsden, Marek Bell, M...
ISWC 2005, IEEE
Using Ultrasonic Hand Tracking to Augment Motion Analysis Based Recognition of Manipulative Gestures
The paper demonstrates how ultrasonic hand tracking can be used to improve the performance of a wearable, accelerometer and gyroscope based activity recognition system. Specifica...
Georg Ogris, Thomas Stiefmeier, Holger Junker, Pau...
CSE 2009, IEEE
Performance Analysis of an HMM-Based Gesture Recognition Using a Wristwatch Device
Interaction with mobile devices that are intended for everyday use is challenging, since such systems are continuously optimized towards small outlines. Watches are a particularl...
Roman Amstutz, Oliver Amft, Brian French, Asim Sma...
IUI 2012, ACM
Airwriting: demonstrating mobile text input by 3D-space handwriting
We demonstrate our airwriting interface for mobile hands-free text entry. The interface enables a user to input text into a computer by writing in the air, like on an imaginary blac...
Christoph Amma, Tanja Schultz
MOBILITY 2009, ACM
A pervasive gesture-driven augmented reality prototype using wireless sensor body area networks
This paper describes the prototype implementation of a pervasive, wearable augmented reality (AR) system based on a full-body motion-capture system using low-power wireless sensors...
Peter Barrie, Andreas Komninos, Oleksii Mandrychen...