Sciweavers

30 search results - page 1 / 6
» Optically sensing tongue gestures for computer input
UIST
2009
ACM
Optically sensing tongue gestures for computer input
Many patients with paralyzing injuries or medical conditions retain the use of their cranial nerves, which control the eyes, jaw, and tongue. While researchers have explored eye-t...
T. Scott Saponas, Daniel Kelly, Babak A. Parviz, D...
ISWC
1998
IEEE
A Novel Method for Joint Motion Sensing on a Wearable Computer
Some wearable computing applications require sensing devices that detect the deflection of joints during human motion. Presented here is a novel non-invasive technique for measuri...
Aaron Toney
CHI
2011
ACM
Your noise is my command: sensing gestures using the body as an antenna
Touch sensing and computer vision have made human-computer interaction possible in environments where keyboards, mice, or other handheld implements are not available or desirable. ...
Gabe Cohn, Daniel Morris, Shwetak N. Patel, Desney...
CHI
2006
ACM
Cooperative gestures: multi-user gestural interactions for co-located groupware
Multi-user, touch-sensing input devices create opportunities for the use of cooperative gestures: multi-user gestural interactions for single display groupware. Cooperative gestu...
Meredith Ringel Morris, Anqi Huang, Andreas Paepck...
ICDE
2012
IEEE
Data3 - A Kinect Interface for OLAP Using Complex Event Processing
Motion sensing input devices like Microsoft’s Kinect offer an alternative to traditional computer input devices like keyboards and mice. Daily new applications using this in...
Steffen Hirte, Andreas Seifert, Stephan Baumann, D...