Sciweavers

47 search results (page 5 / 10) for: Multimodal 'eyes-free' interaction techniques for wearable d...
IUI 2005 (ACM)
Multimodal new vocabulary recognition through speech and handwriting in a whiteboard scheduling application
Our goal is to automatically recognize and enroll new vocabulary in a multimodal interface. To accomplish this, our technique aims to leverage the mutually disambiguating aspects o...
Edward C. Kaiser
TEI 2010 (ACM)
Coming to grips with the objects we grasp: detecting interactions with efficient wrist-worn sensors
The use of a wrist-worn sensor that is able to read nearby RFID tags and the wearer's gestures has been suggested frequently as a way to both detect the objects we interact w...
Eugen Berlin, Jun Liu, Kristof Van Laerhoven, Bern...
UIST 1994 (ACM)
Extending a Graphical Toolkit for Two-handed Interaction
Multimodal interaction combines input from multiple sensors, such as pointing devices or speech recognition systems, to achieve more fluid and natural interaction. Two-hand...
Stéphane Chatty
RIAO 2007
Multimodal Segmentation of Lifelog Data
A personal lifelog of visual and audio information can be very helpful as a human memory augmentation tool. The SenseCam, a passive wearable camera, used in conjunction with an iR...
Aiden R. Doherty, Alan F. Smeaton, Keansub Lee, Da...
ICMI 2003 (Springer)
Capturing user tests in a multimodal, multidevice informal prototyping tool
Interaction designers are increasingly faced with the challenge of creating interfaces that incorporate multiple input modalities, such as pen and speech, and span multiple device...
Anoop K. Sinha, James A. Landay