Sciweavers

47 search results - page 4 / 10
» Multimodal 'eyes-free' interaction techniques for wearable d...
ICMI
2004
Springer
ICARE software components for rapidly developing multimodal interfaces
Although several real multimodal systems have been built, their development remains a difficult task. In this paper we address this problem of development of multimodal inte...
Jullien Bouchet, Laurence Nigay, Thierry Ganille
ISWC
2000
IEEE
What Shall We Teach Our Pants?
If a wearable device can register what the wearer is currently doing, it can anticipate and adjust its behavior to avoid redundant interaction with the user. However, the relevanc...
Kristof Van Laerhoven, Ozan Cakmakci
CHI
2004
ACM
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
ARTMED
2008
MOPET: A context-aware and user-adaptive wearable system for fitness training
Objective: Cardiovascular disease, obesity, and lack of physical fitness are increasingly common and negatively affect people's health, requiring medical assistance and decre...
Fabio Buttussi, Luca Chittaro
ISWC
2003
IEEE
Unsupervised, Dynamic Identification of Physiological and Activity Context in Wearable Computing
Context-aware computing describes the situation where a wearable / mobile computer is aware of its user’s state and surroundings and modifies its behavior based on this informat...
Andreas Krause, Daniel P. Siewiorek, Asim Smailagi...