Sciweavers

47 search results - page 4 / 10
» Multimodal 'eyes-free' interaction techniques for wearable d...
ICMI
2004
Springer
ICARE software components for rapidly developing multimodal interfaces
Although several real multimodal systems have been built, their development remains a difficult task. In this paper we address the problem of developing multimodal inte...
Jullien Bouchet, Laurence Nigay, Thierry Ganille
ISWC
2000
IEEE
What Shall We Teach Our Pants?
If a wearable device can register what the wearer is currently doing, it can anticipate and adjust its behavior to avoid redundant interaction with the user. However, the relevanc...
Kristof Van Laerhoven, Ozan Cakmakci
CHI
2004
ACM
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay
ARTMED
2008
MOPET: A context-aware and user-adaptive wearable system for fitness training
Objective: Cardiovascular disease, obesity, and lack of physical fitness are increasingly common and negatively affect people's health, requiring medical assistance and decre...
Fabio Buttussi, Luca Chittaro
ISWC
2003
IEEE
Unsupervised, Dynamic Identification of Physiological and Activity Context in Wearable Computing
Context-aware computing describes the situation where a wearable or mobile computer is aware of its user's state and surroundings and modifies its behavior based on this informat...
Andreas Krause, Daniel P. Siewiorek, Asim Smailagi...