SASO 2008 (IEEE)

Pervasive Self-Learning with Multi-modal Distributed Sensors

Truly ubiquitous computing poses new and significant challenges. A huge number of heterogeneous devices will interact to perform complex distributed tasks. One of the key aspects that will condition the role and impact of these new technologies on developed societies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This will enable devices to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. This technology allows different sensors to exchange the knowledge needed to create a model that classifies events happening in the environment. In particular, as a case study, we developed a multi-modal multi-sensor classification system combining data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer sensor learns a Gaussian mixture model of the user beha...
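To give a rough sense of the Gaussian-mixture idea mentioned in the abstract (this is an illustrative sketch, not the authors' actual pipeline): a one-dimensional, two-component mixture fitted with EM can separate low-activity from high-activity accelerometer readings. The feature (magnitude variance) and the two-state split are assumptions made for this example.

```python
import math
import random

def fit_gmm_1d(data, n_iter=50):
    """Fit a two-component 1-D Gaussian mixture with EM.
    Components could represent, e.g., 'still' vs. 'walking'
    accelerometer-magnitude variance (hypothetical feature)."""
    # Initialise means at the data extremes; equal weights, unit variances.
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: per-sample responsibilities of each component.
        resp = []
        for x in data:
            p = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against degenerate variance
    return w, mu, var

def classify(x, w, mu, var):
    """Return the index of the most likely component for sample x."""
    scores = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
              / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
    return scores.index(max(scores))

# Synthetic variance samples: low values ~ still, high values ~ walking.
random.seed(0)
data = ([random.gauss(0.2, 0.05) for _ in range(100)]
        + [random.gauss(2.0, 0.3) for _ in range(100)])
w, mu, var = fit_gmm_1d(data)
print(classify(0.25, w, mu, var), classify(1.9, w, mu, var))
```

In the paper's setting, labels for such components would come from a second modality (the camera) rather than being hand-assigned, which is what makes the training unsupervised from the accelerometer's point of view.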
Added 01 Jun 2010
Updated 01 Jun 2010
Type Conference
Year 2008
Where SASO
Authors Nicola Bicocchi, Marco Mamei, Andrea Prati, Rita Cucchiara, Franco Zambonelli