MOBISYS 2009, ACM
SoundSense: scalable sound sensing for people-centric applications on mobile phones

Top-end mobile phones include a number of specialized sensors (e.g., accelerometer, compass, GPS) and general-purpose sensors (e.g., microphone, camera) that enable new people-centric sensing applications. Perhaps the most ubiquitous and unexploited sensor on mobile phones is the microphone: a powerful sensor capable of making sophisticated inferences about human activity, location, and social events from sound. In this paper, we exploit this untapped sensor not in the context of human communication but as an enabler of new sensing applications. We propose SoundSense, a scalable framework for modeling sound events on mobile phones. SoundSense is implemented on the Apple iPhone and represents the first general-purpose sound sensing system specifically designed to work on resource-limited phones. The architecture and algorithms are designed for scalability, and SoundSense uses a combination of supervised and unsupervised learning techniques to classify both general sound types (e.g., m...
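As a rough illustration of the kind of frame-level sound inference the abstract describes, the sketch below computes two lightweight acoustic features (root-mean-square energy and zero-crossing rate) over a frame of samples and applies a toy decision rule. This is not SoundSense's actual pipeline; the function names and thresholds (`silence_rms`, `noisy_zcr`) are made-up placeholders for illustration only.

```python
import math

def frame_features(samples):
    """Compute two cheap features commonly used for coarse sound
    classification: RMS energy and zero-crossing rate (ZCR)."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Fraction of adjacent sample pairs that cross zero.
    zcr = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    ) / (n - 1)
    return rms, zcr

def coarse_label(rms, zcr, silence_rms=0.01, noisy_zcr=0.3):
    """Toy threshold rule; the real system trains classifiers rather
    than hand-picking cutoffs like these."""
    if rms < silence_rms:
        return "silence"
    if zcr > noisy_zcr:
        return "noisy/ambient"
    return "tonal (e.g., voice or music)"
```

In a real on-phone system, features like these would be computed per frame over the microphone stream and fed to trained supervised models (for known sound types) plus unsupervised clustering (for recurring, unlabeled sounds), which is the combination the abstract alludes to.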
Added 25 Nov 2009
Updated 25 Nov 2009
Type Conference
Year 2009
Where MOBISYS
Authors Hong Lu, Wei Pan, Nicholas D. Lane, Tanzeem Choudhury, Andrew T. Campbell