Auditory Context Awareness in Wearable Computing

Brian Clarkson, Nitin Sawhney, Alex Pentland


We describe a system for obtaining environmental context through audio, for use in applications and user interfaces. We are interested in identifying specific auditory events, such as speakers, cars, and closing doors, and auditory scenes, such as an office, a supermarket, or a busy street. Our goal is a system that runs in real time and is robust across a variety of real-world environments. The current system detects and classifies events and scenes using an HMM framework, built around an adaptive structure that relies on unsupervised training to segment sound scenes.
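The classification scheme the abstract describes can be illustrated with a minimal sketch: train (or hand-specify) one HMM per auditory scene, then label an incoming feature sequence with the scene whose model assigns it the highest likelihood, computed by the forward algorithm. The scene names, symbol alphabet, and model parameters below are illustrative assumptions, not the paper's actual features or models.

```python
import math

def _logsumexp(xs):
    # Numerically stable log(sum(exp(x))) for log-space recursions.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log_likelihood(obs, pi, A, B):
    """Forward algorithm in log space: log P(obs | HMM with pi, A, B)."""
    n = len(pi)
    alpha = [math.log(pi[i] * B[i][obs[0]]) for i in range(n)]
    for o in obs[1:]:
        alpha = [_logsumexp([alpha[j] + math.log(A[j][i]) for j in range(n)])
                 + math.log(B[i][o])
                 for i in range(n)]
    return _logsumexp(alpha)

# Toy per-scene HMMs over quantized audio-feature symbols (0 = quiet, 1 = loud).
# Each entry is (initial distribution pi, transition matrix A, emission matrix B);
# the "office" model mostly emits quiet symbols, the "street" model loud ones.
SCENES = {
    "office": ([0.9, 0.1], [[0.9, 0.1], [0.2, 0.8]], [[0.8, 0.2], [0.6, 0.4]]),
    "street": ([0.9, 0.1], [[0.9, 0.1], [0.2, 0.8]], [[0.2, 0.8], [0.4, 0.6]]),
}

def classify(obs):
    """Label a feature sequence with the scene model of highest likelihood."""
    return max(SCENES, key=lambda s: log_likelihood(obs, *SCENES[s]))
```

For example, `classify([0, 0, 1, 0, 0, 0])` returns `"office"`, since a mostly-quiet sequence is far more probable under the office model's emission distributions. The real system would replace the toy discrete symbols with continuous audio features and learn the model parameters from data.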
