This contribution presents our work towards a system that autonomously guides the user's visual attention to important information (e.g., traffic situation or in-car system st...
This paper describes an on-going research project at the MIT Media Lab, exploring the use of auditory I/O as a primary interaction modality for wearable computing. Nomadic Radio i...
The paper presents the design of a mobile assistant for runners. We propose a visual and auditory user interface for a mobile assistant, called Mobota. The system supports navigatio...
This paper reports an experiment into the design of crossmodal icons which can provide an alternative form of output for mobile devices using audio and tactile modalities to commu...
Auditory menus have the potential to make devices that use visual menus accessible to a wide range of users. Visually impaired users could especially benefit from the auditory fee...