
Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies

Auditory display research for driving has mainly focused on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation tasks or verbal tasks such as reading digit strings). The present study used a dual task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., compressed speech) and a spindex (i.e., a speech index that used brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized song list of 150 song titles, rendered as an auditory menu, while they concurrently played a simple perceptual-motor ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Both performance of the primary task (success rate of ...
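
The spindex and spearcon cues described in the abstract can be pictured with a short sketch. The following Python snippet is only an illustration of the spindex idea as defined above, not the authors' implementation; play_cue and speak are hypothetical placeholders for a real audio/TTS backend.

# Illustrative sketch only (Python): spindex-style cues on an alphabetized menu.
# play_cue and speak are hypothetical stand-ins for a real audio/TTS backend.

def play_cue(letter: str) -> None:
    """Hypothetical: play a very short pre-recorded cue for this letter."""
    print(f"[spindex cue] {letter}")

def speak(title: str) -> None:
    """Hypothetical: render the title with text-to-speech (or as a spearcon,
    i.e., a time-compressed spoken version of the title)."""
    print(f"[TTS] {title}")

def navigate(songs: list[str], target_index: int) -> None:
    """Scroll from the top of the list to target_index, playing a brief
    positional cue for every item passed and full speech only at the target."""
    for i, title in enumerate(songs[: target_index + 1]):
        play_cue(title[0].upper())   # brief cue: the item's initial letter
        if i == target_index:
            speak(title)             # full announcement for the selected item

songs = sorted(["Across the Universe", "Blackbird", "Come Together", "Help!"])
navigate(songs, target_index=2)

The intent of such cues, as the abstract describes, is that during fast scrolling the listener hears only short positional cues rather than every full title, so position in a long list can be judged quickly.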
Added: 28 May 2010
Updated: 28 May 2010
Type: Conference
Year: 2009
Where: AUTOMOTIVEUI
Publisher: ACM
Authors: Myounghoon Jeon, Benjamin K. Davison, Michael A. Nees, Jeff Wilson, Bruce N. Walker