

Crossmodal spatial location: initial experiments

This paper describes an alternative form of interaction for mobile devices using crossmodal output. The aim of our work is to investigate the equivalence of audio and tactile displays so that the same messages can be presented in either form. Initial experiments show that spatial location can be perceived as equivalent in both the auditory and tactile modalities. Results show that participants map presented 3D audio positions to tactile body positions most effectively on the waist when mobile, and that significantly more errors are made when using the ankle or wrist. This paper compares the results from a static and a mobile experiment on crossmodal spatial location and outlines the most effective ways to use this crossmodal output in a mobile context.

Author Keywords: Tactons, Earcons, spatial location, mobility

ACM Classification Keywords: H5.2 User Interfaces: Auditory (non-speech) Feedback, Haptic I/O, Interaction Styles.
Eve E. Hoggan, Stephen A. Brewster
Type: Conference
Year: 2006
Where: NORDICHI