Sciweavers

709 search results - page 28 / 142
» A user interface framework for multimodal VR interactions
CHI 2005, ACM
Conversing with the user based on eye-gaze patterns
Motivated by and grounded in observations of eye-gaze patterns in human-human dialogue, this study explores using eye-gaze patterns in managing human-computer dialogue. We develop...
Pernilla Qvarfordt, Shumin Zhai
CHI 2009, ACM
City browser: developing a conversational automotive HMI
This paper introduces City Browser, a prototype multimodal, conversational, spoken language interface for automotive navigational aid and information access. A study designed to e...
Alexander Gruenstein, Bruce Mehler, Bryan Reimer, ...
KI 2007, Springer
Semantic Graph Visualisation for Mobile Semantic Web Interfaces
Information visualisation benefits from the Semantic Web: multimodal mobile interfaces to the Semantic Web offer access to complex knowledge and information structures. Natural l...
Daniel Sonntag, Philipp Heim
BCSHCI 2008
Exploring multimodal robotic interaction through storytelling for Aphasics
In this poster, we propose the design of a multimodal robotic interaction mechanism intended to be used by aphasics for storytelling. Through limited physical interaction,...
Omar Mubin, Abdullah Al Mahmud
CHI 2009, ACM
A biologically inspired approach to learning multimodal commands and feedback for human-robot interaction
In this paper we describe a method to enable a robot to learn how a user gives commands and feedback to it by speech, prosody and touch. We propose a biologically inspired approac...
Anja Austermann, Seiji Yamada