Sciweavers

709 search results - page 54 / 142
Search: A user interface framework for multimodal VR interactions
IUI 2003, ACM
Intelligent dialog overcomes speech technology limitations: the SENECa example
We present a primarily speech-based user interface to a wide range of entertainment, navigation and communication applications for use in vehicles. The multimodal dialog enables t...
Wolfgang Minker, Udo Haiber, Paul Heisterkamp, Sve...
VR 2008, IEEE
Assessing the Effects of Orientation and Device on 3D Positioning
We present two studies to assess which physical factors of various input devices influence 3D object movement tasks. In particular, we evaluate the factors that seem to make the m...
Robert J. Teather, Wolfgang Stürzlinger
MHCI 2009, Springer
Expectations for user experience in haptic communication with mobile devices
The haptic modality – the sense of touch – is used only to a very limited extent in current human-computer interaction. Especially in mobile communication, the haptic modality could prov...
Jani Heikkinen, Thomas Olsson, Kaisa Vää...
WWW 2009, ACM
Raising semantics to the user level for dynamic and interactive SOA-based portals
In this paper, we describe the fully dynamic semantic portal we implemented, integrating Semantic Web technologies and Service Oriented Architecture (SOA). The goals of the portal...
Jean-Sébastien Brunner, Patrick Gatellier
CORR 2010, Springer
Gaze and Gestures in Telepresence: multimodality, embodiment, and roles of collaboration
This paper proposes a controlled experiment to further investigate the usefulness of gaze awareness and gesture recognition in the support of collaborative work at a distance. We ...
Mauro Cherubini, Rodrigo de Oliveira, Nuria Oliver...