Sciweavers

709 search results - page 36 / 142
» A user interface framework for multimodal VR interactions
ICMI
2004
Springer
189 views · Biometrics
A multimodal learning interface for sketch, speak and point creation of a schedule chart
We present a video demonstration of an agent-based test bed application for ongoing research into multi-user, multimodal, computer-assisted meetings. The system tracks a two perso...
Edward C. Kaiser, David Demirdjian, Alexander Grue...
AUTOMOTIVEUI
2009
ACM
Towards a flexible UI model for automotive human-machine interaction
In this paper, we present an approach for creating user interfaces from abstract representations for the automotive domain. The approach is based on transformations between different user ...
Guido M. de Melo, Frank Honold, Michael Weber, Mar...
VR
2002
IEEE
149 views · Virtual Reality
Tension Based 7-DOF Force Feedback Device: SPIDAR-G
In this paper, we demonstrate a new intuitive force-feedback device for advanced VR applications. Force feedback for the device is tension based and is characterized by ...
Seahak Kim, Shoichi Hasegawa, Yasuharu Koike, Mako...
TSD
2007
Springer
ECAF: Authoring Language for Embodied Conversational Agents
Abstract. An Embodied Conversational Agent (ECA) is a user interface metaphor that allows information to be communicated naturally during human-computer interaction in synergic modality...
Ladislav Kunc, Jan Kleindienst
LREC
2010
206 views · Education
DICIT: Evaluation of a Distant-talking Speech Interface for Television
The EC-funded project DICIT developed distant-talking interfaces for interactive TV. The final DICIT prototype system processes multimodal user input via speech and remote control...
Timo Sowa, Fiorenza Arisio, Luca Cristoforetti