Sciweavers

262 search results - page 19 / 53 » The Integrality of Speech in Multimodal Interfaces
LTCONF 2007 (Springer)
Spoken Language Interface for Mobile Devices
In this paper, we present a set of optimizations for a spoken language interface for mobile devices that can improve the recognition accuracy and user interaction experience. A com...
João Freitas, António Calado, Maria ...
ACL 2012
Probabilistic Integration of Partial Lexical Information for Noise Robust Haptic Voice Recognition
This paper presents a probabilistic framework that combines multiple knowledge sources for Haptic Voice Recognition (HVR), a multimodal input method designed to provide efficient...
Khe Chai Sim
JIRS 2008
A Multi-Modal Haptic Interface for Virtual Reality and Robotics
In this paper we present an innovative haptic device that combines electro-tactile stimulation with force and visual feedback in order to improve the perception of a virtu...
Michele Folgheraiter, Giuseppina C. Gini, Dario L....
COLING 2002
Robust Interpretation of User Requests for Text Retrieval in a Multimodal Environment
We describe a parser for robust and flexible interpretation of user utterances in a multi-modal system for web search in newspaper databases. Users can speak or type, and they can...
Alexandra Klein, Estela Puig-Waldmüller, Hara...
SAMT 2007 (Springer)
A Constraint-Based Graph Visualisation Architecture for Mobile Semantic Web Interfaces
Abstract. Multimodal and dialogue-based mobile interfaces to the Semantic Web offer access to complex knowledge and information structures. We explore more fine-grained co-ordina...
Daniel Sonntag, Philipp Heim