Sciweavers

262 search results - page 32 / 53
» The Integrality of Speech in Multimodal Interfaces
LREC
2010
AhoTransf: A Tool for Multiband Excitation Based Speech Analysis and Modification
In this paper we present AhoTransf, a tool that enables analysis, visualization, modification and synthesis of speech. AhoTransf integrates a speech signal analysis model with a g...
Ibon Saratxaga, Inmaculada Hernáez, Eva Nav...
AFRIGRAPH
2001
ACM
A gesture processing framework for multimodal interaction in virtual reality
This article presents a gesture detection and analysis framework for modelling multimodal interactions. It is particularly designed for use in Virtual Reality (VR) applications...
Marc Erich Latoschik
CORR
2010
Springer
Gaze and Gestures in Telepresence: multimodality, embodiment, and roles of collaboration
This paper proposes a controlled experiment to further investigate the usefulness of gaze awareness and gesture recognition in the support of collaborative work at a distance. We ...
Mauro Cherubini, Rodrigo de Oliveira, Nuria Oliver...
HHCI
2000
The Effective Combination of Haptic and Auditory Textural Information
With the increasing availability and quality of auditory and haptic means of interaction, it is not unusual to incorporate many modalities in interfaces rather than the purely vis...
Marilyn Rose McGee, Philip D. Gray, Stephen A. Bre...
MM
2004
ACM
Interactive manipulation of replay speed while listening to speech recordings
Today’s interfaces for time-scaled audio replay have limitations, especially regarding highly interactive tasks such as skimming and searching, which require quick temporary spee...
Wolfgang Hürst, Tobias Lauer, Georg Götz