Sciweavers

100 search results for "Signaling emotion in tagclouds" (page 17 of 20)
TSD 2007 (Springer)
ECAF: Authoring Language for Embodied Conversational Agents
Abstract. The Embodied Conversational Agent (ECA) is a user-interface metaphor that allows information to be communicated naturally during human-computer interaction in a synergic modality...
Ladislav Kunc, Jan Kleindienst
TSD 2010 (Springer)
Expressive Gibberish Speech Synthesis for Affective Human-Computer Interaction
In this paper we present our study on expressive gibberish speech synthesis as a means for affective communication between computing devices, such as a robot or an avatar, and thei...
Selma Yilmazyildiz, Lukas Latacz, Wesley Mattheyse...
ICASSP 2011 (IEEE)
Kernel cross-modal factor analysis for multimodal information fusion
This paper presents a novel approach for multimodal information fusion. The proposed method is based on kernel cross-modal factor analysis (KCFA), in which the optimal transformat...
Yongjin Wang, Ling Guan, Anastasios N. Venetsanopo...
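The abstract above only names the technique, and the snippet is truncated, so the following is a minimal illustrative sketch of the linear cross-modal factor analysis (CFA) step that KCFA generalizes, not the authors' implementation. KCFA applies the same construction to kernel-mapped features; all function and variable names here are assumptions for illustration.

```python
import numpy as np

def cfa_fusion(X, Y, k=10):
    """Minimal linear cross-modal factor analysis (CFA) sketch.

    X: (n_samples, dx) features from modality A (e.g. audio)
    Y: (n_samples, dy) features from modality B (e.g. video)
    Returns a fused (n_samples, 2k) representation.

    CFA seeks orthogonal transforms Wx, Wy that maximise the coupling
    between the two modalities; they come from the SVD of X^T Y.
    The kernel variant (KCFA) applies the same idea to kernel-mapped
    features instead of the raw X and Y.
    """
    # Centre each modality
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)

    # SVD of the cross-covariance gives the coupled subspaces
    U, s, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
    Wx, Wy = U[:, :k], Vt[:k, :].T

    # Project each modality and concatenate (feature-level fusion)
    return np.hstack([Xc @ Wx, Yc @ Wy])

# Illustrative usage with random stand-in features
rng = np.random.default_rng(0)
audio_feats = rng.normal(size=(200, 40))
video_feats = rng.normal(size=(200, 60))
fused = cfa_fusion(audio_feats, video_feats, k=8)
print(fused.shape)  # (200, 16)
```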
ICASSP 2011 (IEEE)
Sparse non-negative decomposition of speech power spectra for formant tracking
Many works on speech processing have dealt with auto-regressive (AR) models for spectral envelope and formant frequency estimation, mostly focusing on the estimation of the AR par...
Jean-Louis Durrieu, Jean-Philippe Thiran
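The truncated abstract describes decomposing speech power spectra non-negatively to track formants. Below is a generic sketch of that idea, a dictionary of single-resonance (formant-like) power spectra decomposed with non-negative least squares; the frequency grid, bandwidth, and helper names are assumptions and do not reproduce the authors' exact model.

```python
import numpy as np
from scipy.optimize import nnls

def resonance_atom(freq_hz, bandwidth_hz, n_fft=512, fs=16000):
    """Power spectrum of a single two-pole (formant-like) resonance."""
    r = np.exp(-np.pi * bandwidth_hz / fs)      # pole radius from bandwidth
    theta = 2 * np.pi * freq_hz / fs            # pole angle from centre frequency
    w = np.linspace(0, np.pi, n_fft // 2 + 1)   # analysis frequencies
    z = np.exp(-1j * w)
    h = 1.0 / (1 - 2 * r * np.cos(theta) * z + (r ** 2) * z ** 2)
    return np.abs(h) ** 2

# Dictionary of candidate formant atoms on an assumed 50 Hz grid
fs, n_fft = 16000, 512
cand_freqs = np.arange(200, 5000, 50)
D = np.stack([resonance_atom(f, 100.0, n_fft, fs) for f in cand_freqs], axis=1)

def formant_candidates(power_frame, n_formants=3):
    """Non-negative decomposition of one power-spectrum frame over D.

    nnls enforces non-negativity; keeping only the largest weights acts
    as a crude sparsity constraint. Returns the centre frequencies of
    the most active atoms as formant candidates.
    """
    weights, _ = nnls(D, power_frame)
    top = np.argsort(weights)[::-1][:n_formants]
    return np.sort(cand_freqs[top])

# Illustrative usage on a synthetic frame with resonances near 500/1500/2500 Hz
frame = sum(resonance_atom(f, 100.0, n_fft, fs) for f in (500, 1500, 2500))
print(formant_candidates(frame))
```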
CHI 2010 (ACM)
Brain, body and bytes: psychophysiological user interaction
The human brain and body are prolific signal generators. Recent technologies and computing techniques allow us to measure, process and interpret these signals. We can now infer su...
Audrey Girouard, Erin Treacy Solovey, Regan L. Man...