Sciweavers

35 search results - page 4 / 7
» Interpretation of User Evaluation for Emotional Speech Synth...
AIHC
2007
Springer
13 years 12 months ago
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic
CHI
2008
ACM
14 years 6 months ago
Interactional empowerment
We propose that an interactional perspective on how emotion is constructed, shared and experienced, may be a good basis for designing affective interactional systems that do not i...
Anna Ståhl, Jarmo Laaksolahti, Kristina Höök...
AIHC
2007
Springer
13 years 12 months ago
Affect Detection and an Automated Improvisational AI Actor in E-Drama
Enabling machines to understand the emotions and feelings of human users from their natural-language textual input during interaction is a challenging issue in Human Computing. Our w...
Li Zhang, Marco Gillies, John A. Barnden, Robert J...
UIST
1992
ACM
13 years 9 months ago
Tools for Building Asynchronous Servers to Support Speech and Audio Applications
Distributed client/server models are becoming increasingly prevalent in multimedia systems and advanced user interface design. A multimedia application, for example, may play and r...
Barry Arons
UM
2010
Springer
13 years 9 months ago
Ranking Feature Sets for Emotion Models Used in Classroom Based Intelligent Tutoring Systems
Abstract. Recent progress has been made by using sensors with Intelligent Tutoring Systems in classrooms in order to predict the affective state of student users. If tutors are a...
David G. Cooper, Kasia Muldner, Ivon Arroyo, Bever...