Sciweavers

70 search results - page 8 / 14
» Emotional Facial Expression Classification for Multimodal Us...
GW
2005
Springer
From Acoustic Cues to an Expressive Agent
This work proposes a new way of providing feedback on expressivity in music performance. Starting from studies on the expressivity of music performance, we developed a system in wh...
Maurizio Mancini, Roberto Bresin, Catherine Pelach...
TSD
2007
Springer
ECAF: Authoring Language for Embodied Conversational Agents
Abstract. The Embodied Conversational Agent (ECA) is a user interface metaphor that allows information to be communicated naturally during human-computer interaction in synergic modality...
Ladislav Kunc, Jan Kleindienst
ICMI
2009
Springer
Multimodal inference for driver-vehicle interaction
In this paper we present a novel system for driver-vehicle interaction which combines speech recognition with facial expression recognition to increase intention recognition accura...
Tevfik Metin Sezgin, Ian Davies, Peter Robinson
CHI
2005
ACM
HOMIE: an artificial companion for elderly people
In this paper we present "Homie", an artificial companion for elderly people. Our approach emphasizes amusement and benefit - amusement in the form of entertainment and benef...
Simone Kriglstein, Gunter Wallner
CHI
2007
ACM
Expressing emotion in text-based communication
Our ability to express and accurately assess emotional states is central to human life. The present study examines how people express and detect emotions during text-based communi...
Jeffrey T. Hancock, Christopher Landrigan, Courtne...