Sciweavers

76 search results for "Predicting Subjectivity in Multimodal Conversations" - page 2 of 16
ICIA 2007
Eye Gaze for Attention Prediction in Multimodal Human-Machine Conversation
In a conversational system, determining the user's focus of attention is crucial to its success. Motivated by previous psycholinguistic findings, we are currently e...
Zahar Prasov, Joyce Yue Chai, Hogyeong Jeong
ICMI 2004, Springer
Evaluation of spoken multimodal conversation
Spoken multimodal dialogue systems in which users address face-only or embodied interface agents have been gaining ground in research for some time. Although most systems are still...
Niels Ole Bernsen, Laila Dybkjær
MM 2005, ACM
Multimodal expressive embodied conversational agents
In this paper we present our work toward the creation of a multimodal expressive Embodied Conversational Agent (ECA). Our agent, called Greta, exhibits nonverbal behaviors synchro...
Catherine Pelachaud
ICASSP 2010, IEEE
Predicting interruptions in dyadic spoken interactions
Interruptions occur frequently in spontaneous conversations, and they are often associated with changes in the flow of conversation. Predicting interruption is essential in the d...
Chi-Chun Lee, Shrikanth Narayanan
ICMI 2005, Springer
Contextual recognition of head gestures
Head pose and gesture offer several key conversational grounding cues and are used extensively in face-to-face interaction among people. We investigate how dialog context from an ...
Louis-Philippe Morency, Candace L. Sidner, Christo...