Sciweavers

67 search results - page 2 / 14
Search query: Rapid Detection of Emotion from Human Vocalizations
ICASSP 2011 (IEEE)
Online detection of vocal Listener Responses with maximum latency constraints
When human listeners utter Listener Responses (e.g. back-channels or acknowledgments) such as ‘yeah’ and ‘mmhmm’, interlocutors commonly continue to speak or resume their ...
Daniel Neiberg, Khiet P. Truong
CHI 2008 (ACM)
MySong: automatic accompaniment generation for vocal melodies
We introduce MySong, a system that automatically chooses chords to accompany a vocal melody. A user with no musical experience can create a song with instrumental accompaniment ju...
Ian Simon, Dan Morris, Sumit Basu
ICMI 2007 (Springer)
A survey of affect recognition methods: audio, visual and spontaneous expressions
Automated analysis of human affective behavior has attracted increasing attention from researchers in psychology, computer science, linguistics, neuroscience, and related discipli...
Zhihong Zeng, Maja Pantic, Glenn I. Roisman, Thoma...
IUI 2006 (ACM)
A cognitively based approach to affect sensing from text
Studying the relationship between natural language and affective information as well as assessing the underpinned affective qualities of natural language are becoming crucial for ...
Shaikh Mostafa Al Masum, Helmut Prendinger, Mitsur...
NAACL 2003
Towards Emotion Prediction in Spoken Tutoring Dialogues
Human tutors detect and respond to student emotional states, but current machine tutors do not. Our preliminary machine learning experiments involving transcription, emotion annot...
Diane J. Litman, Katherine Forbes, Scott Silliman