Sciweavers

21 search results - page 3 / 5
» Multimodal data communication for human-robot interactions
HRI
2010
ACM
Investigating multimodal real-time patterns of joint attention in an HRI word learning task
Abstract—Joint attention – the idea that humans make inferences from observable behaviors of other humans by attending to the objects and events that these other humans attend...
Chen Yu, Matthias Scheutz, Paul W. Schermerhorn
GW
1999
Springer
Communicative Rhythm in Gesture and Speech
Motivated by the fundamental role that rhythms apparently play in speech and gestural communication among humans, this study was undertaken to substantiate a biologically motivated model...
Ipke Wachsmuth
LREC
2010
The NOMCO Multimodal Nordic Resource - Goals and Characteristics
This paper presents the multimodal corpora that are being collected and annotated in the Nordic NOMCO project. The corpora will be used to study communicative phenomena such as fe...
Patrizia Paggio, Jens Allwood, Elisabeth Ahlsen, K...
ICMI
2010
Springer
Vlogcast yourself: nonverbal behavior and attention in social media
We introduce vlogs as a type of rich human interaction which is multimodal in nature and suitable for new large-scale behavioral data analysis. The automatic analysis of vlogs is u...
Joan-Isaac Biel, Daniel Gatica-Perez
LRE
2008
IEMOCAP: interactive emotional dyadic motion capture database
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communicati...
Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kaz...