Sciweavers

82 search results - page 12 / 17
» Interpreting Hand-Over-Face Gestures
LRE
2008
IEMOCAP: interactive emotional dyadic motion capture database
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communicati...
Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kaz...
AIEDU
2005
An Evaluation of a Hybrid Language Understanding Approach for Robust Selection of Tutoring Goals
In this paper, we explore the problem of selecting appropriate interventions for students based on an analysis of their interactions with a tutoring system. In the context of the W...
Carolyn Penstein Rosé, Kurt VanLehn
AAI
1999
Animated Agents for Procedural Training in Virtual Reality: Perception, Cognition, and Motor Control
This paper describes Steve, an animated agent that helps students learn to perform physical, procedural tasks. The student and Steve cohabit a three-dimensional, simulated mock-up...
Jeff Rickel, W. Lewis Johnson
CHI
2009
ACM
Musink: composing music through augmented drawing
We focus on the creative use of paper in the music composition process, particularly the interaction between paper and end-user programming. When expressing musical ideas, compose...
Theophanis Tsandilas, Catherine Letondal, Wendy E....
ISMAR
2006
IEEE
"Move the couch where?" : developing an augmented reality multimodal interface
This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtu...
Sylvia Irawati, Scott Green, Mark Billinghurst, An...