
AIED 2007, Springer

Mind and Body: Dialogue and Posture for Affect Detection in Learning Environments

We investigated the potential of automatic detection of a learner’s affective states from posture patterns and dialogue features obtained from an interaction with AutoTutor, an intelligent tutoring system with conversational dialogue. Training and validation data were collected from the sensors in a learning session with AutoTutor, after which the affective states of the learner were rated by the learner, a peer, and two trained judges. Machine learning experiments with several standard classifiers indicated that the dialogue and posture features could individually discriminate between the affective states of boredom, confusion, flow (engagement), and frustration. Our results also indicated that a combination of the dialogue and posture features improves classification accuracy. However, the incremental gains associated with the combination of the two sensors were not sufficient to exhibit superadditivity (i.e., performance superior to an additive combination of the individual channels).
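
The sketch below is not the authors' code or data; it only illustrates, with synthetic placeholder feature arrays and off-the-shelf scikit-learn classifiers, the kind of single-channel versus combined-channel comparison the abstract describes: train standard classifiers on dialogue features alone, posture features alone, and their concatenation, then compare cross-validated accuracy across channels.

# Minimal sketch of a channel-comparison experiment (synthetic data, assumed
# feature dimensions; not the study's actual features or classifiers).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200                                   # hypothetical number of rated observations
dialogue = rng.normal(size=(n, 12))       # placeholder dialogue features
posture = rng.normal(size=(n, 8))         # placeholder posture (pressure) features
labels = rng.integers(0, 4, size=n)       # 0=boredom, 1=confusion, 2=flow, 3=frustration

channels = {
    "dialogue": dialogue,
    "posture": posture,
    "combined": np.hstack([dialogue, posture]),
}

for clf_name, clf in [("logreg", LogisticRegression(max_iter=1000)),
                      ("naive_bayes", GaussianNB())]:
    for ch_name, X in channels.items():
        acc = cross_val_score(clf, X, labels, cv=5, scoring="accuracy").mean()
        print(f"{clf_name:12s} {ch_name:9s} accuracy = {acc:.3f}")

# Superadditivity would require the combined channel to beat an additive
# combination of the individual channels' accuracies; with the random
# placeholder data above, all scores simply hover near chance (0.25).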
Sidney K. D'Mello, Arthur C. Graesser
Type Conference
Year 2007
Where AIED
Authors Sidney K. D'Mello, Arthur C. Graesser