In this paper we introduce a system that automatically adds different types of non-verbal behavior to a given dialogue script between two virtual embodied agents. It allows us to ...
Werner Breitfuss, Helmut Prendinger, Mitsuru Ishiz...
This research proposes a computational framework for generating visual attending behavior in an embodied simulated human agent. Such behaviors directly control eye and head motion...
Embodied agents present an ongoing and challenging agenda for research in multi-modal user interfaces and human-computer interaction. Such agent metaphors will only be widely applicable t...
Currently, state-of-the-art virtual agents lack the ability to display emotion as seen in actual humans, or even in hand-animated characters. One reason for the emotional inexpres...
The goal of the Virtual Humans Project at the University of Southern California’s Institute for Creative Technologies is to enrich virtual training environments with virtual hum...
Patrick G. Kenny, Arno Hartholt, Jonathan Gratch, ...