This paper describes the results of three studies investigating an embodied agent that supports its interaction with the user by gazing at corresponding objects within its close e...
Johann Schrammel, Arjan Geven, Reinhard Sefelin, M...
In this paper we introduce a system that automatically adds different types of non-verbal behavior to a given dialogue script between two virtual embodied agents. It allows us to t...
Werner Breitfuss, Helmut Prendinger, Mitsuru Ishiz...
We study how synchronized gaze, gesture and speech rendered by an embodied conversational agent can influence the flow of conversations in multiparty settings. We review a computa...
We present a virtual reality platform for developing and evaluating embodied models of cognitive development. The platform facilitates structuring of the learning agent, of its vi...