Brain-computer interfaces (BCIs), like any other interaction modality based on physiological signals and body channels (e.g., muscular activity, speech, and gestures), are prone to e...
In this paper we introduce a system that automatically adds different types of non-verbal behavior to a given dialogue script between two virtual embodied agents. It allows us to ...
Werner Breitfuss, Helmut Prendinger, Mitsuru Ishiz...
In this paper we present our work toward the creation of a multimodal expressive Embodied Conversational Agent (ECA). Our agent, called Greta, exhibits nonverbal behaviors synchro...
When working with sample-based media, a performer is managing timelines, loop points, sample parameters and effects parameters. The Slidepipe is a performance controller that give...
We present rush as a recommendation-based interaction and visualization technique for repeated item selection from large data sets on mobile touch screen devices. Proposals and ch...