Sciweavers

587 search results - page 75 / 118 for "A Gesture Interface for Human-Robot-Interaction"
CHI 2010 (ACM)
Manual deskterity: an exploration of simultaneous pen + touch direct input
Manual Deskterity is a prototype digital drafting table that supports both pen and touch input. We explore a division of labor between pen and touch that flows from natural human ...
Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Co...
TSD 2007 (Springer)
ECAF: Authoring Language for Embodied Conversational Agents
An Embodied Conversational Agent (ECA) is a user interface metaphor that allows information to be communicated naturally during human-computer interaction in synergic modality...
Ladislav Kunc, Jan Kleindienst
IUI 2005 (ACM)
Interaction techniques using prosodic features of speech and audio localization
We describe several approaches for using prosodic features of speech and audio localization to control interactive applications. This information can be applied to parameter contr...
Alex Olwal, Steven Feiner
UIST 2005 (ACM)
Physical embodiments for mobile communication agents
This paper describes a physically embodied and animated user interface to an interactive call handling agent, consisting of a small wireless animatronic device in the form of a sq...
Stefan Marti, Chris Schmandt
ICMI 2009 (Springer)
Salience in the generation of multimodal referring acts
Pointing combined with verbal referring is one of the most paradigmatic human multimodal behaviours. The aim of this paper is foundational: to uncover the central notions that are...
Paul Piwek