Sciweavers

Search: "Developing Gestural Input" — 2697 results, page 73 / 540
GRAPHICSINTERFACE
2009
Determining the benefits of direct-touch, bimanual, and multifinger input on a multitouch workstation
Multitouch workstations support direct-touch, bimanual, and multifinger interaction. Previous studies have separately examined the benefits of these three interaction attributes o...
Kenrick Kin, Maneesh Agrawala, Tony DeRose
COLING
2000
Finite-state Multimodal Parsing and Understanding
Multimodal interfaces require effective parsing and understanding of utterances whose content is distributed across multiple input modes. Johnston 1998 presents an approach in wh...
Michael Johnston, Srinivas Bangalore
CHI
2008
ACM
Exploring the use of tangible user interfaces for human-robot interaction: a comparative study
In this paper we suggest the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We discuss the potential benefits of this approach while focusi...
Cheng Guo, Ehud Sharlin
CHI
2006
ACM
Hover widgets: using the tracking state to extend the capabilities of pen-operated devices
We present Hover Widgets, a new technique for increasing the capabilities of pen-based interfaces. Hover Widgets are implemented by using the pen movements above the display surfa...
Tovi Grossman, Ken Hinckley, Patrick Baudisch, Man...
ICMI
2007
Springer
Automated generation of non-verbal behavior for virtual embodied characters
In this paper we introduce a system that automatically adds different types of non-verbal behavior to a given dialogue script between two virtual embodied agents. It allows us to ...
Werner Breitfuss, Helmut Prendinger, Mitsuru Ishiz...