Sciweavers

12 search results - page 2 / 3
» Speech Gesture Interface to a Visual-Computing Environment
CHI 2007 (ACM)
How pairs interact over a multimodal digital table
Co-located collaborators often work over physical tabletops using combinations of expressive hand gestures and verbal utterances. This paper provides the first observations of how...
Edward Tse, Chia Shen, Saul Greenberg, Clifton For...
IUI 2000 (ACM)
Expression constraints in multimodal human-computer interaction
Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the combined use of speech and pointing gestures on a touchscreen. However, present sp...
Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Da...
WSCG 2004
Hand Gesture Recognition for Human-Machine Interaction
Even after more than two decades of input device development, many people still find interaction with computers an uncomfortable experience. Efforts should be made to adapt c...
Elena Sánchez-Nielsen, Luis Antón-Ca...
ISWC 1999 (IEEE)
Smart Sight: A Tourist Assistant System
In this paper, we present our efforts towards developing an intelligent tourist system. The system is equipped with a unique combination of sensors and software. The hardware inclu...
Jie Yang, Weiyi Yang, Matthias Denecke, Alex Waibe...
CIE 2007 (Springer)
Multimodal multiplayer tabletop gaming
There is a large disparity between the rich physical interfaces of co-located arcade games and the generic input devices seen in most home console systems. In this paper we argue ...
Edward Tse, Saul Greenberg, Chia Shen, Clifton For...