Sciweavers

2697 search results - page 31 / 540
» Developing Gestural Input
GRAPHICSINTERFACE
2009
Separability of spatial manipulations in multi-touch interfaces
Multi-touch interfaces allow users to translate, rotate, and scale digital objects in a single interaction. However, this freedom represents a problem when users intend to perform...
Miguel A. Nacenta, Patrick Baudisch, Hrvoje Benko,...
IUI
2000
ACM
Expression constraints in multimodal human-computer interaction
Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the combined use of speech and pointing gestures on a touchscreen. However, present sp...
Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Da...
ICASSP
2011
IEEE
Dynamics of tongue gestures extracted automatically from ultrasound
We describe a system for automatically extracting dynamics of tongue gestures from ultrasound images of the tongue using translational deep belief networks (tDBNs). In tDBNs, a jo...
Jeff Berry, Ian Fasel
CHI
2007
ACM
A motion-based marking menu system
The rapid development of handheld devices is driving the creation of new interaction styles. This paper examines one such technique: using hand motions to control a menu system...
Ian Oakley, Junseok Park
ICCS
2004
Springer
Collaborative Integration of Speech and 3D Gesture for Map-Based Applications
QuickSet [6] is a multimodal system that gives users the capability to create and control map-based collaborative interactive simulations by supporting the simultaneous input from ...
Andrea Corradini