Sciweavers

» The Integrality of Speech in Multimodal Interfaces
IUI
2000
ACM
Expression constraints in multimodal human-computer interaction
Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the use of speech and pointing gestures on a touchscreen. However, present sp...
Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Da...
COLING
1994
Drawing Pictures with Natural Language and Direct Manipulation
A multimodal user interface allows users to communicate with computers through multiple modalities, such as a mouse, a keyboard, or voice, in various combined ways. This paper discus...
Mayumi Hiyoshi, Hideo Shimazu
SIGIR
1999
ACM
SCAN: Designing and Evaluating User Interfaces to Support Retrieval From Speech Archives
Previous examinations of search in textual archives have assumed that users first retrieve a ranked set of documents relevant to their query and then visually scan through these ...
Steve Whittaker, Julia Hirschberg, John Choi, Dona...
NAACL
2007
The CALO Meeting Assistant
The CALO Meeting Assistant is a multimodal meeting-assistant technology that integrates speech, gestures, and other multimodal data collected from multiparty interactions during meetings...
L. Lynn Voss, Patrick Ehlen
IUI
2004
ACM
Speech and sketching for multimodal design
While sketches are commonly and effectively used in the early stages of design, some information is far more easily conveyed verbally than by sketching. In response, we have combi...
Aaron Adler, Randall Davis