Sciweavers

52 search results - page 3 / 11
» The Role of Speech in Multimodal Human-Computer Interaction
ACMDIS 2008 (ACM)
Exploring true multi-user multimodal interaction over a digital table
True multi-user, multimodal interaction over a digital table lets co-located people simultaneously gesture and speak commands to control an application. We explore this design spa...
Edward Tse, Saul Greenberg, Chia Shen, Clifton For...
CHI 2007 (ACM)
How pairs interact over a multimodal digital table
Co-located collaborators often work over physical tabletops using combinations of expressive hand gestures and verbal utterances. This paper provides the first observations of how...
Edward Tse, Chia Shen, Saul Greenberg, Clifton For...
CHI 2002 (ACM)
Multimodal theater: extending low fidelity paper prototyping to multimodal applications
Low-fidelity paper prototyping has proven to be a useful technique for designing graphical user interfaces [1]. Wizard of Oz prototyping for other input modalities, such as sp...
Corey D. Chandler, Gloria Lo, Anoop K. Sinha
MHCI 2009 (Springer)
SiMPE: Fourth Workshop on Speech in Mobile and Pervasive Environments
With the proliferation of pervasive devices and the increase in their processing capabilities, client-side speech processing has emerged as a viable alternative. SiMPE 2009,...
Amit Anil Nanavati, Nitendra Rajput, Alexander I. ...
CHI 2004 (ACM)
ICARE: a component-based approach for the design and development of multimodal interfaces
Multimodal interactive systems support multiple interaction techniques such as the synergistic use of speech, gesture and eye gaze tracking. The flexibility they offer results in ...
Jullien Bouchet, Laurence Nigay