Sciweavers

262 search results - page 43 / 53
» The Integrality of Speech in Multimodal Interfaces
HAPTICS 2002, IEEE
Autostereoscopic and Haptic Visualization for Space Exploration and Mission Design
We have developed a multi-modal virtual environment setup by fusing visual and haptic images, using a new autostereoscopic display and a force-feedback haptic device....
Cagatay Basdogan, Mitchell J. H. Lum, Jose Salcedo...
LCR 1998, Springer
Integrated Task and Data Parallel Support for Dynamic Applications
There is an emerging class of real-time interactive applications that require the dynamic integration of task and data parallelism. An example is the Smart Kiosk, a free-standing ...
James M. Rehg, Kathleen Knobe, Umakishore Ramachan...
ANLP 1997
CommandTalk: A Spoken-Language Interface for Battlefield Simulations
CommandTalk is a spoken-language interface to battlefield simulations that allows the use of ordinary spoken English to create forces and control measures, assign missions to forc...
Robert C. Moore, John Dowding, Harry Bratt, Jean M...
CASES 2003, ACM
A low-power accelerator for the SPHINX 3 speech recognition system
Accurate real-time speech recognition is not currently possible in the mobile embedded space, where natural voice interfaces are clearly needed. The continuous natur...
Binu K. Mathew, Al Davis, Zhen Fang
CHI 1999, ACM
Chat Circles
Although current online chat environments provide new opportunities for communication, they are quite constrained in their ability to convey many important pieces of social inform...
Fernanda B. Viégas, Judith S. Donath