Sciweavers

54 search results - page 2 / 11
» Distributed speech processing in miPad's multimodal user int...
TSD 2005 (Springer)
The Role of Speech in Multimodal Human-Computer Interaction
A natural audio-visual interface between a human user and a machine requires understanding of the user's audio-visual commands. This does not necessarily require full speech and ...
Hynek Hermansky, Petr Fousek, Mikko Lehtonen
ICMI 2003 (Springer)
Capturing user tests in a multimodal, multidevice informal prototyping tool
Interaction designers are increasingly faced with the challenge of creating interfaces that incorporate multiple input modalities, such as pen and speech, and span multiple device...
Anoop K. Sinha, James A. Landay
MHCI 2009 (Springer)
SiMPE: Fourth Workshop on Speech in Mobile and Pervasive Environments
With the proliferation of pervasive devices and the increase in their processing capabilities, client-side speech processing has been emerging as a viable alternative. SiMPE 2009,...
Amit Anil Nanavati, Nitendra Rajput, Alexander I. ...
IUI 2000 (ACM)
Expression constraints in multimodal human-computer interaction
Thanks to recent scientific advances, it is now possible to design multimodal interfaces allowing the use of speech and pointing gestures on a touchscreen. However, present sp...
Sandrine Robbe-Reiter, Noelle Carbonell, Pierre Da...
GW 1999 (Springer)
Communicative Rhythm in Gesture and Speech
Guided by the fundamental role that rhythms apparently play in speech and gestural communication among humans, this study was undertaken to substantiate a biologically motivated model...
Ipke Wachsmuth