Sciweavers

54 search results (page 1 of 11) for "Distributed speech processing in MiPad's multimodal user interface"
TASLP 2002
Distributed speech processing in MiPad's multimodal user interface
This paper describes the main components of MiPad (Multimodal Interactive PAD) and especially its distributed speech processing aspects. MiPad is a wireless mobile PDA prototype th...
Li Deng, Kuansan Wang, Alex Acero, Hsiao-Wuen Hon,...
LREC 2010
DICIT: Evaluation of a Distant-talking Speech Interface for Television
The EC-funded project DICIT developed distant-talking interfaces for interactive TV. The final DICIT prototype system processes multimodal user input by speech and remote control....
Timo Sowa, Fiorenza Arisio, Luca Cristoforetti
SIGIR 1999 (ACM)
SCAN: Designing and Evaluating User Interfaces to Support Retrieval From Speech Archives
Previous examinations of search in textual archives have assumed that users first retrieve a ranked set of documents relevant to their query, and then visually scan through these ...
Steve Whittaker, Julia Hirschberg, John Choi, Dona...
AIHC 2007 (Springer)
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic
LREC 2008
Glossa: a Multilingual, Multimodal, Configurable User Interface
We describe a web-based corpus query system, Glossa, which combines the expressiveness of regular query languages with the user-friendliness of a graphical interface. Since corpus...
Lars Nygaard, Joel Priestley, Anders Nøkles...