Sciweavers

262 search results for "The Integrality of Speech in Multimodal Interfaces"
WWW 2004 (ACM)
A generic UIML vocabulary for device- and modality-independent user interfaces
In this poster we present our work on a User Interface Markup Language (UIML) vocabulary for the specification of device- and modality-independent user interfaces. The work presen...
Rainer Simon, Michael Jank, Florian Wegscheider
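
The vocabulary above targets UIML, an XML language in which an abstract interface structure is kept separate from modality-specific presentation. As a rough illustration of that separation, here is a minimal sketch assembled with Python's standard library; the element names follow the public UIML specification, but the part classes and property values are invented for this example and are not taken from the authors' vocabulary.

    # Illustrative sketch only: a minimal UIML document built with Python's
    # standard library. Element names (<uiml>, <interface>, <structure>,
    # <part>, <style>, <property>) follow the public UIML spec; the concrete
    # part classes and values are invented placeholders.
    import xml.etree.ElementTree as ET

    uiml = ET.Element("uiml")
    interface = ET.SubElement(uiml, "interface")

    # Structure: an abstract widget tree, free of device-specific classes.
    structure = ET.SubElement(interface, "structure")
    top = ET.SubElement(structure, "part", {"id": "Top", "class": "Container"})
    ET.SubElement(top, "part", {"id": "Greeting", "class": "Output"})
    ET.SubElement(top, "part", {"id": "Confirm", "class": "Trigger"})

    # Style: presentation properties attached to abstract parts; a renderer
    # maps them onto a concrete modality (GUI widget, voice prompt, ...).
    style = ET.SubElement(interface, "style")
    prop = ET.SubElement(style, "property",
                         {"part-name": "Greeting", "name": "content"})
    prop.text = "Hello, world"

    print(ET.tostring(uiml, encoding="unicode"))

The point of the structure/style split is that the same abstract parts (Output, Trigger) can be rendered as a label and button in a GUI or as a prompt and confirmation in a voice interface, which is what makes the description device- and modality-independent.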
AI 2007 (Springer)
Head gestures for perceptual interfaces: The role of context in improving recognition
Head pose and gesture offer several conversational grounding cues and are used extensively in face-to-face interaction among people. To recognize visual feedback efficiently, hum...
Louis-Philippe Morency, Candace L. Sidner, Christo...
LREC 2010
Providing Multilingual, Multimodal Answers to Lexical Database Queries
Language users are increasingly turning to electronic resources to address their lexical information needs, due to the convenience of such resources and their ability to simultaneously capture di...
Gerard de Melo, Gerhard Weikum
TASLP 2008
Recognition of Dialogue Acts in Multiparty Meetings Using a Switching DBN
This paper is concerned with the automatic recognition of dialogue acts (DAs) in multiparty conversational speech. We present a joint generative model for DA recognition ...
Alfred Dielmann, Steve Renals
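
The switching DBN itself is beyond a short snippet, so as a point of reference only, here is a much simpler first-order generative model: a toy hidden Markov model over dialogue-act labels, decoded with Viterbi. This is not the authors' model, and every label, probability, and observation below is an invented placeholder; it only makes concrete what a generative model over DA sequences looks like.

    # Illustrative sketch only: NOT the switching DBN of the paper above, but
    # a toy first-order HMM over dialogue-act labels with Viterbi decoding.
    # All labels, probabilities, and words are invented placeholders.
    import numpy as np

    DA_LABELS = ["statement", "question", "backchannel"]     # hidden states
    OBS_VOCAB = {"ok": 0, "what": 1, "yes": 2, "report": 3}  # toy observations

    trans = np.array([[0.7, 0.2, 0.1],    # P(next DA | current DA)
                      [0.3, 0.3, 0.4],
                      [0.5, 0.4, 0.1]])
    emit = np.array([[0.1, 0.1, 0.2, 0.6],  # P(word | DA)
                     [0.1, 0.6, 0.2, 0.1],
                     [0.4, 0.1, 0.4, 0.1]])
    prior = np.array([0.5, 0.3, 0.2])       # P(first DA)

    def viterbi(words):
        """Most likely DA sequence under the toy HMM (log domain)."""
        obs = [OBS_VOCAB[w] for w in words]
        T, N = len(obs), len(DA_LABELS)
        delta = np.full((T, N), -np.inf)     # best log-prob ending in state j
        back = np.zeros((T, N), dtype=int)   # backpointers
        delta[0] = np.log(prior) + np.log(emit[:, obs[0]])
        for t in range(1, T):
            scores = delta[t - 1][:, None] + np.log(trans)  # N x N
            back[t] = scores.argmax(axis=0)
            delta[t] = scores.max(axis=0) + np.log(emit[:, obs[t]])
        path = [int(delta[-1].argmax())]
        for t in range(T - 1, 0, -1):        # trace backpointers
            path.append(int(back[t][path[-1]]))
        return [DA_LABELS[i] for i in reversed(path)]

    print(viterbi(["report", "what", "yes", "ok"]))

A switching DBN generalizes this picture by letting additional hidden variables switch between sub-models over multiple feature streams, rather than tying one emission distribution to each label as the toy HMM does.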
SPEECH 2008
DialogStudio: A workbench for data-driven spoken dialog system development and management
Recently, data-driven speech technologies have been widely used to build speech user interfaces. However, developing and managing data-driven spoken dialog systems are laborious a...
Sangkeun Jung, Cheongjae Lee, Seokhwan Kim, Gary G...