Sciweavers

106 search results - page 3 / 22
» Multimodal event parsing for intelligent user interfaces
AI 2007, Springer
Head gestures for perceptual interfaces: The role of context in improving recognition
Head pose and gesture offer several conversational grounding cues and are used extensively in face-to-face interaction among people. To recognize visual feedback efficiently, hum...
Louis-Philippe Morency, Candace L. Sidner, Christo...
ESWS 2010, Springer
Efficient Semantic Event Processing: Lessons Learned in User Interface Integration
Abstract. Most approaches to application integration require an unambiguous exchange of events. Ontologies can be used to annotate the events exchanged and thus ensure a common und...
Heiko Paulheim
AIHC 2007, Springer
Gaze-X: Adaptive, Affective, Multimodal Interface for Single-User Office Scenarios
This paper describes an intelligent system that we developed to support affective multimodal human-computer interaction (AMM-HCI) where the user’s actions and emotions are modele...
Ludo Maat, Maja Pantic
WWW 2005, ACM
Introducing multimodal character agents into existing web applications
This paper proposes a framework in which end-users can instantaneously modify existing Web applications by introducing multimodal user interfaces. The authors use the IntelligentPa...
Kimihito Ito
IUI 2010, ACM
Usage patterns and latent semantic analyses for task goal inference of multimodal user interactions
This paper describes our work on usage pattern analysis and the development of a latent semantic analysis framework for interpreting multimodal user input consisting of speech and pen ge...
Pui-Yu Hui, Wai Kit Lo, Helen M. Meng