Sciweavers

106 search results (page 2/22) for "Multimodal event parsing for intelligent user interfaces"
ICAT 2006 (IEEE)
An Evaluation of an Augmented Reality Multimodal Interface Using Speech and Paddle Gestures
This paper discusses an evaluation of an augmented reality (AR) multimodal interface that uses combined speech and paddle gestures for interaction with virtual objects in the real ...
Sylvia Irawati, Scott Green, Mark Billinghurst, An...
ACL 1992
The Representation of Multimodal User Interface Dialogues Using Discourse Pegs
The three-tiered discourse representation defined in (Luperfoy, 1991) is applied to multimodal human-computer interface (HCI) dialogues. In the applied system the three tiers are (...
Susann LuperFoy
IUI 2005 (ACM)
A framework for designing intelligent task-oriented augmented reality user interfaces
A task-oriented space can benefit from an augmented reality interface that layers the existing tools and surfaces with useful information to make cooking easier, safer, and more effic...
Leonardo Bonanni, Chia-Hsun Lee, Ted Selker
IUI 2004 (ACM)
Classifying and assessing tremor movements for applications in man-machine intelligent user interfaces
We introduce a new intelligent user interface (IUI) and, also, a new methodology to identify the fatigue state for healthy subjects. The fatigue state is determined by means of a ...
Dan Marius Dobrea, Horia-Nicolai L. Teodorescu
AIIA 1991 (Springer)
Knowledge-Based Media Coordination in Intelligent User Interfaces
Multimodal interfaces combining, e.g., natural language and graphics take advantage both of the individual strengths of each communication mode and of the fact that several modes can ...
Wolfgang Wahlster, Elisabeth André, Winfrie...