Sciweavers

521 search results - page 43 / 105
» Affective multimodal human-computer interaction
CHI 2008, ACM
Reality-based interaction: a framework for post-WIMP interfaces
We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Re...
Robert J. K. Jacob, Audrey Girouard, Leanne M. Hir...
HCI 2009
Did I Get It Right: Head Gestures Analysis for Human-Machine Interactions
This paper presents a system that provides an additional input modality in a multimodal human-machine interaction scenario. In addition to other common input modalities, e.g. speech, we extract he...
Jürgen Gast, Alexander Bannat, Tobias Rehrl, ...
HAPTICS 2005, IEEE
Multi-Modal Perceptualization of Volumetric Data and Its Application to Molecular Docking
In this paper, we present a multi-modal data perceptualization system used to analyze the benefits of augmenting a volume docking problem with other perceptual cues, particularly...
Ross Maciejewski, Seungmoon Choi, David S. Ebert, ...
HAPTICS 2005, IEEE
A Haptic Enabled Multimodal Pre-operative Planner for Hip Arthroplasty
This paper introduces the Multisense idea, with special reference to the use of haptics in the medical field and, in particular, in the planning of total hip replacement surgery...
Silvano Imboden, Marco Petrone, Paolo Quadrani, Ci...
ACHI 2009, IEEE
A Structured Approach to Support 3D User Interface Development
Given its current state of the art, Model-Based UI Development (MBDUI) is able to fulfill the major requirements of desktop and mobile applications, such as form-based user int...
Juan Manuel González-Calleros, Jean Vanderd...