Sciweavers

25 search results, page 2 of 5
Search: Multimodal information fusion for human-robot interaction
METMBS 2004 (Mathematics)
Med-LIFE: A Diagnostic Aid for Medical Imagery
We present a system known as Med-LIFE (Medical application of Learning, Image Fusion, and Exploration), currently under development for medical image analysis. This pipelined syste...
Joshua R. New, Erion Hasanbelliu, Mario Aguilar
DSVIS 2005 (Springer)
Test of the ICARE Platform Fusion Mechanism
Multimodal interactive systems offer a flexibility of interaction that increases their complexity. ICARE is a component-based approach to specify and develop multimodal interfaces...
Sophie Dupuy-Chessa, Lydie du Bousquet, Jullien Bo...
ICMI 2010 (Springer, Biometrics)
Focusing computational visual attention in multi-modal human-robot interaction
Identifying verbally and non-verbally referred-to objects is an important aspect of human-robot interaction. Most importantly, it is essential to achieve a joint focus of attentio...
Boris Schauerte, Gernot A. Fink
AROBOTS 2002
Multi-Modal Interaction of Human and Home Robot in the Context of Room Map Generation
In robotics, the idea of human-robot interaction has been receiving a great deal of attention lately. In this paper, we describe a multi-modal system for generating a map of the environment...
Saeed Shiry Ghidary, Yasushi Nakata, Hiroshi Saito...
AMS 2005 (Springer, Robotics)
Integration of a Sound Source Detection into a Probabilistic-based Multimodal Approach for Person Detection and Tracking
When dealing with human-robot interaction on a real mobile robot, stable methods for people detection and tracking are fundamental features of such a system ...
Robert Brückmann, Andrea Scheidig, Christian ...