Sciweavers

102 search results - page 2 / 21
» A multisensory integration model of human stance control
DAGM 2003, Springer
A Computational Model of Early Auditory-Visual Integration
We introduce a computational model of sensor fusion based on the topographic representations of a "two-microphone and one camera" configuration. Our aim is to perform a robust...
Carsten Schauer, Horst-Michael Gross
ICMI 2004, Springer
A framework for evaluating multimodal integration by humans and a role for embodied conversational agents
One of the implicit assumptions of multi-modal interfaces is that human-computer interaction is significantly facilitated by providing multiple input and output modalities. Surpri...
Dominic W. Massaro
IJRR 2006
Hybrid Control of the Berkeley Lower Extremity Exoskeleton (BLEEX)
The first functional load-carrying and energetically autonomous exoskeleton was demonstrated at U.C. Berkeley, walking at the average speed of 0.9 m/s (2 mph) while carrying a 34 ...
Hami Kazerooni, Ryan Steger, Lihua Huang
ROMAN 2007, IEEE
Real-time acoustic source localization in noisy environments for human-robot multimodal interaction
Interaction between humans involves a plethora of sensory information, both in the form of explicit communication as well as more subtle unconsciously perceived signals. In ord...
Vlad M. Trifa, Ansgar Koene, Jan Morén, Gor...
IROS 2007, IEEE
Integral control of humanoid balance
This paper presents a balance controller that allows a humanoid to recover from large disturbances and still maintain an upright posture. Balance is achieved by integral contro...
Benjamin Stephens