Sciweavers

ICRA 2006, IEEE
Integration of Visual and Inertial Information for Egomotion: a Stochastic Approach
We present a probabilistic framework for visual correspondence, inertial measurements, and egomotion. First, we describe a simple method based on Gabor filters to produce corre...
Justin Domke, Yiannis Aloimonos
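
This first result mentions Gabor filters as the basis for producing correspondence information. The sketch below is illustrative only, not the paper's implementation: it compares multi-orientation Gabor responses between two images to score a candidate correspondence. The function names, kernel parameters, and the cosine-similarity score are assumptions; it assumes OpenCV (cv2) and NumPy, with grayscale float32 input images.

    import cv2
    import numpy as np

    def gabor_bank(ksize=21, sigma=4.0, lambd=10.0, n_orient=8):
        # Small bank of Gabor kernels at evenly spaced orientations.
        thetas = np.linspace(0, np.pi, n_orient, endpoint=False)
        return [cv2.getGaborKernel((ksize, ksize), sigma, t, lambd, gamma=0.5)
                for t in thetas]

    def gabor_responses(img, bank):
        # Stack per-orientation filter responses into an (H, W, n_orient) array.
        return np.dstack([cv2.filter2D(img, cv2.CV_32F, k) for k in bank])

    def correspondence_score(resp1, resp2, p1, p2):
        # Cosine similarity of the local Gabor signatures at pixel p1 in image 1
        # and pixel p2 in image 2; higher means a more plausible correspondence.
        v1 = resp1[p1[1], p1[0], :]
        v2 = resp2[p2[1], p2[0], :]
        return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9))
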
WACV 2005, IEEE
Epipolar Constraints for Vision-Aided Inertial Navigation
This paper describes a new method to improve inertial navigation using feature-based constraints from one or more video cameras. The proposed method lengthens the period of tim...
David D. Diel, Paul DeBitetto, Seth J. Teller
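
As background for the epipolar constraint this entry builds on, here is a small sketch of the standard two-view relation x2^T E x1 = 0 with E = [t]_x R, whose residual a navigation filter could use as a measurement to correct inertial drift. It is not the paper's algorithm; the variable names are assumptions, x1 and x2 are normalized image coordinates, and R, t are the inertially propagated rotation and translation between the two camera frames. Assumes NumPy.

    import numpy as np

    def skew(v):
        # Cross-product (skew-symmetric) matrix [v]_x.
        return np.array([[0.0, -v[2], v[1]],
                         [v[2], 0.0, -v[0]],
                         [-v[1], v[0], 0.0]])

    def epipolar_residual(R, t, x1, x2):
        # Scalar residual that is zero for a perfect feature correspondence
        # under the relative pose (R, t); nonzero values indicate pose error
        # or mismatched features.
        E = skew(t) @ R
        return float(np.append(x2, 1.0) @ E @ np.append(x1, 1.0))
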
IVC 2007
Visual recognition of pointing gestures for human-robot interaction
In this paper, we present an approach for recognizing pointing gestures in the context of human–robot interaction. In order to obtain input features for gesture recognition, we ...
Kai Nickel, Rainer Stiefelhagen

ECCV 2004, Springer
Real-Time Person Tracking and Pointing Gesture Recognition for Human-Robot Interaction
In this paper, we present our approach for visual tracking of head, hands and head orientation. Given the images provided by a calibrated stereo-camera, color and disparity inform...
Kai Nickel, Rainer Stiefelhagen
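
The two Nickel and Stiefelhagen entries above track head and hands in 3D from a calibrated stereo camera. One common cue derived from such tracks is the head-hand line as a pointing direction; the sketch below is illustrative only, with assumed helper names and an assumed angular threshold, and is not the papers' feature set. Assumes NumPy and 3D positions in a common camera or world frame.

    import numpy as np

    def pointing_direction(head_xyz, hand_xyz):
        # Unit vector from the head to the hand (the head-hand line).
        d = np.asarray(hand_xyz, float) - np.asarray(head_xyz, float)
        return d / (np.linalg.norm(d) + 1e-9)

    def points_at(head_xyz, hand_xyz, target_xyz, max_angle_deg=15.0):
        # True if the head-hand ray passes within max_angle_deg of the target.
        ray = pointing_direction(head_xyz, hand_xyz)
        to_target = pointing_direction(head_xyz, target_xyz)
        angle = np.degrees(np.arccos(np.clip(ray @ to_target, -1.0, 1.0)))
        return angle <= max_angle_deg
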
CVPR 2008, IEEE
A mobile vision system for robust multi-person tracking
We present a mobile vision system for multi-person tracking in busy environments. Specifically, the system integrates continuous visual odometry computation with tracking-by-detect...
Andreas Ess, Bastian Leibe, Konrad Schindler, Luc ...
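
The last entry combines visual odometry with tracking-by-detection on a moving platform. The sketch below shows one piece of such a pipeline under assumed interfaces (not the paper's): the camera-to-world pose from odometry is used to map 3D detection positions from the moving camera frame into a fixed world frame, so that data association between frames can run in stable coordinates. Assumes NumPy.

    import numpy as np

    def camera_to_world(R_wc, t_wc, p_cam):
        # R_wc: 3x3 camera-to-world rotation, t_wc: (3,) translation,
        # p_cam: 3D point in the camera frame.
        return R_wc @ np.asarray(p_cam, float) + np.asarray(t_wc, float)

    def detections_to_world(pose, detections_cam):
        # pose: (R_wc, t_wc) from visual odometry for the current frame.
        # detections_cam: list of 3D person positions in the camera frame
        # (e.g., stereo depth at each detection's foot point).
        R_wc, t_wc = pose
        return [camera_to_world(R_wc, t_wc, p) for p in detections_cam]
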