

Epipolar Constraints for Vision-Aided Inertial Navigation

This paper describes a new method to improve inertial navigation using feature-based constraints from one or more video cameras. The proposed method lengthens the period of time during which a human or vehicle can navigate in GPS-deprived environments. Our approach integrates well with existing navigation systems, because we invoke general sensor models that represent a wide range of available hardware. The inertial model includes errors in bias, scale, and random walk. Any purely projective camera and tracking algorithm may be used, as long as the tracking output can be expressed as ray vectors extending from known locations on the sensor body. A modified linear Kalman filter performs the data fusion. Unlike traditional SLAM, our state vector contains only inertial sensor errors related to position. This choice allows uncertainty to be properly represented by a covariance matrix. We do not augment the state with feature coordinates. Instead, image data contributes stochastic epipolar constraints ...
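
As a rough illustrative sketch of the idea in the abstract (not the authors' implementation), the following Python snippet shows how a two-view epipolar (coplanarity) constraint between feature rays can serve as a scalar measurement that corrects inertial position error through a standard linear Kalman update. The function names (epipolar_residual, epipolar_jacobian, kalman_update), the six-element error state, and all numbers in the example are assumptions made for illustration only.

import numpy as np

def epipolar_residual(p1, p2, r1, r2):
    # Coplanarity constraint: the baseline between the two camera centers
    # and the two feature rays should lie in one plane, so the scalar
    # triple product r1 . ((p2 - p1) x r2) is zero for error-free poses.
    t = p2 - p1
    return float(r1 @ np.cross(t, r2))

def epipolar_jacobian(r1, r2):
    # The residual equals (p2 - p1) . (r2 x r1), so its Jacobian with
    # respect to the stacked position errors [dp1, dp2] is [-n, n],
    # where n = r2 x r1.
    n = np.cross(r2, r1)
    return np.hstack([-n, n]).reshape(1, 6)

def kalman_update(x, P, H, z, R):
    # Standard linear Kalman measurement update on the error state.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical example: one tracked feature seen from two INS-predicted
# camera positions corrects a six-element position-error state.
x = np.zeros(6)                                    # [dp1, dp2] error estimate
P = 0.5 * np.eye(6)                                # prior error covariance
p1 = np.array([0.0, 0.0, 0.0])                     # INS-predicted position, frame 1
p2 = np.array([1.0, 0.1, 0.0])                     # INS-predicted position, frame 2
r1 = np.array([0.0, 0.0, 1.0])                     # unit feature ray, frame 1
r2 = np.array([0.05, 0.0, 1.0])
r2 /= np.linalg.norm(r2)                           # unit feature ray, frame 2
z = np.array([epipolar_residual(p1, p2, r1, r2)])  # constraint violation attributed to position error
H = epipolar_jacobian(r1, r2)
R = np.array([[1e-4]])                             # assumed constraint noise variance
x, P = kalman_update(x, P, H, z, R)
print(x)                                           # estimated position errors

In the paper's formulation, the rays may come from any purely projective camera and tracker whose output can be expressed as rays from known points on the sensor body; here they are simply given as constants.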
Added: 25 Jun 2010
Updated: 25 Jun 2010
Type: Conference
Year: 2005
Where: WACV (IEEE)
Authors: David D. Diel, Paul DeBitetto, Seth J. Teller