Real-Time Simultaneous Localisation and Mapping with a Single Camera

Ego-motion estimation for an agile single camera moving through general, unknown scenes becomes a much more challenging problem when real-time performance is required, rather than under the off-line processing conditions in which most successful structure-from-motion work has been achieved. Estimating camera motion from measurements of a continuously expanding set of self-mapped visual features belongs to a class of problems known in the robotics community as Simultaneous Localisation and Mapping (SLAM), and we argue that such real-time mapping research, despite rarely being camera-based, is more relevant here than off-line structure-from-motion methods because of the more fundamental emphasis it places on the propagation of uncertainty. We present a top-down Bayesian framework for single-camera localisation via mapping of a sparse set of natural features, using motion modelling and an information-guided active measurement strategy, in particular addressing the difficult issue of real-...
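The filtering idea behind the abstract can be illustrated with a toy extended-Kalman-filter loop. This is a minimal sketch under heavy assumptions, not the paper's implementation: the state here is a one-dimensional camera position and velocity plus a single static landmark, standing in for the paper's full 3D constant-velocity camera model and 3D feature map. All variable names and the scalar measurement model are invented for illustration; the innovation covariance `S` is the quantity an information-guided active strategy would use to rank candidate measurements.

```python
import numpy as np

# Toy EKF-SLAM state (illustrative only): [camera_x, camera_vx, landmark_x].
# The landmark is static; the camera follows a constant-velocity motion model.

def predict(x, P, Q, dt):
    # Propagate state and covariance through the motion model.
    F = np.array([[1.0, dt, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    # Measurement: landmark position relative to the camera
    # (a 1D stand-in for an image-plane feature observation).
    H = np.array([[-1.0, 0.0, 1.0]])
    S = H @ P @ H.T + R                 # innovation covariance: high S marks an
                                        # informative feature to measure next
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P, S

# One predict-update cycle with arbitrary example numbers.
x = np.array([0.0, 1.0, 5.0])           # camera at 0, moving at 1, landmark at 5
P = np.eye(3)
Q = np.diag([0.01, 0.1, 0.0])           # process noise (landmark is static)
R = np.array([[0.05]])                  # measurement noise

x, P = predict(x, P, Q, dt=0.1)
trace_before = np.trace(P)
x, P, S = update(x, P, np.array([4.8]), R)
```

Measuring a feature shrinks the joint covariance (the trace of `P` drops after the update), which is exactly the uncertainty-propagation bookkeeping that the abstract argues distinguishes SLAM from off-line structure from motion.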
Andrew J. Davison
Type Conference
Year 2003
Where ICCV
Authors Andrew J. Davison