ISMAR 2003 (IEEE)

Real-Time Localisation and Mapping with Wearable Active Vision

We present a general method for real-time, vision-only, single-camera simultaneous localisation and mapping (SLAM) — an algorithm applicable to the localisation of any camera moving through a scene — and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on the fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly registered graphics, but it further permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is the key to the understanding or "management" of a workspace which the robot needs to have in order...
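The core idea the abstract describes — estimating the camera's motion and the positions of mapped features jointly, in one filter — can be illustrated with a toy sketch. The code below is a deliberately simplified 2D, linear analogue (the actual system uses an EKF over a 6-DOF camera with a constant-velocity motion model and 3D point features); the state vector, covariance layout, and noise values here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def slam_step(x, P, u, z, Q, R):
    """One predict/update cycle of a joint camera+map Kalman filter.
    x: state [cam_x, cam_y, f1_x, f1_y, ...]; P: its covariance.
    u: camera motion (dx, dy); z: stacked relative measurements
    (feature position minus camera position) for every feature."""
    n = len(x)
    # Predict: only the camera moves; map points are static.
    x = x.copy()
    x[:2] += u
    P = P + Q
    # Linear measurement model: z_i = feature_i - camera.
    m = (n - 2) // 2
    H = np.zeros((2 * m, n))
    for i in range(m):
        H[2*i:2*i+2, 0:2] = -np.eye(2)
        H[2*i:2*i+2, 2+2*i:4+2*i] = np.eye(2)
    # Standard Kalman update over the *joint* state, so each
    # measurement refines camera and map estimates together.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(n) - K @ H) @ P
    return x, P

# One feature, truly at (3, 1); camera starts at the origin with a
# rough initial map estimate, mirroring "very sparse initial scene
# knowledge" refined as the camera moves.
x = np.array([0.0, 0.0, 2.5, 1.5])
P = np.eye(4)
Q = np.diag([0.01, 0.01, 0.0, 0.0])  # process noise on camera only
R = np.eye(2) * 0.01
rng = np.random.default_rng(0)
cam = np.zeros(2)
for _ in range(20):
    u = np.array([0.1, 0.0])
    cam += u
    z = np.array([3.0, 1.0]) - cam + rng.normal(0.0, 0.1, 2)
    x, P = slam_step(x, P, u, z, Q, R)

# The camera-to-feature offset is what the measurements observe,
# so it converges: x[2:] - x[:2] approaches (3,1) - (2,0) = (1,1).
print(np.round(x[2:] - x[:2], 2))
```

Note the design point this illustrates: because camera and features share one covariance matrix, observing a feature reduces uncertainty in the camera pose and in every correlated feature at once — the property that lets the real system re-fixate on objects that have left the field of view.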
Added: 04 Jul 2010
Updated: 04 Jul 2010
Type: Conference
Year: 2003
Where: ISMAR
Authors: Andrew J. Davison, Walterio W. Mayol-Cuevas, David W. Murray