
TROB
2008

Fusing Monocular Information in Multicamera SLAM

Abstract: This paper explores the possibilities of using monocular simultaneous localization and mapping (SLAM) algorithms in systems with more than one camera. The idea is to combine in a single system the advantages of both monocular vision (bearings-only, infinite-range observations but no instantaneous 3-D information) and stereovision (3-D information up to a limited range). Such a system should be able to instantaneously map nearby objects while still considering the bearing information provided by the observation of remote ones. We do this by considering each camera as an independent sensor rather than the entire set as a monolithic supersensor. The visual data are treated by monocular methods and fused by the SLAM filter. Several advantages naturally arise as interesting possibilities, such as the desynchronization of the firing of the sensors, the use of several unequal cameras, self-calibration, and cooperative SLAM with several independently moving cameras. We validate the a...
Type: Journal
Year: 2008
Where: TROB
Authors: Joan Solà, André Monin, Michel Devy, Teresa Vidal-Calleja