Homing in scale space

Local visual homing is the process of determining the direction of movement required to return an agent to a goal location by comparing the current image with an image taken at the goal, known as the snapshot image. One way of accomplishing visual homing is to compute correspondences between features and then analyze the resulting flow field to determine the correct direction of motion. Typically, strong assumptions must be made in order to compute the home direction from the flow field. For example, it is difficult to locally distinguish translation from rotation, so many authors assume that rotation can be computed by other means (e.g., a magnetic compass). In this paper we present a novel approach to visual homing using scale change information from the Scale Invariant Feature Transform (SIFT), which we also use to compute landmark correspondences. The method described here is able to determine the direction of the goal in the robot's frame of reference, irrespective o...
David Churchill, Andrew Vardy
Type: Conference
Year: 2008
Where: IROS
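To make the idea in the abstract concrete, here is a minimal Python sketch of scale-based homing: SIFT features are matched between the current image and the snapshot, and each matched landmark votes for a movement direction depending on whether its scale has grown (the robot is nearer to that landmark than it was at the goal) or shrunk (the robot is farther away). This is only an illustration, not the authors' implementation; the use of OpenCV, the panoramic assumption that image column maps linearly to bearing, and the simple vector-sum rule are all assumptions made here.

```python
import cv2
import numpy as np

def estimate_home_direction(current_img, snapshot_img):
    """Estimate the home direction (bearing in radians, robot frame) from
    SIFT scale changes between the current view and the snapshot image.

    Assumes panoramic grayscale images in which the column index maps
    linearly to bearing over 360 degrees.
    """
    sift = cv2.SIFT_create()
    kp_cur, des_cur = sift.detectAndCompute(current_img, None)
    kp_snap, des_snap = sift.detectAndCompute(snapshot_img, None)
    if des_cur is None or des_snap is None:
        return 0.0  # no features found; no usable vote

    # Match current-view descriptors against snapshot descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des_cur, des_snap, k=2)

    width = current_img.shape[1]
    home_vec = np.zeros(2)
    for pair in pairs:
        if len(pair) < 2:
            continue
        best, second = pair
        if best.distance > 0.75 * second.distance:  # Lowe's ratio test
            continue
        cur_kp = kp_cur[best.queryIdx]
        snap_kp = kp_snap[best.trainIdx]

        # Bearing of the landmark in the current image (column -> angle).
        bearing = 2.0 * np.pi * cur_kp.pt[0] / width

        # If the feature's scale grew, the robot is nearer to that landmark
        # than it was at the goal, so move away from it; if the scale
        # shrank, move toward it.
        sign = -1.0 if cur_kp.size > snap_kp.size else 1.0
        home_vec += sign * np.array([np.cos(bearing), np.sin(bearing)])

    return float(np.arctan2(home_vec[1], home_vec[0]))
```

A real homing system would also need to account for the rotation between the two views (e.g., via a compass or an image rotation search), which this sketch ignores, as the abstract notes in its discussion of distinguishing translation from rotation.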