On-the-fly Object Modeling while Tracking

To implement a persistent tracker, we build a set of view-dependent object appearance models adaptively and automatically while tracking an object under different viewing angles. This collection of acquired models is indexed with respect to the view sphere. The acquired models aid recovery from tracking failure due to occlusion and changing view angle. In this paper, view-dependent object appearance is represented by intensity patches around detected Harris corners. The intensity patches from a model are matched to the current frame by solving a bipartite linear assignment problem with outlier exclusion and missed inlier recovery. Based on these reliable matches, the change in object rotation, translation and scale is estimated between consecutive frames using Procrustes analysis. The experimental results show good performance using a collection of view-specific patch-based models for detection and tracking of vehicles in low-resolution airborne video.
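The motion-estimation step described above, recovering rotation, translation, and scale from matched point pairs via Procrustes analysis, can be sketched as a least-squares similarity-transform fit. This is a minimal illustration (an Umeyama-style closed-form solution), not the authors' implementation; the function name and point sets below are hypothetical.

```python
import numpy as np

def procrustes_similarity(src, dst):
    """Least-squares similarity transform (scale s, rotation R,
    translation t) mapping src points onto dst points.

    Sketch of the Procrustes step: src/dst are (n, 2) arrays of
    matched corner locations in consecutive frames (hypothetical data).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Center both point sets on their centroids.
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - mu_s, dst - mu_d
    # SVD of the cross-covariance gives the optimal rotation.
    U, S, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    # Optimal scale and translation follow in closed form.
    s = (S * np.diag(D)).sum() / (A ** 2).sum()
    t = mu_d - s * (R @ mu_s)
    return s, R, t

# Usage: recover a known transform from synthetic matches.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = 2.0 * src @ R_true.T + np.array([3.0, -1.0])
s, R, t = procrustes_similarity(src, dst)
```

In the tracker, `src` and `dst` would be the reliable patch correspondences surviving outlier exclusion, and the recovered `(s, R, t)` gives the frame-to-frame change in scale, rotation, and translation.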
Zhaozheng Yin, Robert T. Collins
Added: 12 Oct 2009
Updated: 28 Oct 2009
Type: Conference
Year: 2007
Where: CVPR
Authors: Zhaozheng Yin, Robert T. Collins