CVPR 2008, IEEE

Detection with multi-exit asymmetric boosting

We introduce a generalized representation for a boosted classifier with multiple exit nodes, and propose a training method that combines the idea of propagating scores across boosted classifiers [14, 17] with the use of asymmetric goals [13]. We provide a means for determining the ideal constant asymmetric goal, which is theoretically justified under a conservative bound on the target ROC operating point and empirically near-optimal under the exact bound. Moreover, our method automatically minimizes the number of weak classifiers, avoiding the need to retrain a boosted classifier multiple times to reach the best empirical performance, as conventional methods require. Experimental results show a significant reduction in training time and in the number of weak classifiers, as well as better accuracy, compared to conventional cascades and multi-exit boosted classifiers.
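The multi-exit structure described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the function names, data layout, and thresholds are assumptions. The key idea from the abstract is that the running score propagates across exit nodes rather than resetting at each stage, and each exit node may reject a sample early.

```python
def evaluate_multi_exit(x, weak_classifiers, exits):
    """Evaluate a boosted classifier with multiple exit nodes (sketch).

    weak_classifiers: list of (h, alpha) pairs, where h(x) returns -1 or +1
                      and alpha is the weak classifier's weight.
    exits: dict mapping a weak-classifier index to that exit node's
           rejection threshold (hypothetical representation).
    Returns True (accept) or False (rejected early at some exit).
    """
    score = 0.0
    for i, (h, alpha) in enumerate(weak_classifiers):
        # The score accumulates across the whole chain: it propagates
        # through exit nodes instead of restarting per cascade stage.
        score += alpha * h(x)
        if i in exits and score < exits[i]:
            return False  # early rejection at this exit node
    return True
```

In a detection setting, the early exits let the classifier discard the many easy negative windows after only a few weak classifiers, while harder samples continue through the full chain with their accumulated score intact.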
Minh-Tri Pham, V-D. D. Hoang, Tat-Jen Cham
Added: 12 Oct 2009
Updated: 28 Oct 2009
Type: Conference
Year: 2008
Where: CVPR