CVPR 2003, IEEE

Kullback-Leibler Boosting

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler divergence in a boosting manner. Third, the coefficients that combine the histogram divergences are learnt by minimizing the recognition error each time a new feature is added to the classifier. This contrasts with conventional AdaBoost, where the coefficients are set empirically. Because of these properties, the KLBoosting classifier generalizes very well. Moreover, to apply KLBoosting to high-dimensional image space, we propose a data-driven Kullback-Leibler Analysis (KLA) approach to find KL features for image objects (e.g., face patches). Promising experimental results on face detection demonstrate the effect...
Ce Liu, Heung-Yeung Shum