CVPR
2005
IEEE

Jensen-Shannon Boosting Learning for Object Recognition

In this paper, we propose a novel learning method, called Jensen-Shannon Boosting (JSBoost), and demonstrate its application to object recognition. JSBoost incorporates Jensen-Shannon (JS) divergence [2] into AdaBoost learning. JS divergence is advantageous in that it provides a more appropriate measure of dissimilarity between two classes and is numerically more stable than other measures such as Kullback-Leibler (KL) divergence (see [2]). The best features are iteratively learned by maximizing the projected JS divergence, and from them the best weak classifiers are derived. The weak classifiers are combined into a strong one by minimizing the recognition error. JSBoost learning is demonstrated on face recognition using a local binary pattern (LBP) [13] based representation. JSBoost selects the best LBP features from thousands of candidate features and constructs a strong classifier from the selected features. JSBoost empirically produces better face recognition results ...
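The abstract's claim about numerical stability follows from the definition JS(P, Q) = ½ KL(P ‖ M) + ½ KL(Q ‖ M) with M = (P + Q)/2: the mixture M is nonzero wherever either distribution is, so the divergence stays finite even for non-overlapping supports, and it is symmetric and bounded by log 2. A minimal sketch of this property (the function names `kl` and `js` are illustrative, not from the paper):

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence; diverges when q_i == 0 but p_i > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen-Shannon divergence: symmetric, bounded by log 2, and
    # finite even when the supports of p and q differ, because the
    # mixture m is positive wherever p or q is positive
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two class-conditional distributions with disjoint mass at the ends:
# kl(p, q) would be infinite here, but js(p, q) is finite and symmetric
p = [0.9, 0.1, 0.0]
q = [0.0, 0.1, 0.9]
print(js(p, q))            # finite value in [0, log 2]
print(js(p, q) == js(q, p))  # symmetric
```

In the paper's setting, the two distributions would be histograms of a candidate feature's responses on the two classes, and the feature maximizing this divergence is selected at each boosting round.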
Xiangsheng Huang, Stan Z. Li, Yangsheng Wang
Added 24 Jun 2010
Updated 24 Jun 2010
Type Conference
Year 2005
Where CVPR
Authors Xiangsheng Huang, Stan Z. Li, Yangsheng Wang