Asymmetric Real Adaboost

A cost-sensitive extension of Real Adaboost, denoted asymmetric Real Adaboost (RAB), is proposed. The two main differences between asymmetric RAB and naïve RAB are that (1) a Chernoff measure, rather than the Bhattacharyya measure used in naïve RAB, is used to select the best weak classifier during training, and (2) the weights of positives and negatives are updated separately at each boosting step. An upper bound on the training error is also provided. Experimental results demonstrate its cost-sensitivity when selecting weak classifiers and show that it outperforms previously proposed cost-sensitive extensions of Discrete Adaboost (DAB) as well as several extensions of Real Adaboost. It also consumes much less training time than the previously proposed DAB extensions.
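The two modifications can be sketched concretely. In standard Real Adaboost, a domain-partitioned weak classifier is scored by the Bhattacharyya-type bound Z = 2 Σ_j sqrt(W+_j · W-_j) over its bins, whereas a Chernoff-type score takes the minimum over an exponent α of Σ_j (W+_j)^α (W-_j)^(1-α), which is never larger than the α = 0.5 term. The sketch below illustrates this, plus a class-separated weight update with hypothetical per-class cost factors `c_pos` and `c_neg`; the paper's exact update rule is not given in this abstract, so this is only an assumed form, not the authors' method.

```python
import numpy as np

def bhattacharyya_score(Wp, Wn):
    # Naive RAB selection criterion: Z = 2 * sum_j sqrt(W+_j * W-_j),
    # where W+_j / W-_j are the positive / negative weight masses in bin j.
    return 2.0 * np.sum(np.sqrt(Wp * Wn))

def chernoff_score(Wp, Wn, alphas=np.linspace(0.05, 0.95, 19)):
    # Chernoff-type criterion: minimize sum_j (W+_j)^a * (W-_j)^(1-a)
    # over a grid of exponents a; a = 0.5 recovers half the score above,
    # so the asymmetric optimum can favor one class over the other.
    return min(np.sum(Wp**a * Wn**(1.0 - a)) for a in alphas)

def asymmetric_update(w, y, h, c_pos=1.5, c_neg=1.0):
    # Hypothetical asymmetric step: a class-dependent cost scales the
    # usual exponential update, and each class is renormalized on its
    # own so positive and negative weight masses are kept separate.
    cost = np.where(y > 0, c_pos, c_neg)
    w_new = w * np.exp(-cost * y * h)
    for cls in (1, -1):
        m = (y == cls)
        w_new[m] *= w[m].sum() / w_new[m].sum()
    return w_new
```

At α = 0.5 the Chernoff sum equals half the Bhattacharyya score, so taking the minimum over α can only tighten it; when the optimal α moves away from 0.5, the criterion weights errors on one class more heavily, which is the cost-sensitive effect the abstract refers to.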
Zhanjun Wang, Chi Fang, Xiaoqing Ding
Added 30 May 2010
Updated 30 May 2010
Type Conference
Year 2008
Where ICPR
Authors Zhanjun Wang, Chi Fang, Xiaoqing Ding