AdaCost: Misclassification Cost-Sensitive Boosting

AdaCost, a variant of AdaBoost, is a misclassification cost-sensitive boosting method. It uses the cost of misclassifications to update the training distribution on successive boosting rounds, so that examples with higher misclassification cost receive proportionally more weight when misclassified. The purpose is to reduce the cumulative misclassification cost more than AdaBoost does. We formally show that AdaCost reduces an upper bound on the cumulative misclassification cost of the training set. Empirical evaluations show a significant reduction in cumulative misclassification cost over AdaBoost without requiring additional computing power.
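The cost-sensitive weight update the abstract describes can be sketched as follows. This is a simplified illustration, not the paper's exact algorithm: the cost-adjustment function `beta` below (which boosts the weight of misclassified high-cost examples and damps correctly classified ones) is a hypothetical simple form, whereas the paper defines its own β function with proven bounds.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively fit the best weighted decision stump."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(X[:, f] <= t, pol, -pol)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best

def stump_predict(stump, X):
    f, t, pol = stump
    return np.where(X[:, f] <= t, pol, -pol)

def adacost_fit(X, y, costs, n_rounds=10):
    """Cost-sensitive boosting sketch (simplified AdaCost-style update).

    y in {-1, +1}; costs >= 0, one per example.
    """
    n = len(y)
    w = np.ones(n) / n
    ensemble = []
    for _ in range(n_rounds):
        stump = fit_stump(X, y, w)
        pred = stump_predict(stump, X)
        err = w[pred != y].sum()
        if err == 0:  # perfect stump: keep it with a capped vote
            ensemble.append((10.0, stump))
            break
        if err >= 0.5:  # no weak learner better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)
        # Hypothetical cost adjustment: misclassified high-cost examples
        # get a larger multiplicative weight increase than low-cost ones;
        # correctly classified examples are damped uniformly.
        beta = np.where(pred != y, 0.5 + 0.5 * costs, 0.5)
        w = w * np.exp(-alpha * y * pred * beta)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adacost_predict(ensemble, X):
    score = np.zeros(len(X))
    for alpha, stump in ensemble:
        score += alpha * stump_predict(stump, X)
    return np.where(score >= 0, 1, -1)
```

With uniform costs this behaves like ordinary boosting with damped updates; raising the cost of one class makes its misclassified examples dominate the training distribution faster on subsequent rounds.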
Added 17 Nov 2009
Updated 17 Nov 2009
Type Conference
Year 1999
Where ICML
Authors Wei Fan, Salvatore J. Stolfo, Junxin Zhang, Philip K. Chan