Wrapping Boosters against Noise

Abstract. Wrappers have recently been used to optimize the parameters of learning algorithms. In this paper we investigate the use of a wrapper for estimating the correct number of boosting iterations in the presence of class noise. Contrary to the naive approach, which would be quadratic in the number of boosting iterations, the incremental algorithm described here is linear. Additionally, directly using for prediction the k ensembles generated during the k-fold cross-validation search usually improves classification performance further. This improvement can be attributed to the variance reduction gained by averaging k ensembles instead of using a single one. Consequently, cross-validation as used here, termed wrapping, can be viewed as yet another ensemble learner, similar in spirit to bagging but also somewhat related to stacking.
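The idea sketched in the abstract can be illustrated with a small, self-contained example. The sketch below is an assumption-laden reconstruction, not the authors' implementation: it uses a toy weighted decision-stump learner with AdaBoost, grows each fold's ensemble once and scores held-out points incrementally (so the cross-validation search is linear, not quadratic, in the number of iterations), and then averages the k fold ensembles at prediction time. All function names are illustrative.

```python
import math

# Hedged sketch of "wrapping": estimate the best number of boosting
# iterations via incremental k-fold cross-validation, then predict by
# averaging the k fold ensembles (the variance-reduction step noted in
# the abstract). Toy stump learner; labels are in {-1, +1}.

def stump_train(X, y, w):
    # Exhaustive weighted decision stump: best (feature, threshold, polarity).
    best, best_err = None, float("inf")
    for j in range(len(X[0])):
        for t in sorted(set(x[j] for x in X)):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[j] <= t else -pol) != yi)
                if err < best_err:
                    best_err, best = err, (j, t, pol)
    return best, best_err

def stump_predict(stump, x):
    j, t, pol = stump
    return pol if x[j] <= t else -pol

def boost(X, y, n_iter):
    # Standard AdaBoost loop; returns a list of (alpha, stump) members.
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(n_iter):
        stump, err = stump_train(X, y, w)
        if err >= 0.5:
            break
        err = max(err, 1e-10)          # guard log() against zero error
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        # Reweight: misclassified points gain weight, then renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, xi))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def wrap_boosting(X, y, k=3, max_iter=10):
    # Incremental CV wrapper: each fold's ensemble is built once, and
    # held-out margins are accumulated member by member, so the total
    # cost is linear (not quadratic) in max_iter.
    n = len(X)
    folds = [list(range(i, n, k)) for i in range(k)]
    fold_ensembles, cv_err = [], [0] * max_iter
    for f in folds:
        held = set(f)
        ens = boost([X[i] for i in range(n) if i not in held],
                    [y[i] for i in range(n) if i not in held], max_iter)
        fold_ensembles.append(ens)
        scores = {i: 0.0 for i in f}   # running margin per held-out point
        for t in range(max_iter):
            if t < len(ens):
                a, s = ens[t]
                for i in f:
                    scores[i] += a * stump_predict(s, X[i])
            cv_err[t] += sum(1 for i in f
                             if (1 if scores[i] >= 0 else -1) != y[i])
    best_t = 1 + min(range(max_iter), key=lambda t: cv_err[t])
    return fold_ensembles, best_t

def wrapped_predict(fold_ensembles, best_t, x):
    # Average the k fold ensembles, each truncated at best_t members.
    total = sum(a * stump_predict(s, x)
                for ens in fold_ensembles for a, s in ens[:best_t])
    return 1 if total >= 0 else -1
```

On a simple separable toy set, `wrap_boosting` returns the k fold ensembles plus the cross-validated iteration count, and `wrapped_predict` combines them directly, mirroring the abstract's point that the CV ensembles need not be discarded after the search.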
Bernhard Pfahringer, Geoffrey Holmes, Gabi Schmidberger
Added 28 Jul 2010
Updated 28 Jul 2010
Type Conference
Year 2001
Authors Bernhard Pfahringer, Geoffrey Holmes, Gabi Schmidberger