
Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression

The application of boosting techniques to regression problems has received relatively little attention compared with the research aimed at classification problems. This paper describes a new boosting algorithm for regression, AdaBoost.RT. Its idea is to filter out the examples whose relative estimation error is higher than a preset threshold value, and then to follow the AdaBoost procedure. It therefore requires selecting a sub-optimal value of the error threshold to demarcate examples as poorly or well predicted. Experimental results using the M5 model tree as a weak learning machine on several benchmark data sets are reported. The results are compared to other boosting methods, bagging, artificial neural networks, and a single M5 model tree. The preliminary empirical comparisons show higher performance of AdaBoost.RT for most of the data sets considered. NC ms 3101. Final submission: November 9, 2005 (Neural Computation).
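The boosting loop the abstract describes can be sketched roughly as follows. This is a minimal illustrative implementation, not the authors' code: a weighted regression stump stands in for the M5 model tree used in the paper, and the threshold `phi`, the power `n` in `beta = eps ** n`, and the numerical guards are assumed choices (the paper notes the threshold itself must be selected empirically).

```python
import numpy as np

def stump_fit(X, y, w):
    """Fit a weighted one-split regression stump on a 1-D feature array."""
    best = None
    for s in np.unique(X)[:-1]:              # candidate split points
        left = X <= s
        yl = np.average(y[left], weights=w[left])    # weighted mean, left leaf
        yr = np.average(y[~left], weights=w[~left])  # weighted mean, right leaf
        err = np.sum(w * (np.where(left, yl, yr) - y) ** 2)
        if best is None or err < best[0]:
            best = (err, s, yl, yr)
    _, s, yl, yr = best
    return lambda x: np.where(x <= s, yl, yr)

def adaboost_rt(X, y, T=10, phi=0.05, n=2):
    """AdaBoost.RT sketch: examples with absolute relative error above the
    threshold phi are treated as 'misclassified' and have their weights kept
    up, while well-predicted examples are demoted, as in AdaBoost."""
    m = len(X)
    D = np.full(m, 1.0 / m)                  # uniform initial example weights
    models, log_inv_betas = [], []
    for _ in range(T):
        f = stump_fit(X, y, D)
        are = np.abs((f(X) - y) / y)         # absolute relative error
        eps = D[are > phi].sum()             # weighted error rate
        eps = min(max(eps, 1e-10), 1 - 1e-10)  # guard degenerate 0/1 cases
        beta = eps ** n
        D = D * np.where(are <= phi, beta, 1.0)  # shrink well-predicted weights
        D /= D.sum()
        models.append(f)
        log_inv_betas.append(np.log(1.0 / beta))
    w = np.array(log_inv_betas)

    def predict(x):
        preds = np.stack([f(x) for f in models])
        return (w[:, None] * preds).sum(axis=0) / w.sum()  # weighted vote
    return predict
```

The final prediction is a log(1/beta)-weighted average of the weak learners' outputs, so more accurate rounds contribute more. Note the relative-error criterion assumes targets are bounded away from zero.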
Durga L. Shrestha, Dimitri P. Solomatine
Added 14 Dec 2010
Updated 14 Dec 2010
Type Journal
Year 2006
Where NECO