Sciweavers

1860 search results - page 2 / 372
» Boosting Methods for Regression
FLAIRS
2004
Random Subspacing for Regression Ensembles
In this work we present a novel approach to ensemble learning for regression models, combining the ensemble generation technique of the random subspace method with the ensemble int...
Niall Rooney, David W. Patterson, Sarab S. Anand, ...
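The random subspace idea described in this abstract can be sketched as follows; all data, feature counts, and the plain-average integration step below are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed for illustration): y depends only on the
# first two of ten features.
X = rng.normal(size=(200, 10))
y = X[:, 0] + 2.0 * X[:, 1] + 0.1 * rng.normal(size=200)

# Random subspace ensemble: each base regressor is trained on a
# random subset of the features; integration here is a plain
# average of member predictions.
n_members, k = 25, 4
members = []
for _ in range(n_members):
    feats = rng.choice(X.shape[1], size=k, replace=False)
    w, *_ = np.linalg.lstsq(X[:, feats], y, rcond=None)
    members.append((feats, w))

yhat = np.mean([X[:, feats] @ w for feats, w in members], axis=0)
mse = float(np.mean((y - yhat) ** 2))
```

Because each member sees only some of the informative features, the averaged ensemble underfits slightly, but its error stays well below the variance of the target.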
IDEAL
2004
Springer
Orthogonal Least Square with Boosting for Regression
A novel technique is presented to construct sparse regression models based on the orthogonal least square method with boosting. This technique tunes the mean vector and diagonal c...
Sheng Chen, Xunxian Wang, David J. Brown
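Sparse model construction by greedy orthogonal least squares can be sketched like this; the data and the two-term stopping rule are assumptions, and the paper's boosting-based tuning of the mean vectors is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy data: only features 1 and 6 carry signal.
X = rng.normal(size=(100, 8))
y = 2.0 * X[:, 1] - 1.0 * X[:, 6] + 0.05 * rng.normal(size=100)

# Greedy orthogonal least squares: pick the column most correlated
# with the current residual, then refit OLS on all selected columns.
selected = []
r = y.copy()
for _ in range(2):
    corr = np.abs(X.T @ r)
    corr[selected] = -np.inf  # never reselect a column
    j = int(np.argmax(corr))
    selected.append(j)
    w, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
    r = y - X[:, selected] @ w
```

With a strong signal-to-noise ratio, the two informative columns are recovered and the regression model stays sparse.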
NIPS
2001
On the Convergence of Leveraging
We give a unified convergence analysis of ensemble learning methods including, e.g., AdaBoost, Logistic Regression and the Least-SquareBoost algorithm for regression. These methods...
Gunnar Rätsch, Sebastian Mika, Manfred K. War...
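A minimal sketch of least-squares boosting for regression, in the componentwise (L2Boost) variant: each round fits the current residuals with the single best feature and takes a small step. The data, step size, and round count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
y = 3.0 * X[:, 2] - 1.5 * X[:, 4] + 0.1 * rng.normal(size=100)

def l2boost(X, y, rounds=200, nu=0.1):
    # Componentwise least-squares boosting: repeatedly refit the
    # residuals with one feature, shrunk by the step size nu.
    coefs = np.zeros(X.shape[1])
    r = y.copy()
    for _ in range(rounds):
        # Per-feature least-squares coefficient on the residuals.
        b = X.T @ r / (X ** 2).sum(axis=0)
        # Squared error each single-feature fit would leave behind.
        sse = ((r[:, None] - X * b) ** 2).sum(axis=0)
        j = int(np.argmin(sse))
        coefs[j] += nu * b[j]
        r -= nu * b[j] * X[:, j]
    return coefs

coefs = l2boost(X, y)
```

The small step size is what makes the method a "leveraging" scheme in the sense of the abstract: many weak, shrunken updates combine into an accurate additive model.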
ICML
2004
IEEE
Surrogate maximization/minimization algorithms for AdaBoost and the logistic regression model
Surrogate maximization (or minimization) (SM) algorithms are a family of algorithms that can be regarded as a generalization of expectation-maximization (EM) algorithms. There are...
Zhihua Zhang, James T. Kwok, Dit-Yan Yeung
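The surrogate-minimization idea for the logistic model can be sketched with Böhning's classical curvature bound (a sketch of the SM/MM principle only, not the algorithms derived in the paper; the data are assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -1.0, 0.5])
y = (rng.uniform(size=200) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def nll(w):
    # Negative log-likelihood of the logistic regression model.
    z = X @ w
    return float(np.sum(np.log1p(np.exp(z)) - y * z))

# MM update: the Hessian of the NLL is bounded by X'X/4, so each
# step minimizes a quadratic surrogate that upper-bounds the NLL.
# The objective therefore never increases between iterations.
H = X.T @ X / 4.0
w = np.zeros(3)
for _ in range(50):
    w = w + np.linalg.solve(H, X.T @ (y - sigmoid(X @ w)))
```

Because the surrogate touches the objective at the current iterate and lies above it everywhere else, monotone descent is guaranteed, mirroring the EM-style convergence behaviour the abstract alludes to.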
CSDA
2007
Boosting ridge regression
Ridge regression is a well-established method for shrinking regression parameters towards zero, thereby securing the existence of estimates. The present paper investigates several approac...
Gerhard Tutz, Harald Binder
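The shrinkage property the abstract refers to can be shown in a few lines; the data and penalty value are illustrative, and the paper's boosting variants are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0]) + 0.1 * rng.normal(size=50)

def ridge(X, y, lam):
    # Closed-form ridge solution: (X'X + lam*I)^{-1} X'y.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

w_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
w_ridge = ridge(X, y, 10.0)  # a positive penalty shrinks the coefficients
```

The penalty term guarantees that the system is solvable even when X'X is singular, which is the "securing existence of estimates" point made in the abstract.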