Sciweavers

Search: A Boosting Algorithm for Regression
ISCI 2008
Greedy regression ensemble selection: Theory and an application to water quality prediction
This paper studies the greedy ensemble selection family of algorithms for ensembles of regression models. These algorithms search for the globally best subset of regression models by making local greedy decisions for changing the current subset...
Ioannis Partalas, Grigorios Tsoumakas, Evaggelos V...
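The greedy selection idea described in this abstract can be sketched in a few lines: repeatedly add whichever model most reduces the validation error of the averaged ensemble, stopping when no addition helps. This is an illustrative sketch only; function names and the averaging rule are assumptions, not the paper's code.

```python
# Greedy forward ensemble selection for regression (illustrative sketch).
# Each candidate model is represented by its prediction vector on a
# held-out validation set; the ensemble prediction is the plain average.

def mse(pred, target):
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(target)

def average(preds):
    n = len(preds[0])
    return [sum(p[i] for p in preds) / len(preds) for i in range(n)]

def greedy_select(model_preds, target, max_size=None):
    """model_preds: list of per-model prediction vectors on a validation set."""
    max_size = max_size or len(model_preds)
    selected, remaining = [], list(range(len(model_preds)))
    best_err = float("inf")
    while remaining and len(selected) < max_size:
        # Local greedy decision: try each remaining model and keep the best.
        cand_err, cand = min(
            (mse(average([model_preds[j] for j in selected + [i]]), target), i)
            for i in remaining
        )
        if cand_err >= best_err:
            break  # adding any further model no longer improves the ensemble
        selected.append(cand)
        remaining.remove(cand)
        best_err = cand_err
    return selected, best_err
```

Note the stopping rule: because each step is a local decision, the result is not guaranteed to be the globally best subset, which is exactly the gap the paper's theory addresses.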
ICML 2006 (IEEE)
Efficient co-regularised least squares regression
In many applications, unlabelled examples are inexpensive and easy to obtain. Semi-supervised approaches try to utilise such examples to reduce the predictive error. In this paper,...
Stefan Wrobel, Thomas Gärtner, Tobias Scheffe...
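The co-regularisation idea can be illustrated in the simplest possible setting: two one-dimensional "views" of the data, each fit by a ridge-style least-squares weight, plus a penalty of strength mu that pushes the two predictors to agree on unlabelled examples. The 1-D restriction, the variable names, and the closed-form 2x2 solve below are assumptions for illustration, not the paper's formulation.

```python
# Minimal co-regularised least squares sketch (two scalar-feature views).
# Objective (per view v): sum over labelled data of (x_v * w_v - y)^2
#   + lam * w_v^2 + mu * sum over unlabelled data of (x_1*w_1 - x_2*w_2)^2.
# Setting the gradients to zero gives a 2x2 linear system, solved here
# by Cramer's rule.

def co_rls_1d(x1l, x2l, y, x1u, x2u, lam=1.0, mu=1.0):
    a11 = sum(v * v for v in x1l) + lam + mu * sum(v * v for v in x1u)
    a22 = sum(v * v for v in x2l) + lam + mu * sum(v * v for v in x2u)
    a12 = -mu * sum(u * v for u, v in zip(x1u, x2u))  # coupling term
    b1 = sum(u * v for u, v in zip(x1l, y))
    b2 = sum(u * v for u, v in zip(x2l, y))
    det = a11 * a22 - a12 * a12
    w1 = (b1 * a22 - a12 * b2) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2
```

With mu = 0 the system decouples and each view reduces to ordinary ridge regression, which is a useful sanity check on the coupling term.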
ISMIS 2000 (Springer)
Prediction of Ordinal Classes Using Regression Trees
This paper is devoted to the problem of learning to predict ordinal (i.e., ordered discrete) classes using classification and regression trees. We start with S-CART, a tree inducti...
Stefan Kramer, Gerhard Widmer, Bernhard Pfahringer...
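The basic trick of predicting ordinal classes with a regression learner can be shown in miniature: encode the ordered classes as integers, regress on the codes, then round the numeric prediction back to the nearest valid class. The helpers below are a hedged sketch of that mapping only; S-CART itself is not reproduced here.

```python
# Ordinal classes via regression: encode, regress (any learner), decode.

def to_codes(labels, ordered_classes):
    """Map ordered class labels to their integer positions."""
    index = {c: i for i, c in enumerate(ordered_classes)}
    return [index[l] for l in labels]

def round_to_class(value, ordered_classes):
    """Round a numeric regression output to the nearest valid class,
    clipping to the first/last class at the extremes."""
    code = min(max(round(value), 0), len(ordered_classes) - 1)
    return ordered_classes[code]
```

The clipping matters: a regression tree can extrapolate outside the code range, and an ordinal predictor must still return a legal class.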
IJCAI 2003
Monte Carlo Theory as an Explanation of Bagging and Boosting
In this paper we propose the framework of Monte Carlo algorithms as a useful one to analyze ensemble learning. In particular, this framework allows one to guess when bagging will ...
Roberto Esposito, Lorenza Saitta
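The Monte Carlo reading of bagging is easy to make concrete: each round draws a random bootstrap sample (sampling with replacement), fits a base learner to it, and the final prediction averages over the random draws. The sketch below assumes generic `fit`/`predict` callables standing in for any base regression learner; it illustrates plain bagging, not the paper's analysis.

```python
import random

def bagged_predict(train, x, fit, predict, n_rounds=25, seed=0):
    """Bagging as a Monte Carlo procedure: average the predictions of
    base learners fit on independent bootstrap resamples of `train`."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_rounds):
        # Bootstrap sample: len(train) draws with replacement.
        sample = [train[rng.randrange(len(train))] for _ in train]
        model = fit(sample)
        preds.append(predict(model, x))
    return sum(preds) / n_rounds
```

Under this view, bagging approximates the expectation of the base learner's prediction over the resampling distribution, which is what makes the Monte Carlo framework a natural lens for asking when it helps.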
SODA 2001 (ACM)
Computing the depth of a flat
We compute the regression depth of a k-flat in a set of n points in R^d, in time O(n^(d-2) + n log n) for 1 ≤ k ≤ d-2. This contrasts with a bound of O(n^(d-1) + n log n) when k = 0 or ...
Marshall W. Bern
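The general k-flat computation is involved, but the flavour of "depth" can be shown in the simplest special case: for k = 0 and d = 1, the depth of a point t in a one-dimensional sample is the smaller of the counts of sample points on either side of t, so a deep point is hard to separate from the data. This toy illustration is an assumption for exposition, not the paper's algorithm.

```python
# One-dimensional location depth: min(#points <= t, #points >= t).

def depth_1d(t, points):
    left = sum(1 for x in points if x <= t)
    right = sum(1 for x in points if x >= t)
    return min(left, right)
```

The median attains the maximum depth, which is why depth-based estimators generalise the robustness of the median to regression settings.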