Sciweavers

86 search results for "Bagging, Boosting, and C4.5" (page 3 of 18)
PAA 2002
Bagging, Boosting and the Random Subspace Method for Linear Classifiers
Recently, bagging, boosting, and the random subspace method have become popular combining techniques for improving weak classifiers. These techniques are designed for, and usually ...
Marina Skurichina, Robert P. W. Duin
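
A minimal sketch (not the paper's experiments) of the three combining techniques named in the abstract, each wrapped around a linear base classifier in scikit-learn; the dataset, parameter values, and the LogisticRegression base learner are illustrative assumptions.

```python
# Bagging, boosting, and the random subspace method around a linear classifier.
# Illustrative only: data and settings are not taken from the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=40, n_informative=10,
                           random_state=0)

ensembles = {
    # Bagging: each member is trained on a bootstrap sample of the training set.
    "bagging": BaggingClassifier(LogisticRegression(max_iter=1000),
                                 n_estimators=25, random_state=0),
    # Random subspace method: each member sees all samples but only a random
    # subset of the features.
    "random subspace": BaggingClassifier(LogisticRegression(max_iter=1000),
                                         n_estimators=25, bootstrap=False,
                                         max_features=0.5, random_state=0),
    # Boosting: members are fitted sequentially on re-weighted samples.
    "boosting": AdaBoostClassifier(LogisticRegression(max_iter=1000),
                                   n_estimators=25, random_state=0),
}

for name, clf in ensembles.items():
    print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))
```
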
ANLP 2000
Bagging and Boosting a Treebank Parser
Bagging and boosting, two effective machine learning techniques, are applied to natural language parsing. Experiments using these techniques with a trainable statistical parser ar...
John C. Henderson, Eric Brill
KDD 2005 (ACM)
Robust boosting and its relation to bagging
Several authors have suggested viewing boosting as a gradient descent search for a good fit in function space. At each iteration observations are re-weighted using the gradient of...
Saharon Rosset
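
A toy sketch of the gradient-descent view of boosting described in the abstract, not Rosset's robust variant: at each iteration the example weights are taken from the gradient of an exponential loss at the current ensemble, and a decision stump is fitted to the re-weighted data. The data, loss, and step size are assumptions for illustration.

```python
# Boosting as gradient descent in function space (exponential-loss sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1                      # labels in {-1, +1}

F = np.zeros(len(y))               # current ensemble score F(x_i)
learners, step = [], 0.1           # small step size ("epsilon boosting")

for t in range(100):
    w = np.exp(-y * F)             # |dL/dF_i| for the loss L = sum_i exp(-y_i F(x_i))
    w /= w.sum()
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    h = stump.predict(X)           # weak learner's {-1, +1} predictions
    F += step * h                  # gradient-descent step in function space
    learners.append(stump)

print("training error:", np.mean(np.sign(F) != y))
```
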
TSMC 2008
Bagging and Boosting Negatively Correlated Neural Networks
In this paper, we propose two cooperative ensemble learning algorithms, i.e., NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incremental...
Md. Monirul Islam, Xin Yao, S. M. Shahriar Nirjon,...
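
A toy illustration, not the paper's NegBagg or NegBoost, of the negative-correlation idea they build on: a small ensemble of linear models is trained so that each member's error signal includes a penalty discouraging it from agreeing too closely with the ensemble mean. The data, penalty weight, and linear members are assumptions for illustration.

```python
# Negative correlation learning sketch with linear ensemble members.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=200)

M, lam, lr = 5, 0.5, 0.05                  # ensemble size, penalty weight, step size
W = rng.normal(scale=0.1, size=(M, 5))     # one weight vector per ensemble member

for epoch in range(500):
    F = X @ W.T                            # (n, M) member outputs
    F_bar = F.mean(axis=1, keepdims=True)  # ensemble (average) output
    # Per-member error signal: the usual residual minus the negative-correlation
    # penalty term, which pushes each member away from the ensemble mean.
    delta = (F - y[:, None]) - lam * (F - F_bar)
    W -= lr * (delta.T @ X) / len(y)

print("ensemble MSE:", np.mean(((X @ W.T).mean(axis=1) - y) ** 2))
```
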
IJCAI 2003
Monte Carlo Theory as an Explanation of Bagging and Boosting
In this paper, we propose the framework of Monte Carlo algorithms as a useful one for analyzing ensemble learning. In particular, this framework allows one to guess when bagging will ...
Roberto Esposito, Lorenza Saitta
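
A tiny simulation, not the paper's analysis, of the Monte Carlo intuition behind this framework: if each independently drawn hypothesis is correct on a given instance with probability p, majority voting amplifies p > 0.5 toward 1 and p < 0.5 toward 0, which gives one way to guess when bagging will help or hurt. The function and its parameters are illustrative assumptions.

```python
# Monte Carlo view of voting: amplification of per-instance accuracy.
import numpy as np

rng = np.random.default_rng(0)

def vote_accuracy(p, T, trials=10_000):
    """P(majority of T independent draws is correct) when each is correct w.p. p."""
    correct = rng.random((trials, T)) < p
    return np.mean(correct.sum(axis=1) > T / 2)

for p in (0.6, 0.4):
    print(p, [round(vote_accuracy(p, T), 3) for T in (1, 11, 101)])
```
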