Sciweavers

54 search results - page 2 / 11
» A Comparison of the Bagging and the Boosting Methods Using t...
ICML
1999
IEEE
Lazy Bayesian Rules: A Lazy Semi-Naive Bayesian Learning Technique Competitive to Boosting Decision Trees
Lbr is a lazy semi-naive Bayesian classifier learning technique, designed to alleviate the attribute interdependence problem of naive Bayesian classification. To classify a test exa...
Zijian Zheng, Geoffrey I. Webb, Kai Ming Ting
ROCAI
2004
Springer
An Empirical Evaluation of Supervised Learning for ROC Area
We present an empirical comparison of the AUC performance of seven supervised learning methods: SVMs, neural nets, decision trees, k-nearest neighbor, bagged trees, boosted trees,...
Rich Caruana, Alexandru Niculescu-Mizil
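The AUC measure this entry compares can be computed directly from the rank statistics of the scores, as the probability that a random positive example is scored above a random negative one (ties counted as one half). A minimal sketch — the function name and toy data below are illustrative, not from the paper:

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney)
    formulation: fraction of positive/negative pairs where the
    positive is scored higher, with ties counted as 0.5."""
    pos = [s for label, s in zip(labels, scores) if label == 1]
    neg = [s for label, s in zip(labels, scores) if label == 0]
    # count pairwise "wins" of positives over negatives
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))  # 0.75
```

This pairwise formulation is O(|pos| x |neg|); for large datasets one would sort the scores once instead, but the value computed is the same.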
IJCAI
2003
Constructing Diverse Classifier Ensembles using Artificial Training Examples
Ensemble methods like bagging and boosting that combine the decisions of multiple hypotheses are some of the strongest existing machine learning methods. The diversity of the memb...
Prem Melville, Raymond J. Mooney
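Bagging, one of the two ensemble methods this abstract names, combines models trained on bootstrap resamples by majority vote. A minimal sketch, assuming a deliberately weak base learner (1-nearest-neighbour on 1-D data); the function name and data are illustrative, not from any cited paper:

```python
import random
from collections import Counter

def bagging_predict(X, y, x_test, n_models=25, seed=0):
    """Bootstrap aggregating: train each base model on a resample
    of the data drawn with replacement, then majority-vote their
    predictions for x_test."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_models):
        # bootstrap: sample len(X) indices with replacement
        sample = [rng.randrange(len(X)) for _ in range(len(X))]
        # weak base learner: label of the nearest resampled point
        nearest = min(sample, key=lambda i: abs(X[i] - x_test))
        votes.append(y[nearest])
    return Counter(votes).most_common(1)[0][0]

X = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
y = [0, 0, 0, 1, 1, 1]
print(bagging_predict(X, y, 0.85))  # 1
```

Because each resample omits roughly a third of the training points, the base models disagree, which is exactly the diversity the abstract identifies as the source of the ensemble's strength.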
PRL
2008
Boosting recombined weak classifiers
Boosting is a set of methods for the construction of classifier ensembles. The distinguishing feature of these methods is that they make it possible to obtain a strong classifier from the comb...
Juan José Rodríguez, Jesús Ma...
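The weak-to-strong combination this abstract describes is the core of AdaBoost-style boosting: after each round, examples the current ensemble misclassifies are upweighted so the next weak classifier focuses on them. A minimal sketch with 1-D threshold stumps as the weak learners — all names and data are illustrative, not from the paper:

```python
import math

def adaboost_stumps(X, y, rounds=10):
    """Minimal AdaBoost on 1-D data with labels in {-1, +1}.
    Weak learner: a stump predicting sign(polarity * (x - t))."""
    n = len(X)
    w = [1.0 / n] * n                 # example weights
    ensemble = []                     # (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        # exhaustive search for the weighted-error-minimising stump
        for t in sorted(set(X)):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (1 if pol * (xi - t) >= 0 else -1) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # guard the log
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # upweight mistakes, downweight correct answers, renormalise
        preds = [1 if pol * (xi - t) >= 0 else -1 for xi in X]
        w = [wi * math.exp(-alpha * yi * pi)
             for wi, yi, pi in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (1 if pol * (x - t) >= 0 else -1)
                for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

# a labelling no single stump can fit, but three boosted stumps can
X = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
y = [-1, -1, 1, -1, 1, 1]
model = adaboost_stumps(X, y, rounds=3)
print([predict(model, x) for x in X])  # [-1, -1, 1, -1, 1, 1]
```

The final classifier is a weighted vote of the stumps, with each weight alpha determined by that stump's weighted training error — the "strong classifier from the combination of weak ones" the abstract refers to.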
CIDM
2009
IEEE
An empirical study of bagging and boosting ensembles for identifying faulty classes in object-oriented software
Identifying faulty classes in object-oriented software is an important software quality assurance activity. This paper empirically investigates the application of t...
Hamoud I. Aljamaan, Mahmoud O. Elish