Sciweavers

30 search results - page 2 / 6
Query: Boosting Technique for Combining Cellular GP Classifiers
PAKDD 2000 · ACM
Adaptive Boosting for Spatial Functions with Unstable Driving Attributes
Combining multiple global models (e.g. back-propagation-based neural networks) is an effective technique for improving classification accuracy by reducing variance through manipu...
Aleksandar Lazarevic, Tim Fiez, Zoran Obradovic
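The visible part of this abstract points at variance reduction by combining several independently trained networks. As a rough illustration of that general idea only, not the paper's spatial boosting method, the sketch below averages the class probabilities of a few MLPs trained on bootstrap resamples; the dataset, model sizes, and soft-vote combination are assumptions made here for illustration.

```python
# Minimal sketch: averaging several independently trained neural networks
# to reduce prediction variance (illustrative only, not the paper's method).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Train several networks, each on a different bootstrap resample and seed.
models = []
rng = np.random.default_rng(0)
for seed in range(5):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))   # bootstrap resample
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=seed)
    net.fit(X_tr[idx], y_tr[idx])
    models.append(net)

# Soft vote: average the class-probability estimates of all networks.
avg_proba = np.mean([m.predict_proba(X_te) for m in models], axis=0)
y_pred = avg_proba.argmax(axis=1)
print("ensemble accuracy:", (y_pred == y_te).mean())
```

Averaging the probability estimates of independently trained, high-variance learners is the standard way such a combination lowers variance.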
IJCAI 2003
Constructing Diverse Classifier Ensembles using Artificial Training Examples
Ensemble methods like bagging and boosting that combine the decisions of multiple hypotheses are some of the strongest existing machine learning methods. The diversity of the memb...
Prem Melville, Raymond J. Mooney
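The visible snippet names bagging and boosting as the reference ensemble methods. The sketch below only shows those two scikit-learn baselines for contrast; it does not implement the artificial-training-example construction the paper proposes.

```python
# Minimal sketch of the two baseline ensemble methods the snippet mentions,
# bagging and boosting, using scikit-learn defaults (decision-tree bases).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

bagging = BaggingClassifier(n_estimators=50, random_state=0)    # default base: decision tree
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)  # default base: decision stump

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(name, "mean CV accuracy:", scores.mean())
```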
KDD 2001 · ACM
The distributed boosting algorithm
In this paper, we propose a general framework for distributed boosting intended for efficiently integrating specialized classifiers learned over very large and distributed homogeneo...
Aleksandar Lazarevic, Zoran Obradovic
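As a hedged sketch of boosting over horizontally partitioned data, the code below lets each simulated "site" fit a weak learner on its local, re-weighted slice each round, merges the site learners by vote into that round's hypothesis, and then applies the usual AdaBoost weight update. The partitioning, the vote-based merge, and the centralized weight update are simplifying assumptions, not the authors' published framework.

```python
# Rough sketch: boosting with weak learners trained per "site" each round.
# Illustrative only; not the published distributed-boosting algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1200, n_features=15, random_state=1)
y = np.where(y == 0, -1, 1)                      # labels in {-1, +1}
sites = np.array_split(np.arange(len(X)), 3)     # three disjoint "sites"

w = np.full(len(X), 1.0 / len(X))                # global sample weights
hypotheses, alphas = [], []

for t in range(20):
    # Each site fits a weak learner on its local slice with local weights.
    local_models = []
    for idx in sites:
        stump = DecisionTreeClassifier(max_depth=1, random_state=t)
        stump.fit(X[idx], y[idx], sample_weight=w[idx])
        local_models.append(stump)

    # Round hypothesis: majority vote over the site learners.
    def h(Xq, models=local_models):
        return np.sign(sum(m.predict(Xq) for m in models) + 1e-12)

    pred = h(X)
    err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)        # AdaBoost hypothesis weight
    w *= np.exp(-alpha * y * pred)               # re-weight samples
    w /= w.sum()

    hypotheses.append(h)
    alphas.append(alpha)

# Final strong classifier: weighted vote over the per-round hypotheses.
F = sum(a * h(X) for a, h in zip(alphas, hypotheses))
print("training accuracy:", np.mean(np.sign(F) == y))
```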
FLAIRS 2006
Using Validation Sets to Avoid Overfitting in AdaBoost
AdaBoost is a well known, effective technique for increasing the accuracy of learning algorithms. However, it has the potential to overfit the training set because its objective i...
Tom Bylander, Lisa Tate
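One standard way to act on the title's idea is to monitor a held-out validation set and keep only the number of boosting rounds that maximizes validation accuracy. The sketch below does this with scikit-learn's AdaBoostClassifier and staged_predict; it illustrates validation-based round selection in general, not necessarily the authors' exact procedure.

```python
# Minimal sketch: choose the number of AdaBoost rounds on a held-out
# validation set instead of boosting for the full budget.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.1, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

ada = AdaBoostClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# staged_predict yields predictions after 1, 2, ..., n_estimators rounds.
val_acc = [np.mean(pred == y_val) for pred in ada.staged_predict(X_val)]
best_rounds = int(np.argmax(val_acc)) + 1
print("best number of rounds:", best_rounds,
      "validation accuracy:", val_acc[best_rounds - 1])
```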
ICANN 2007 · Springer
Boosting Unsupervised Competitive Learning Ensembles
Topology preserving mappings are great tools for data visualization and inspection in large datasets. This research presents a combination of several topology preserving mapping mo...
Emilio Corchado, Bruno Baruque, Hujun Yin
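As a loose illustration of combining several competitive-learning models trained on resampled data, the sketch below builds a small ensemble of winner-take-all vector quantizers and averages their quantization error. The learner, the bootstrap resampling, and the error measure are assumptions for illustration; it omits the topology-preserving (neighborhood) component that is the paper's actual subject.

```python
# Tiny illustration: an ensemble of online competitive-learning quantizers,
# each trained on a bootstrap resample, with quantization error averaged
# across members. Illustrative only; not the paper's ensemble method.
import numpy as np

def train_quantizer(X, n_units=10, epochs=20, lr=0.1, seed=0):
    """Plain winner-take-all competitive learning (no neighborhood function)."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].copy()   # init from data
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            winner = np.argmin(np.linalg.norm(W - x, axis=1))  # best-matching unit
            W[winner] += lr * (x - W[winner])                  # move winner toward x
    return W

def quantization_error(W, X):
    d = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
    return d.min(axis=1).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))

# Ensemble: each member sees a different bootstrap resample of the data.
members = [train_quantizer(X[rng.integers(0, len(X), len(X))], seed=s) for s in range(5)]
print("mean quantization error:", np.mean([quantization_error(W, X) for W in members]))
```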