Sciweavers

89 search results for "Using Validation Sets to Avoid Overfitting in AdaBoost"
FLAIRS 2006
Using Validation Sets to Avoid Overfitting in AdaBoost
AdaBoost is a well-known, effective technique for increasing the accuracy of learning algorithms. However, it has the potential to overfit the training set because its objective i...
Tom Bylander, Lisa Tate
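The idea the title describes, choosing the number of boosting rounds on a held-out validation set instead of letting the ensemble grow until it memorizes the training data, can be sketched as below. This is a minimal illustration using scikit-learn's AdaBoostClassifier and staged_predict, not Bylander and Tate's exact procedure; the synthetic dataset and parameter choices are assumptions for demonstration only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Noisy synthetic data (flip_y injects label noise that encourages overfitting).
X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.1, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit a deliberately long run of boosting rounds.
booster = AdaBoostClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# Score every prefix of the ensemble on the validation set and keep the best one.
val_acc = [np.mean(pred == y_val) for pred in booster.staged_predict(X_val)]
best_rounds = int(np.argmax(val_acc)) + 1

print(f"best number of rounds: {best_rounds}, "
      f"validation accuracy: {val_acc[best_rounds - 1]:.3f}")
```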
ECML 2007 (Springer)
Avoiding Boosting Overfitting by Removing Confusing Samples
Boosting methods are known to exhibit noticeable overfitting on some datasets, while being immune to overfitting on others. In this paper we show that standard boosting algorit...
Alexander Vezhnevets, Olga Barinova
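One way to make the "removing confusing samples" idea concrete is to flag examples that are misclassified even by models that never saw them, then retrain the booster on the remaining data. The sketch below uses out-of-fold predictions as a simplified proxy for the paper's confusion criterion; it is not the authors' algorithm, and the dataset and filtering rule are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_predict

# Synthetic data with deliberate label noise (flip_y) to create "confusing" samples.
X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.15, random_state=1)

# Out-of-fold predictions: each sample is predicted by a model that never trained on it.
oof_pred = cross_val_predict(AdaBoostClassifier(n_estimators=100, random_state=1),
                             X, y, cv=5)
keep = oof_pred == y  # samples the out-of-fold models get wrong are treated as noise

print(f"removed {np.sum(~keep)} of {len(y)} samples as confusing")

# Retrain the booster on the cleaned training set.
cleaned = AdaBoostClassifier(n_estimators=100, random_state=1).fit(X[keep], y[keep])
```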
PRL 2008
Matrix-pattern-oriented least squares support vector classifier with AdaBoost
Matrix-pattern-oriented Least Squares Support Vector Classifier (MatLSSVC) can directly classify matrix patterns and achieves better classification performance than its vector ver...
Zhe Wang, Songcan Chen
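To show how a matrix-pattern base learner can be combined with AdaBoost, the sketch below boosts a weighted bilinear least-squares classifier f(X) = u'Xv + b that operates directly on 2-D patterns. The bilinear learner is a hypothetical stand-in chosen for brevity, not the paper's MatLSSVC, and the regularization and iteration counts are arbitrary assumptions.

```python
import numpy as np

def _weighted_ridge(F, y, w, lam):
    """Weighted ridge regression with a bias column; returns (coef, bias)."""
    Fa = np.hstack([F, np.ones((len(F), 1))])
    A = Fa.T @ (w[:, None] * Fa) + lam * np.eye(Fa.shape[1])
    coef = np.linalg.solve(A, Fa.T @ (w * y))
    return coef[:-1], coef[-1]

def fit_bilinear(Xs, y, w, lam=1e-2, iters=5):
    """Alternating weighted least squares for f(X) = u @ X @ v + b."""
    m, n = Xs[0].shape
    u, v, b = np.ones(m) / m, np.ones(n) / n, 0.0
    for _ in range(iters):
        u, b = _weighted_ridge(np.stack([X @ v for X in Xs]), y, w, lam)    # fix v
        v, b = _weighted_ridge(np.stack([X.T @ u for X in Xs]), y, w, lam)  # fix u
    return u, v, b

def predict_bilinear(params, Xs):
    u, v, b = params
    return np.sign([u @ X @ v + b for X in Xs])

def adaboost_matrix(Xs, y, rounds=20):
    """Discrete AdaBoost over matrix patterns Xs (list of 2-D arrays), y in {-1, +1}."""
    w = np.full(len(Xs), 1.0 / len(Xs))
    ensemble = []
    for _ in range(rounds):
        params = fit_bilinear(Xs, y, w)
        h = predict_bilinear(params, Xs)
        err = np.sum(w * (h != y))
        if err >= 0.5:                      # base learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        ensemble.append((alpha, params))
        w *= np.exp(-alpha * y * h)
        w /= w.sum()
    return ensemble

def predict_ensemble(ensemble, Xs):
    return np.sign(sum(alpha * predict_bilinear(p, Xs) for alpha, p in ensemble))
```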
APIN 2010
A low variance error boosting algorithm
This paper introduces a robust variant of AdaBoost, cwAdaBoost, that uses weight perturbation to reduce variance error, and is particularly effective when dealing with da...
Ching-Wei Wang, Andrew Hunter
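A simplified way to picture the weight-perturbation idea is an AdaBoost loop that adds small multiplicative jitter to the sample weights before each weak learner is trained, so no single hard example can dominate a round. The sketch below is only an illustration under that assumption, not the paper's cwAdaBoost; the noise level and stump learner are arbitrary choices.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def perturbed_adaboost(X, y, rounds=50, noise=0.1, seed=0):
    """AdaBoost-style loop with jittered sample weights; y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        # Multiplicative jitter keeps any single hard example from dominating a round.
        w_t = w * rng.uniform(1.0 - noise, 1.0 + noise, size=n)
        w_t /= w_t.sum()
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w_t)
        h = stump.predict(X)
        err = np.sum(w_t * (h != y))
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        ensemble.append((alpha, stump))
        w *= np.exp(-alpha * y * h)     # the usual AdaBoost update on the clean weights
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    return np.sign(sum(a * clf.predict(X) for a, clf in ensemble))
```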
ICML 2005 (IEEE)
A smoothed boosting algorithm using probabilistic output codes
AdaBoost.OC has been shown to be an effective method for boosting "weak" binary classifiers for multi-class learning. It employs the Error Correcting Output Code (ECOC) method...
Rong Jin, Jian Zhang
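The ECOC reduction behind AdaBoost.OC can be pictured as follows: each round draws a random binary "coloring" of the classes, relabels the data to a two-class problem, trains a weighted binary learner, and at test time every class collects the weighted votes of the rounds whose coloring it matches. The code below is a simplified sketch of that reduction, not the exact AdaBoost.OC weight update and not the smoothed probabilistic-output-code variant the paper proposes; labels are assumed to be integers 0..n_classes-1.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def ecoc_boost(X, y, n_classes, rounds=100, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        # Random coloring mu: maps each class to -1 or +1 (redrawn if degenerate).
        mu = rng.choice([-1, 1], size=n_classes)
        while len(set(mu)) < 2:
            mu = rng.choice([-1, 1], size=n_classes)
        yb = mu[y]                                   # binary relabeling of the data
        stump = DecisionTreeClassifier(max_depth=2).fit(X, yb, sample_weight=w)
        h = stump.predict(X)
        err = np.sum(w * (h != yb))
        if err >= 0.5:
            continue
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        ensemble.append((alpha, mu, stump))
        # Up-weight examples whose coloring the round got wrong (simplified update).
        w *= np.exp(-alpha * yb * h)
        w /= w.sum()
    return ensemble

def predict(ensemble, X, n_classes):
    scores = np.zeros((len(X), n_classes))
    for alpha, mu, stump in ensemble:
        h = stump.predict(X)                         # round outputs in {-1, +1}
        scores += alpha * (h[:, None] == mu[None, :])
    return scores.argmax(axis=1)
```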