Sciweavers

FLAIRS
2006
Using Validation Sets to Avoid Overfitting in AdaBoost
AdaBoost is a well-known, effective technique for increasing the accuracy of learning algorithms. However, it has the potential to overfit the training set because its objective i...
Tom Bylander, Lisa Tate
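The idea of stopping boosting at the round that minimizes held-out error can be sketched as follows. This is a minimal illustration of validation-based early stopping with decision stumps on 1-D data, not the paper's exact procedure; all function names are illustrative.

```python
# Minimal sketch: AdaBoost with decision stumps, keeping only the rounds
# up to the point of lowest validation-set error (early stopping).
import math

def stump_predict(threshold, polarity, x):
    """Predict +1/-1 from a 1-D threshold stump."""
    return polarity if x >= threshold else -polarity

def train_stump(xs, ys, weights):
    """Pick the (threshold, polarity) pair minimizing weighted error."""
    best = None
    for t in sorted(set(xs)):
        for pol in (1, -1):
            err = sum(w for x, y, w in zip(xs, ys, weights)
                      if stump_predict(t, pol, x) != y)
            if best is None or err < best[0]:
                best = (err, t, pol)
    return best  # (weighted_error, threshold, polarity)

def ensemble_predict(ensemble, x):
    score = sum(a * stump_predict(t, pol, x) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

def adaboost_with_validation(xs, ys, val_xs, val_ys, rounds=10):
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []                      # list of (alpha, threshold, polarity)
    best_T, best_val_err = 0, float("inf")
    for _ in range(rounds):
        err, t, pol = train_stump(xs, ys, weights)
        err = max(err, 1e-10)
        if err >= 0.5:                 # weak learner no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Re-weight: misclassified examples get more mass.
        weights = [w * math.exp(-alpha * y * stump_predict(t, pol, x))
                   for x, y, w in zip(xs, ys, weights)]
        z = sum(weights)
        weights = [w / z for w in weights]
        # Track validation error of the ensemble so far.
        val_err = sum(1 for x, y in zip(val_xs, val_ys)
                      if ensemble_predict(ensemble, x) != y) / len(val_xs)
        if val_err < best_val_err:
            best_val_err, best_T = val_err, len(ensemble)
    return ensemble[:best_T]           # truncate at best validation round
```

The returned ensemble is cut back to the round where validation error was lowest, rather than running for the full budget of rounds.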
IJCAI
2007
Managing Domain Knowledge and Multiple Models with Boosting
We present MBoost, a novel extension of AdaBoost that uses multiple weak learners explicitly, and provides robustness to learning models that overfit or are po...
Peng Zang, Charles Lee Isbell Jr.
ALT
2010
Springer
Approximation Stability and Boosting
In recent years, stability has been explored as a way to study the performance of learning algorithms, and it has been shown that stability is sufficient for generalization and is sufficient ...
Wei Gao, Zhi-Hua Zhou
AVBPA
2003
Springer
LUT-Based Adaboost for Gender Classification
There are two main approaches to the problem of gender classification, Support Vector Machines (SVMs) and AdaBoost learning methods; of the two, SVMs achieve a better correct rate but ar...
Bo Wu, Haizhou Ai, Chang Huang
SIGIR
1998
ACM
Boosting and Rocchio Applied to Text Filtering
We discuss two learning algorithms for text filtering: modified Rocchio and a boosting algorithm called AdaBoost. We show how both algorithms can be adapted to maximize any gene...
Robert E. Schapire, Yoram Singer, Amit Singhal
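A Rocchio-style text filter of the kind discussed above can be sketched in a few lines. This is a generic illustration (not the paper's modified variant): score documents by their dot product with a prototype vector built from relevant minus non-relevant training documents. The `beta` and `gamma` weights and all names are illustrative.

```python
# Minimal sketch of a Rocchio-style filter: a prototype term vector is the
# weighted mean of relevant documents minus the weighted mean of
# non-relevant ones; documents are ranked by similarity to the prototype.
from collections import Counter

def prototype(relevant, nonrelevant, beta=1.0, gamma=0.5):
    """Build the prototype vector from tokenized training documents."""
    proto = Counter()
    for doc in relevant:
        for term, tf in Counter(doc).items():
            proto[term] += beta * tf / len(relevant)
    for doc in nonrelevant:
        for term, tf in Counter(doc).items():
            proto[term] -= gamma * tf / len(nonrelevant)
    return proto

def score(proto, doc):
    """Dot product of the prototype with the document's term counts."""
    return sum(proto[t] * tf for t, tf in Counter(doc).items())

rel = [["boosting", "margin"], ["boosting", "adaboost"]]
non = [["recipe", "cake"]]
proto = prototype(rel, non)
```

Documents sharing terms with the relevant set score positively, those sharing terms with the non-relevant set score negatively; a threshold on the score yields the filtering decision.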
KDD
1999
ACM
The Application of AdaBoost for Distributed, Scalable and On-Line Learning
We propose to use AdaBoost to efficiently learn classifiers over very large and possibly distributed data sets that cannot fit into main memory, as well as on-line learning wher...
Wei Fan, Salvatore J. Stolfo, Junxin Zhang
COLT
1999
Springer
Boosting as Entropy Projection
We consider the AdaBoost procedure for boosting weak learners. In AdaBoost, a key step is choosing a new distribution on the training examples based on the old distribution and th...
Jyrki Kivinen, Manfred K. Warmuth
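The distribution-update step the abstract refers to can be written out concretely. The sketch below shows the standard AdaBoost re-weighting rule and the property the entropy-projection view formalizes: with alpha = 0.5 * ln((1 - err) / err), the weak hypothesis has weighted error exactly 1/2 under the new distribution. Names are illustrative.

```python
# Minimal sketch: one AdaBoost distribution update.
# margins[i] = y_i * h(x_i) in {-1, +1}; err is the weighted error of h.
import math

def reweight(dist, margins):
    """Return the next example distribution after one boosting round."""
    err = sum(d for d, m in zip(dist, margins) if m < 0)
    alpha = 0.5 * math.log((1 - err) / err)
    new = [d * math.exp(-alpha * m) for d, m in zip(dist, margins)]
    z = sum(new)                       # normalization constant Z_t
    return [d / z for d in new]

dist = [0.25, 0.25, 0.25, 0.25]
margins = [1, 1, 1, -1]                # the fourth example is misclassified
new_dist = reweight(dist, margins)
# Under new_dist the hypothesis looks like a coin flip:
new_err = sum(d for d, m in zip(new_dist, margins) if m < 0)
```

Here `new_err` comes out to exactly 0.5, which is the decorrelation property that motivates viewing the update as a projection (in relative entropy) onto the set of distributions uncorrelated with the last hypothesis.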
IWBRS
2005
Springer
168views Biometrics» more  IWBRS 2005»
Gabor Feature Selection for Face Recognition Using Improved AdaBoost Learning
Though AdaBoost has been widely used for feature selection and classifier learning, many of the selected features, or weak classifiers, are redundant. By incorporating mutual infor...
LinLin Shen, Li Bai, Daniel Bardsley, Yangsheng Wa...
ICCV
2009
IEEE
Joint Pose Estimator and Feature Learning for Object Detection
A new learning strategy for object detection is presented. The proposed scheme forgoes the need to train a collection of detectors dedicated to homogeneous families of poses, an...
Karim Ali, Francois Fleuret, David Hasler and Pasc...
IJCNN
2007
IEEE
Two-stage Multi-class AdaBoost for Facial Expression Recognition
Although AdaBoost has achieved great success, it still suffers from the following problems: (1) the training process can become unmanageable when the number of features is extremely ...
Hongbo Deng, Jianke Zhu, Michael R. Lyu, Irwin Kin...