IWBRS 2005, Springer

Gabor Feature Selection for Face Recognition Using Improved AdaBoost Learning

Though AdaBoost has been widely used for feature selection and classifier learning, many of the selected features, or weak classifiers, are redundant. In this paper, we propose an improved boosting algorithm that incorporates mutual information into AdaBoost. The proposed method explicitly measures the redundancy between each candidate classifier and the already-selected classifiers, so the classifiers it selects are both accurate and non-redundant. Experimental results show that the strong classifier learned with the proposed algorithm achieves a lower training error rate than AdaBoost. The algorithm has also been applied to select discriminative Gabor features for face recognition. Even with a simple correlation distance measure and a 1-NN classifier, the selected Gabor features achieve high recognition accuracy on the FERET database, which contains both expression and illumination variation. When only 140 features are used, the selected features achieve as high as 95.5% accuracy, which is ab...
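The selection scheme described above can be illustrated with a short sketch. This is not the authors' exact algorithm; it is a minimal, assumed variant in which each boosting round picks the lowest-weighted-error weak classifier whose mutual information with every already-selected classifier stays below a threshold (the function names, the binary-output representation, and the `mi_threshold` parameter are all illustrative assumptions):

```python
import numpy as np

def mutual_information(a, b):
    """MI (in nats) between two binary prediction sequences."""
    mi = 0.0
    for va in (0, 1):
        for vb in (0, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a = np.mean(a == va)
            p_b = np.mean(b == vb)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def select_weak_classifiers(preds, y, n_select, mi_threshold=0.2):
    """AdaBoost-style selection with a redundancy check.

    preds: (n_candidates, n_samples) array of binary weak-classifier outputs
    y:     (n_samples,) array of binary labels
    """
    n_samples = preds.shape[1]
    w = np.full(n_samples, 1.0 / n_samples)  # AdaBoost sample weights
    selected, alphas = [], []
    for _ in range(n_select):
        # weighted error of every candidate under the current weights
        errs = np.array([(w * (p != y)).sum() for p in preds])
        chosen = None
        for idx in np.argsort(errs):
            if idx in selected:
                continue
            # reject candidates redundant w.r.t. any selected classifier
            if all(mutual_information(preds[idx], preds[j]) < mi_threshold
                   for j in selected):
                chosen = idx
                break
        if chosen is None:          # every remaining candidate is redundant
            break
        err = np.clip(errs[chosen], 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # standard AdaBoost weight update, mapping {0,1} outputs to {-1,+1}
        margin = (2 * preds[chosen] - 1) * (2 * y - 1)
        w *= np.exp(-alpha * margin)
        w /= w.sum()
        selected.append(int(chosen))
        alphas.append(alpha)
    return selected, alphas
```

With `mi_threshold` set large the loop reduces to plain AdaBoost selection; with it small, a duplicate of an already-chosen classifier is skipped even if its weighted error is lowest, which is the behavior the redundancy check is meant to enforce.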
Type: Conference
Year: 2005
Where: IWBRS
Authors: LinLin Shen, Li Bai, Daniel Bardsley, Yangsheng Wang