Sciweavers

313 search results - page 13 / 63
Search: Boosting with Diverse Base Classifiers
IJSI 2008
Co-Training by Committee: A Generalized Framework for Semi-Supervised Learning with Committees
Many data mining applications have a large amount of data, but labeling that data is often difficult, expensive, or time-consuming, as it requires annotation by human experts. Semi-supervised...
Mohamed Farouk Abdel Hady, Friedhelm Schwenker
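The committee-based semi-supervised idea can be sketched roughly as follows: train an ensemble (committee) on the small labeled set, let it label the unlabeled examples it is most confident about, and retrain on the enlarged set. The Python sketch below uses scikit-learn and is only an illustrative self-labeling loop, not the authors' exact Co-Training by Committee procedure; the confidence threshold and number of rounds are assumptions.

import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

def committee_self_training(X_lab, y_lab, X_unlab, rounds=5, conf_threshold=0.9):
    """Iteratively grow the labeled set with the committee's most confident predictions."""
    X_lab, y_lab, X_unlab = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    committee = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10)
    for _ in range(rounds):
        committee.fit(X_lab, y_lab)
        if len(X_unlab) == 0:
            break
        proba = committee.predict_proba(X_unlab)          # committee vote probabilities
        confident = proba.max(axis=1) >= conf_threshold   # examples the committee agrees on
        if not confident.any():
            break
        # Move confidently self-labeled examples into the training set.
        X_lab = np.vstack([X_lab, X_unlab[confident]])
        y_lab = np.concatenate([y_lab, committee.classes_[proba[confident].argmax(axis=1)]])
        X_unlab = X_unlab[~confident]
    return committee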
MCS 2007, Springer
Random Feature Subset Selection for Ensemble Based Classification of Data with Missing Features
Abstract. We report on our recent progress in developing an ensemble-of-classifiers algorithm for addressing the missing feature problem. Inspired in part by the random subspace...
Joseph DePasquale, Robi Polikar
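The general random-subspace idea behind this line of work can be sketched as follows: each ensemble member is trained on a random subset of the features, and at test time only members whose features are all observed in an instance are allowed to vote. The sketch below is a hedged illustration, not the authors' algorithm; class name, parameters, and the assumption of complete training data are all illustrative.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

class RandomSubspaceMissingEnsemble:
    """Ensemble where each member sees a random feature subset; members whose
    features are missing in a test instance abstain from the vote."""
    def __init__(self, n_members=25, subset_size=3, random_state=0):
        self.n_members, self.subset_size = n_members, subset_size
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        # Training data is assumed complete; missingness is handled at prediction time.
        n_features = X.shape[1]
        self.members_ = []
        for _ in range(self.n_members):
            feats = self.rng.choice(n_features, self.subset_size, replace=False)
            clf = DecisionTreeClassifier().fit(X[:, feats], y)
            self.members_.append((feats, clf))
        return self

    def predict_one(self, x):
        votes = []
        for feats, clf in self.members_:
            if not np.isnan(x[feats]).any():           # member usable only if its features are observed
                votes.append(clf.predict(x[feats].reshape(1, -1))[0])
        if not votes:
            raise ValueError("no member has all of its features observed")
        values, counts = np.unique(votes, return_counts=True)
        return values[counts.argmax()]                  # majority vote among usable members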
CEC 2008, IEEE
NichingEDA: Utilizing the diversity inside a population of EDAs for continuous optimization
Since Estimation of Distribution Algorithms (EDAs) were introduced, several single-model-based and mixture-model-based EDAs have been developed. Take Gaussian mod...
Weishan Dong, Xin Yao
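For context, a single-model Gaussian EDA for continuous optimization repeatedly fits a Gaussian to the best part of the population and samples the next population from it; NichingEDA, as the title indicates, additionally exploits diversity across a population of such models. The sketch below shows only the basic single-Gaussian loop with an illustrative sphere objective; it is not the NichingEDA algorithm itself, and all parameter values are assumptions.

import numpy as np

def gaussian_eda(objective, dim, pop_size=100, elite_frac=0.3, generations=50, seed=0):
    """Minimal single-model Gaussian EDA: select elites, refit mean/covariance, resample."""
    rng = np.random.default_rng(seed)
    mean, cov = np.zeros(dim), np.eye(dim) * 4.0       # broad initial search distribution
    for _ in range(generations):
        pop = rng.multivariate_normal(mean, cov, size=pop_size)
        fitness = np.apply_along_axis(objective, 1, pop)
        elites = pop[np.argsort(fitness)[: int(elite_frac * pop_size)]]
        mean = elites.mean(axis=0)                      # re-estimate the model from the elites
        cov = np.cov(elites, rowvar=False) + 1e-6 * np.eye(dim)
    return mean

best = gaussian_eda(lambda x: np.sum(x**2), dim=5)      # sphere function as a toy objective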
CVPR 2008, IEEE
Taylor expansion based classifier adaptation: Application to person detection
Because of the large variation across different environments, a generic classifier trained on extensive data-sets may perform sub-optimally in a particular test environment. In th...
Cha Zhang, Raffay Hamid, Zhengyou Zhang
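One common way to read the adaptation idea, offered here purely as an assumption rather than the paper's actual derivation, is to expand the adapted classifier around the generic parameters, f(x; w0 + d) ≈ f(x; w0) + d·grad f(x; w0), and learn only a small, regularized correction d from the few labeled samples available in the target environment. A logistic-regression version of that regularized-correction idea, with illustrative names and parameters:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adapt_linear_classifier(w0, X_target, y_target, lam=1.0, lr=0.1, steps=200):
    """Learn a small correction to generic weights w0 from scarce target-domain data;
    the L2 penalty keeps the adapted classifier close to the generic one."""
    delta = np.zeros_like(w0)
    for _ in range(steps):
        p = sigmoid(X_target @ (w0 + delta))            # adapted predictions
        grad = X_target.T @ (p - y_target) / len(y_target) + lam * delta
        delta -= lr * grad                              # gradient step on regularized log-loss
    return w0 + delta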
CIDM 2009, IEEE
An empirical study of bagging and boosting ensembles for identifying faulty classes in object-oriented software
Identifying faulty classes in object-oriented software is an important software quality assurance activity. This paper empirically investigates the application of t...
Hamoud I. Aljamaan, Mahmoud O. Elish
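As a generic illustration of the kind of ensembles such a study compares (not the paper's experimental setup), bagging and boosting classifiers over a matrix of class-level metrics can be built with scikit-learn; the synthetic data below stands in for real object-oriented metrics and fault labels.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Placeholder stand-in for object-oriented metrics (size, coupling, etc.) and fault labels.
X, y = make_classification(n_samples=300, n_features=10, weights=[0.8, 0.2], random_state=0)

bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")   # AUC copes with class imbalance
    print(name, scores.mean().round(3))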