Sciweavers

522 search results - page 7 / 105
» Feature selection for ranking using boosted trees
ESANN
2006
14 years 11 months ago
Random Forests Feature Selection with K-PLS: Detecting Ischemia from Magnetocardiograms
Random Forests were introduced by Breiman for feature (variable) selection and improved predictions for decision tree models. The resulting model is often superior to AdaBoost and ...
Long Han, Mark J. Embrechts, Boleslaw K. Szymanski...
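The entry above pairs Random Forest feature selection with a K-PLS predictor. Below is a minimal sketch of the Random Forest half only, using scikit-learn's impurity-based importances to keep the top-k features; the K-PLS stage and the magnetocardiogram data are not reproduced, and the dataset and parameters are illustrative assumptions, not the paper's setup.

```python
# Hypothetical sketch: rank features by Random Forest importance and keep the top k.
# The synthetic dataset and all hyperparameters are assumptions for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=40, n_informative=8, random_state=0)

forest = RandomForestClassifier(n_estimators=300, random_state=0)
forest.fit(X, y)

# Keep the k features with the largest impurity-based importances.
k = 8
top_k = np.argsort(forest.feature_importances_)[::-1][:k]
X_selected = X[:, top_k]
print("selected feature indices:", sorted(top_k.tolist()))
```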
ICML
2001
IEEE
15 years 10 months ago
Filters, Wrappers and a Boosting-Based Hybrid for Feature Selection
In this paper, we examine the advantages and disadvantages of filter and wrapper methods for feature selection and propose a new hybrid algorithm that uses boosting and incorporat...
Sanmay Das
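As background for the filter/wrapper distinction discussed in that abstract, here is an illustrative scikit-learn sketch, not the paper's boosting-based hybrid: a filter scores each feature independently of any learner, while a wrapper evaluates feature subsets by cross-validated model performance. The dataset, learner, and subset sizes are assumptions.

```python
# Illustrative contrast between a filter and a wrapper method (not the paper's algorithm).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif, SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

# Filter: rank features by mutual information with the label, keep the top 5.
X_filter = SelectKBest(mutual_info_classif, k=5).fit_transform(X, y)

# Wrapper: greedy forward selection driven by cross-validated accuracy of a learner.
wrapper = SequentialFeatureSelector(LogisticRegression(max_iter=1000),
                                    n_features_to_select=5, direction="forward", cv=3)
X_wrapper = wrapper.fit_transform(X, y)
print(X_filter.shape, X_wrapper.shape)
```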
IR
2010
14 years 8 months ago
Adapting boosting for information retrieval measures
We present a new ranking algorithm that combines the strengths of two previous methods: boosted tree classification and LambdaRank, which has been shown to be empirically...
Qiang Wu, Christopher J. C. Burges, Krysta Marie S...
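The combination described above, gradient-boosted trees driven by LambdaRank-style gradients, is widely known as LambdaMART. As a hedged illustration only (not the authors' code), LightGBM's lambdarank objective implements the same idea; the synthetic queries, labels, and hyperparameters below are assumptions.

```python
# Sketch of a LambdaMART-style ranker via LightGBM's lambdarank objective.
# Data and parameters are placeholders, not the paper's experimental setup.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))      # document feature vectors
y = rng.integers(0, 5, size=1000)    # graded relevance labels (0..4)
group = [100] * 10                   # 10 queries, 100 documents each

ranker = lgb.LGBMRanker(objective="lambdarank", n_estimators=200, learning_rate=0.05)
ranker.fit(X, y, group=group)

scores = ranker.predict(X[:100])     # ranking scores for the first query
print(scores[:5])
```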
CVPR
2006
IEEE
15 years 11 months ago
Joint Boosting Feature Selection for Robust Face Recognition
A fundamental challenge in face recognition lies in determining which facial features are important for identifying faces. In this paper, a novel face recognition framework...
Rong Xiao, Wu-Jun Li, Yuandong Tian, Xiaoou Tang
SEBD
2008
14 years 11 months ago
Using PageRank in Feature Selection
Feature selection is an important task in data mining because it reduces the data dimensionality and eliminates noisy variables. Traditionally, feature selection...
Dino Ienco, Rosa Meo, Marco Botta
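As a speculative illustration of the idea in this last entry, one could score features with PageRank over a feature-similarity graph. The construction below (absolute pairwise correlation as edge weights, ranked with networkx's PageRank) is an assumption for demonstration and not necessarily the method of Ienco, Meo, and Botta.

```python
# Speculative sketch: rank features by PageRank on a correlation-weighted feature graph.
# Graph construction and data are illustrative assumptions, not the paper's procedure.
import numpy as np
import networkx as nx
from sklearn.datasets import make_classification

X, _ = make_classification(n_samples=400, n_features=15, random_state=0)
corr = np.abs(np.corrcoef(X, rowvar=False))   # feature-by-feature similarity

G = nx.Graph()
n = corr.shape[0]
G.add_weighted_edges_from((i, j, corr[i, j]) for i in range(n) for j in range(i + 1, n))

scores = nx.pagerank(G, alpha=0.85, weight="weight")
top = sorted(scores, key=scores.get, reverse=True)[:5]
print("top-ranked features:", top)
```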