Supervised feature selection via dependence estimation

We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence between the features and the labels. The key idea is that good features should maximise such dependence. Feature selection for various supervised learning problems (including classification and regression) is unified under this framework, and the solutions can be approximated using a backward-elimination algorithm. We demonstrate the usefulness of our method on both artificial and real-world datasets.
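A minimal sketch of the idea described in the abstract (not the authors' reference implementation): score a candidate feature subset by the biased empirical HSIC between a kernel on those features and a kernel on the labels, then greedily eliminate the feature whose removal keeps that score largest. The Gaussian kernel on features, the linear kernel on labels, and the names gaussian_kernel, hsic, and backward_eliminate are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix over the rows of X (assumed kernel choice).
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # with H = I - (1/n) * ones the centering matrix.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def backward_eliminate(X, y, n_keep):
    # Greedy backward elimination: repeatedly drop the feature whose
    # removal leaves the HSIC between the remaining features and the
    # labels as large as possible.
    L = np.outer(y, y).astype(float)  # linear kernel on the labels (assumption)
    selected = list(range(X.shape[1]))
    while len(selected) > n_keep:
        scores = [
            hsic(gaussian_kernel(X[:, [k for k in selected if k != j]]), L)
            for j in selected
        ]
        selected.pop(int(np.argmax(scores)))  # drop the feature whose absence scores best
    return selected
```

For example, `backward_eliminate(X, y, n_keep=5)` with an n-by-d array `X` and labels `y` in {-1, +1} returns the indices of the five retained features. The linear label kernel covers binary classification and regression; other problems would call for a different label kernel.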
Added 17 Nov 2009
Updated 17 Nov 2009
Type Conference
Year 2007
Where ICML
Authors Le Song, Alex J. Smola, Arthur Gretton, Karsten M. Borgwardt, Justin Bedo