Sciweavers

153 search results for "Supervised feature selection via dependence estimation"
ICML 2007
Supervised feature selection via dependence estimation
We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence between the features and the labels. The ...
Le Song, Alex J. Smola, Arthur Gretton, Karsten M....
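
To make the filtering recipe concrete, below is a minimal sketch of per-feature HSIC scoring. The biased empirical estimator tr(KHLH)/(n-1)^2, the RBF kernel on each individual feature, the delta kernel on class labels, and the top-k ranking are assumptions of this sketch; the paper's exact kernels and selection strategy may differ.

    import numpy as np

    def rbf_kernel(x, gamma=1.0):
        # Gram matrix of an RBF kernel over a single feature column x of shape (n,)
        d2 = (x[:, None] - x[None, :]) ** 2
        return np.exp(-gamma * d2)

    def hsic_score(x, y):
        # Biased empirical HSIC: tr(K H L H) / (n - 1)^2, with a delta kernel on labels
        n = len(x)
        K = rbf_kernel(x)
        L = (y[:, None] == y[None, :]).astype(float)
        H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2

    def rank_features(X, y):
        # Score every feature independently; larger HSIC = stronger dependence on y
        scores = np.array([hsic_score(X[:, j], y) for j in range(X.shape[1])])
        return np.argsort(scores)[::-1], scores

Keeping the top-k indices returned by rank_features then gives a simple filter-style selection.
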
JMLR 2010
Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation
The goal of sufficient dimension reduction in supervised learning is to find the low-dimensional subspace of input features that is 'sufficient' for predicting output values. ...
Taiji Suzuki, Masashi Sugiyama
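
Purely as an illustration (not necessarily the estimator the paper derives), squared-loss mutual information between a candidate projection Z = XW and class labels can be approximated by a least-squares density-ratio fit; the Gaussian basis on Z, the delta kernel on the labels, the ridge regularizer, and the plug-in form SMI ≈ h·a/2 - 1/2 below are all assumptions.

    import numpy as np

    def smi_estimate(Z, y, gamma=1.0, lam=1e-3):
        # Toy SMI estimate between projected features Z (n, d) and class labels y (n,)
        n = len(y)
        d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        K = np.exp(-gamma * d2)                        # Gaussian basis on Z, centred at the samples
        L = (y[:, None] == y[None, :]).astype(float)   # delta kernel on the labels
        # Least-squares fit of the density ratio r(z, y) ~ sum_l a_l K(z, z_l) L(y, y_l)
        H = (K.T @ K) * (L.T @ L) / n ** 2
        h = (K * L).mean(axis=0)
        a = np.linalg.solve(H + lam * np.eye(n), h)
        return 0.5 * h @ a - 0.5                       # plug-in approximation of SMI

A projection W would then be chosen to maximize smi_estimate(X @ W, y), for example by a greedy or gradient-based search over orthonormal W.
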
ICML 2009
Partially supervised feature selection with regularized linear models
This paper addresses feature selection techniques for classification of high-dimensional data, such as those produced by microarray experiments. Some prior knowledge may be availa...
Thibault Helleputte, Pierre Dupont
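
One simple way to emulate that kind of partial supervision with off-the-shelf tools, sketched below under assumptions of this sketch (it is not the paper's algorithm), is to rescale the features flagged as relevant by prior knowledge so that a uniform L1 penalty shrinks them less; prior_relevant, the boost factor, and the choice of L1-regularized logistic regression are illustrative.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def select_features(X, y, prior_relevant, boost=3.0, C=1.0):
        # Rescale a-priori relevant columns so the uniform L1 penalty affects them less
        scale = np.ones(X.shape[1])
        scale[prior_relevant] = boost
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
        clf.fit(X * scale, y)
        # Features with at least one non-zero coefficient are kept
        return np.flatnonzero((clf.coef_ != 0).any(axis=0))

Multiplying a column by boost means the same effective weight costs boost times less penalty, which is the embedded-selection analogue of trusting that feature a priori.
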
FUIN 2010
Feature Selection via Maximizing Fuzzy Dependency
Feature selection is an important preprocessing step in pattern analysis and machine learning. The key issue in feature selection is to evaluate the quality of candidate features. In t...
Qinghua Hu, Pengfei Zhu, Jinfu Liu, Yongbin Yang, ...
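
As one common instantiation of such a dependency measure (assumed here; the paper's similarity relation and search strategy may differ), the sketch below scores a feature subset by each sample's fuzzy lower-approximation membership in its own class under a Gaussian similarity relation, and adds features greedily.

    import numpy as np

    def fuzzy_dependency(X, y, feats, sigma=0.5):
        # Dependency of the labels on the feature subset `feats`
        Z = X[:, feats]
        d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        R = np.exp(-d2 / sigma)                        # fuzzy similarity relation
        diff = y[:, None] != y[None, :]                # pairs with different labels
        # Lower-approximation membership: inf over differently labelled samples of (1 - R)
        lower = np.where(diff, 1.0 - R, 1.0).min(axis=1)
        return lower.mean()

    def greedy_select(X, y, k):
        # Forward search: repeatedly add the feature that most increases the dependency
        selected = []
        for _ in range(k):
            rest = [j for j in range(X.shape[1]) if j not in selected]
            best = max(rest, key=lambda j: fuzzy_dependency(X, y, selected + [j]))
            selected.append(best)
        return selected
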
PKDD 2009 (Springer)
Feature Selection by Transfer Learning with Linear Regularized Models
This paper presents a novel feature selection method for classification of high-dimensional data, such as those produced by microarrays. It includes a partial supervisio...
Thibault Helleputte, Pierre Dupont
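
Continuing the illustrative sketch under the related entry above, and only as an assumption about how such a transfer prior might be built, one could fit a linear model on a related source task and treat its largest-magnitude coefficients as the a-priori relevant features for the target task.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def transfer_prior(X_src, y_src, top_k=50, C=1.0):
        # Fit an L2-regularized linear model on the source task and keep the
        # indices of its top_k largest-magnitude coefficients as the prior
        src = LogisticRegression(C=C, max_iter=1000).fit(X_src, y_src)
        weights = np.abs(src.coef_).max(axis=0)
        return np.argsort(weights)[::-1][:top_k]

These indices could then be passed as prior_relevant to the selection sketch above.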