Sciweavers

77 search results for "Gradient LASSO for feature selection" - page 1 / 16
ICML 2004 (IEEE)
Gradient LASSO for feature selection
LASSO (Least Absolute Shrinkage and Selection Operator) is a useful tool for achieving shrinkage and variable selection simultaneously. Since LASSO uses the L1 penalty, the optim...
Yongdai Kim, Jinseog Kim
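
For context, the standard LASSO estimator referenced in this abstract solves an L1-penalized least-squares problem; this is the textbook formulation, not a quotation from the paper:

\hat{\beta} = \arg\min_{\beta} \tfrac{1}{2} \| y - X\beta \|_2^2 + \lambda \| \beta \|_1

The non-differentiable L1 term is what drives some coefficients exactly to zero (variable selection), and it is also what makes the optimization non-trivial, motivating dedicated algorithms such as a gradient-based LASSO solver.
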
CORR 2010 (Springer)
Fast Overlapping Group Lasso
The group Lasso is an extension of the Lasso for feature selection on (predefined) non-overlapping groups of features. The non-overlapping group structure limits its applicability...
Jun Liu, Jieping Ye
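
The non-overlapping group Lasso penalty mentioned in this abstract is usually written as below (standard textbook form, not necessarily the paper's notation), where the groups g = 1, ..., G partition the features and w_g is a per-group weight, often \sqrt{|g|}:

\lambda \sum_{g=1}^{G} w_g \| \beta_g \|_2

Because the \| \cdot \|_2 term is applied to disjoint blocks of coefficients, the penalty tends to keep or discard whole groups; handling overlapping groups, the subject of the paper, requires going beyond this basic form.
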
JMLR 2010
Exclusive Lasso for Multi-task Feature Selection
We propose a novel group regularization which we call exclusive lasso. Unlike the group lasso regularizer that assumes covarying variables in groups, the proposed exclusive lasso ...
Yang Zhou, Rong Jin, Steven C. H. Hoi
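
The exclusive lasso regularizer is commonly written as a squared L1 norm within each group (standard form, stated here for context and possibly differing from the paper's exact notation):

\lambda \sum_{g=1}^{G} \| \beta_g \|_1^2

In contrast to the group lasso's \| \beta_g \|_2 term, which keeps or drops a group as a whole, the squared L1 term induces competition among features inside the same group, so that typically only a few features per group are selected.
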
JMLR 2006
On Model Selection Consistency of Lasso
Sparsity or parsimony of statistical models is crucial for their proper interpretation, as in the sciences and social sciences. Model selection is a commonly used method to find such...
Peng Zhao, Bin Yu
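
Here, model selection consistency means that, as the sample size grows, the Lasso recovers exactly the set of truly nonzero coefficients with probability tending to one. Work in this line centers on an "irrepresentable"-type condition on the design covariance; a commonly quoted form (stated from memory, so constants and notation may differ from the paper) is

\| C_{21} C_{11}^{-1} \, \mathrm{sign}(\beta_1) \|_\infty \le 1 - \eta \quad \text{for some } \eta > 0,

where C_{11} is the covariance block of the relevant features, C_{21} the cross-covariance between irrelevant and relevant features, and \beta_1 the vector of true nonzero coefficients.
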
JMLR 2010
Increasing Feature Selection Accuracy for L1 Regularized Linear Models
L1 (also referred to as the 1-norm or Lasso) penalty-based formulations have been shown to be effective in problem domains where noisy features are present. However, the L1 penalty...
Abhishek Jaiantilal, Gregory Z. Grudic
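
As a rough illustration of L1-based feature selection in the presence of noisy features, the following sketch uses scikit-learn's Lasso (a generic L1-penalized linear model, not the method proposed in this paper); the data-generating setup is hypothetical.

# Minimal, hypothetical illustration (not the authors' method): fit an
# L1-penalized linear model and inspect which coefficients stay nonzero.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p_informative, p_noise = 200, 5, 45
X = rng.normal(size=(n, p_informative + p_noise))
true_coef = np.zeros(p_informative + p_noise)
true_coef[:p_informative] = [3.0, -2.0, 1.5, 1.0, -1.0]
y = X @ true_coef + rng.normal(scale=0.5, size=n)  # only 5 informative features

model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)  # indices kept by the L1 penalty
print("selected features:", selected)

With a suitable penalty strength, most of the purely noisy columns receive exactly-zero coefficients, which is the selection behavior the abstract alludes to.
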