Sciweavers

» Quadratic Programming Feature Selection
AAAI
2011
Size Adaptive Selection of Most Informative Features
In this paper, we propose a novel method to select the most informative subset of features, which has little redundancy and very strong discriminating power. Our proposed approach...
Si Liu, Hairong Liu, Longin Jan Latecki, Shuicheng...
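The trade-off this abstract targets, low redundancy plus strong discriminating power, is the same one pursued by generic relevance-minus-redundancy selectors. A minimal sketch of that generic idea (not the paper's size-adaptive method), using absolute correlation as a stand-in informativeness measure:

```python
import numpy as np

def greedy_select(X, y, k):
    """Greedily pick k features maximizing relevance to y (|corr(x, y)|)
    minus mean redundancy with already-chosen features.
    A generic mRMR-style sketch, not the paper's size-adaptive method."""
    n_feat = X.shape[1]
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_feat)])
    chosen = [int(np.argmax(relevance))]
    while len(chosen) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in chosen:
                continue
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, c])[0, 1]) for c in chosen])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        chosen.append(best)
    return chosen
```

The paper additionally adapts the subset size k itself; here it is fixed by the caller.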
NIPS
2003
Fast Feature Selection from Microarray Expression Data via Multiplicative Large Margin Algorithms
New feature selection algorithms for linear threshold functions are described that combine backward elimination with an adaptive regularization method. This makes them particular...
Claudio Gentile
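The backward-elimination half of this combination can be sketched generically: repeatedly drop the feature whose removal hurts a fitness score the least. This shows only the elimination loop with a toy correlation scorer; the paper's contribution is pairing it with multiplicative large-margin updates and adaptive regularization, which are not reproduced here.

```python
import numpy as np

def backward_elimination(X, y, score, k):
    """Generic backward elimination: repeatedly drop the feature whose
    removal degrades the score the least, until k features remain."""
    features = list(range(X.shape[1]))
    while len(features) > k:
        best_subset_score, to_drop = -np.inf, None
        for f in features:
            rest = [g for g in features if g != f]
            s = score(X[:, rest], y)
            if s > best_subset_score:
                best_subset_score, to_drop = s, f
        features.remove(to_drop)
    return features

# Toy scorer: total |correlation| of the remaining features with the labels.
def corr_score(Xs, y):
    return sum(abs(np.corrcoef(Xs[:, j], y)[0, 1]) for j in range(Xs.shape[1]))
```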
ICDM
2008
IEEE
Direct Zero-Norm Optimization for Feature Selection
Zero-norm, defined as the number of non-zero elements in a vector, is an ideal quantity for feature selection. However, minimization of zero-norm is generally regarded as a combi...
Kaizhu Huang, Irwin King, Michael R. Lyu
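For reference, the zero-norm this abstract defines is simply a count of non-zero components. Evaluating it is trivial (a tolerance is needed in floating point); the hard part the paper addresses is minimizing it, since the count is non-convex and discontinuous:

```python
import numpy as np

def zero_norm(w, tol=1e-12):
    """The "zero-norm" ||w||_0: number of elements with |w_i| > tol.
    Counting is easy; minimizing it subject to classification constraints
    is the combinatorial problem the paper attacks directly."""
    return int(np.count_nonzero(np.abs(w) > tol))

zero_norm(np.array([0.0, 1.5, 0.0, -0.3]))  # two non-zero weights
```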
APPML
2007
Steplength selection in interior-point methods for quadratic programming
We present a new strategy for choosing primal and dual steplengths in a primal-dual interior-point algorithm for convex quadratic programming. Current implementations often scale ...
Frank E. Curtis, Jorge Nocedal
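For context, the conventional steplength choice that implementations like those the abstract mentions build on is the fraction-to-boundary rule: take the largest step in (0, 1] that keeps the iterate strictly inside the non-negative orthant, backed off by a factor tau. A minimal sketch of that baseline rule (not the paper's new strategy):

```python
import numpy as np

def fraction_to_boundary(x, dx, tau=0.995):
    """Largest alpha in (0, 1] such that x + alpha*dx stays positive,
    scaled back by tau < 1. Applied separately to the primal and dual
    iterates, this is the conventional steplength rule in interior-point
    codes; the paper proposes a different selection strategy."""
    neg = dx < 0
    if not np.any(neg):
        return 1.0
    return float(min(1.0, tau * np.min(-x[neg] / dx[neg])))
```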
CASES
2010
ACM
Instruction selection by graph transformation
Generated instruction selectors are commonly based on tree pattern matching, but modern and custom architectures feature instructions that cannot be covered by trees. To overcome...
Sebastian Buchwald, Andreas Zwinkau
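To illustrate the tree-matching baseline the abstract refers to: a tiny matcher that covers the tree pattern add(mul(a, b), c) with a single fused multiply-add tile. Tree tiles like this cannot cover shared subexpressions or other non-tree shapes, which is the limitation graph transformation addresses. The Node/match_madd names are illustrative, not from the paper:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    op: str
    args: tuple = ()

def match_madd(n):
    """Cover the tree pattern add(mul(a, b), c) with one 'madd'
    instruction; return None if the pattern does not apply."""
    if n.op == "add" and len(n.args) == 2 and n.args[0].op == "mul":
        a, b = n.args[0].args
        return ("madd", a, b, n.args[1])
    return None

expr = Node("add", (Node("mul", (Node("x"), Node("y"))), Node("z")))
match_madd(expr)  # covers the whole expression with a single instruction
```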