Sciweavers

454 search results (page 1 of 91) for "Principal Component Analysis Based on L1-Norm Maximization"
PAMI 2008
Principal Component Analysis Based on L1-Norm Maximization
In data-analysis problems with a large number of dimensions, principal component analysis based on the L2-norm (L2-PCA) is one of the most popular methods, but L2-PCA is sensitive to out...
Nojun Kwak
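As a rough illustration of the objective named in the title — finding a unit vector w that maximizes Σᵢ |wᵀxᵢ| rather than Σᵢ (wᵀxᵢ)² — here is a minimal numpy sketch of the well-known greedy sign-flipping iteration for this problem. The function name, initialization, and toy data are illustrative assumptions, not the paper's reference code.

```python
import numpy as np

def l1_pca_direction(X, n_iter=100, tol=1e-8, seed=0):
    """Greedy search for a unit vector w maximizing sum_i |w^T x_i|.

    X : (n_samples, n_features) array, assumed already centered.
    Returns a unit-norm direction w.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        signs = np.sign(X @ w)
        signs[signs == 0] = 1.0          # tie-break for samples orthogonal to w
        w_new = X.T @ signs              # re-signed sum of samples
        w_new /= np.linalg.norm(w_new)
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w

# Toy data with one gross outlier: the L1 direction is pulled far less than the L2 one.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(50, 2)) @ np.diag([3.0, 0.3]), [[0.0, 40.0]]])
X -= X.mean(axis=0)
print(l1_pca_direction(X))
```

Each iteration cannot decrease Σᵢ |wᵀxᵢ|, so the loop converges to a local maximizer of the L1 objective.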
ICML 2006
R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization
Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PC...
Chris H. Q. Ding, Ding Zhou, Xiaofeng He, Hongyuan...
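A simplified sketch of the rotationally invariant objective the abstract refers to — minimizing the sum of per-sample Euclidean residual norms Σᵢ ‖xᵢ − UUᵀxᵢ‖₂ instead of the sum of squared residuals. The paper's actual algorithm (subspace iteration with a Huber-type weighting) differs; this is only an assumed iteratively reweighted approximation with a hypothetical function name.

```python
import numpy as np

def r1_pca(X, k, n_iter=50, eps=1e-8):
    """IRLS-style sketch of rotationally invariant L1-norm PCA.

    Approximately minimizes sum_i ||x_i - U U^T x_i||_2 over
    column-orthonormal U of shape (n_features, k).
    X is (n_samples, n_features), assumed centered.
    """
    # Start from ordinary (L2) PCA.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    U = Vt[:k].T
    for _ in range(n_iter):
        resid = X - (X @ U) @ U.T                       # per-sample reconstruction error
        w = 1.0 / np.maximum(np.linalg.norm(resid, axis=1), eps)
        C = (X * w[:, None]).T @ X                      # reweighted covariance
        _, vecs = np.linalg.eigh(C)
        U = vecs[:, -k:]                                # top-k eigenvectors
    return U
```

Down-weighting samples by their residual norm is what makes the subspace estimate far less sensitive to outliers than the squared-error version.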
AAAI 2008
Sparse Projections over Graph
Recent studies have shown that canonical algorithms such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) can be obtained from graph-based dimensionality ...
Deng Cai, Xiaofei He, Jiawei Han
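The snippet's premise — that PCA can be recovered from a graph-based dimensionality-reduction framework — can be checked numerically: for a complete graph with uniform weights 1/n, the graph Laplacian equals the centering matrix, so the graph eigenproblem reproduces ordinary PCA. A toy numpy check, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))           # rows are samples

n = X.shape[0]
W = np.full((n, n), 1.0 / n)            # complete graph, uniform weights
D = np.diag(W.sum(axis=1))              # = identity here
L = D - W                               # Laplacian = centering matrix I - (1/n) 11^T

# Graph view: leading eigenvectors of X^T L X ...
_, vecs_g = np.linalg.eigh(X.T @ L @ X)
# ... match the usual PCA directions from the centered scatter matrix.
Xc = X - X.mean(axis=0)
_, vecs_p = np.linalg.eigh(Xc.T @ Xc)

print(np.allclose(np.abs(vecs_g[:, -1]), np.abs(vecs_p[:, -1])))  # True
```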
COMPLIFE 2006, Springer
Set-Oriented Dimension Reduction: Localizing Principal Component Analysis Via Hidden Markov Models
We present a method for simultaneous dimension reduction and metastability analysis of high-dimensional time series. The approach is based on the combination of hidden Markov model...
Illia Horenko, Johannes Schmidt-Ehrenberg, Christo...
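A rough sketch of the combination the abstract describes — fit a hidden Markov model to the time series, then run a separate (local) PCA on the samples assigned to each hidden, metastable state. It assumes the hmmlearn package as a stand-in for the paper's HMM machinery and is not the authors' estimator.

```python
import numpy as np
from hmmlearn import hmm   # assumed available; any Gaussian HMM implementation would do

def local_pca_via_hmm(X, n_states=3, n_components=2, seed=0):
    """Hidden-state-localized PCA sketch.

    X : (n_timesteps, n_features) time series.
    Returns {state: (mean, components)} with components of shape
    (n_components, n_features) per metastable state.
    """
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="full",
                            random_state=seed)
    model.fit(X)
    states = model.predict(X)            # Viterbi state sequence

    local = {}
    for s in range(n_states):
        Xs = X[states == s]              # samples assigned to this state
        mu = Xs.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xs - mu, full_matrices=False)
        local[s] = (mu, Vt[:n_components])
    return local
```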
ICDM 2007, IEEE
Spectral Regression: A Unified Approach for Sparse Subspace Learning
Recently, the problem of dimensionality reduction (or subspace learning) has received a lot of interest in many fields of information processing, including data mining, informati...
Deng Cai, Xiaofei He, Jiawei Han
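A sketch of the two-step spectral-regression idea suggested by the title: first embed the samples by solving a graph eigenproblem, then recover sparse projection vectors by L1-penalized regression of the data onto that embedding. The function name, eigen-solver details, and the use of scikit-learn's Lasso are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso   # L1-penalized regression

def spectral_regression_sparse(X, W, n_dims=2, alpha=0.01):
    """Two-step sparse subspace learning sketch.

    X : (n_samples, n_features) data matrix.
    W : (n_samples, n_samples) symmetric affinity matrix.
    Returns a (n_features, n_dims) sparse projection matrix.
    """
    # Step 1: graph embedding from the eigenproblem W y = lam D y,
    # solved via the symmetrically normalized affinity D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    _, vecs = np.linalg.eigh(D_inv_sqrt @ W @ D_inv_sqrt)
    Y = D_inv_sqrt @ vecs[:, -(n_dims + 1):-1]   # skip the trivial top eigenvector

    # Step 2: L1-regularized regression of X onto each embedding dimension,
    # which yields sparse projection vectors.
    A = np.zeros((X.shape[1], n_dims))
    for j in range(n_dims):
        A[:, j] = Lasso(alpha=alpha, max_iter=10000).fit(X, Y[:, j]).coef_
    return A
```

Solving the small eigenproblem once and then fitting ordinary regularized regressions is what makes this family of methods cheap compared with dense generalized eigen-decompositions, and swapping the L1 penalty for an L2 one recovers non-sparse subspace learning.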