ICCV
2007
IEEE

Spectral Regression for Efficient Regularized Subspace Learning

Subspace learning based face recognition methods have attracted considerable interest in recent years, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Locality Preserving Projection (LPP), Neighborhood Preserving Embedding (NPE), and Marginal Fisher Analysis (MFA). However, a disadvantage of all these approaches is that their computations involve eigen-decomposition of dense matrices, which is expensive in both time and memory. In this paper, we propose a novel dimensionality reduction framework, called Spectral Regression (SR), for efficient regularized subspace learning. SR casts the problem of learning the projective functions into a regression framework, which avoids eigen-decomposition of dense matrices. Moreover, with the regression-based framework, different kinds of regularizers can be naturally incorporated into our algorithm, making it more flexible. Computational analysis shows that SR has only linear-time complexity, which is a huge speed up ...
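The two-stage idea the abstract describes — first obtain embedding responses from a graph eigenproblem, then recover the projective functions by regularized regression rather than a dense eigen-decomposition of the scatter matrix — can be sketched as follows. This is an illustrative toy only (the graph construction, `k`, and `alpha` are assumptions, and a real SR implementation would solve the graph eigenproblem with sparse solvers):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))        # 100 samples, 20 features (toy data)

# Step 1: affinity graph (heat kernel, pruned to self + k nearest neighbours).
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / D2.mean())
k = 5
for i in range(W.shape[0]):
    drop = np.argsort(W[i])[:-(k + 1)]    # zero all but the k+1 largest entries
    W[i, drop] = 0.0
W = np.maximum(W, W.T)                    # symmetrize

# Step 2: embedding responses y from the graph eigenproblem W y = lam D y,
# solved here via the normalized form D^{-1/2} W D^{-1/2} u = lam u.
# (In practice W is sparse, so this step is cheap with iterative solvers.)
d = W.sum(axis=1)
dinv_sqrt = 1.0 / np.sqrt(d)
Wn = dinv_sqrt[:, None] * W * dinv_sqrt[None, :]
vals, U = np.linalg.eigh(Wn)              # eigenvalues in ascending order
Y = (dinv_sqrt[:, None] * U)[:, -3:-1]    # top 2 non-trivial responses

# Step 3: ridge regression X a = y recovers the projective functions without
# eigen-decomposing any dense feature-space matrix; alpha is the regularizer
# the framework lets us swap freely.
alpha = 0.1
A = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)
Z = X @ A                                 # low-dimensional projection
print(Z.shape)                            # (100, 2)
```

The regression step is where the flexibility claimed in the abstract comes from: replacing the ridge penalty with, say, a lasso penalty changes the regularizer without altering the spectral step.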
Deng Cai, Xiaofei He, Jiawei Han
Added 14 Oct 2009
Updated 14 Oct 2009
Type Conference
Year 2007
Where ICCV
Authors Deng Cai, Xiaofei He, Jiawei Han