A scalable two-stage approach for a class of dimensionality reduction techniques

Dimensionality reduction plays an important role in many data mining applications involving high-dimensional data. Many existing dimensionality reduction techniques can be formulated as a generalized eigenvalue problem, which does not scale to large-scale problems. Prior work transforms the generalized eigenvalue problem into an equivalent least squares formulation, which can then be solved efficiently. However, the equivalence relationship only holds under certain assumptions without regularization, which severely limits its applicability in practice. In this paper, an efficient two-stage approach is proposed to solve a class of dimensionality reduction techniques, including Canonical Correlation Analysis, Orthonormal Partial Least Squares, Linear Discriminant Analysis, and Hypergraph Spectral Learning. The proposed two-stage approach scales linearly in both the sample size and the data dimensionality. The main contributions of this paper include (1) we rigorously establish the...
Type Conference
Year 2010
Where KDD (ACM)
Authors Liang Sun, Betul Ceran, Jieping Ye
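
The abstract contrasts generalized eigenvalue formulations, which require forming and factoring large matrices, with least-squares formulations that iterative solvers can handle in time linear in the sample size and dimensionality. The Python sketch below illustrates that contrast for Linear Discriminant Analysis under stated assumptions; it is not the paper's two-stage algorithm, and the function names, the ridge regularizer, and the class-indicator target are illustrative choices introduced here.

# A minimal sketch, not the paper's method: classical LDA via a generalized
# eigenvalue problem versus a regularized least-squares surrogate solved with
# LSQR, whose per-iteration cost is linear in the sample size n and the
# dimensionality d. All names and constants here are illustrative assumptions.
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.linalg import lsqr

def lda_generalized_eig(X, y, reg=1e-6):
    # LDA as the generalized eigenvalue problem S_b w = lambda * S_w w.
    # Forming and factoring the d x d scatter matrices costs O(n d^2 + d^3),
    # which is what prevents scaling to high-dimensional data.
    classes = np.unique(y)
    d = X.shape[1]
    mu = X.mean(axis=0)
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        S_b += Xc.shape[0] * (diff @ diff.T)
    # A small ridge term keeps S_w positive definite.
    evals, evecs = eigh(S_b, S_w + reg * np.eye(d))
    order = np.argsort(evals)[::-1]
    return evecs[:, order[: len(classes) - 1]]

def lda_least_squares(X, y, damp=1e-3):
    # Least-squares surrogate: regress a centered class-indicator target on
    # the centered data with the iterative solver LSQR. Each iteration only
    # multiplies by X and X.T, so the cost grows linearly in both n and d.
    classes = np.unique(y)
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    T = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        T[y == c, j] = 1.0
    T -= T.mean(axis=0)
    return np.column_stack(
        [lsqr(Xc, T[:, j], damp=damp)[0] for j in range(T.shape[1])]
    )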