
Learning a kernel matrix for nonlinear dimensionality reduction

We investigate how to learn a kernel matrix for high dimensional data that lies on or near a low dimensional manifold. Noting that the kernel matrix implicitly maps the data into a nonlinear feature space, we show how to discover a mapping that “unfolds” the underlying manifold from which the data was sampled. The kernel matrix is constructed by maximizing the variance in feature space subject to local constraints that preserve the angles and distances between nearest neighbors. The main optimization involves an instance of semidefinite programming—a fundamentally different computation than previous algorithms for manifold learning, such as Isomap and locally linear embedding. The optimized kernels perform better than polynomial and Gaussian kernels for problems in manifold learning, but worse for problems in large margin classification. We explain these results in terms of the geometric properties of different kernels and comment on various interpretations of other manifold...
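The optimization the abstract describes, maximizing trace of the kernel matrix (the variance in feature space) subject to centering and local isometry constraints, is the semidefinite program at the heart of this method. The sketch below is a minimal, hypothetical illustration using cvxpy and scikit-learn (assumed dependencies, not the authors' implementation); for brevity it constrains only distances to each point's nearest neighbors, whereas the paper constrains all pairwise distances within each neighborhood so that angles are preserved as well.

```python
import numpy as np
import cvxpy as cp
from sklearn.neighbors import NearestNeighbors

def mvu_embedding(X, n_neighbors=4, n_components=2):
    """Sketch of kernel learning for manifold unfolding.

    Learns a PSD kernel matrix K by maximizing trace(K) subject to
    centering and local distance-preservation constraints, then embeds
    the data via the top eigenvectors of K (a kernel PCA step).
    Tractable only for small n; real implementations exploit structure.
    """
    n = X.shape[0]

    # Build the k-nearest-neighbor graph; distances to neighbors
    # must be preserved exactly in feature space.
    nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    _, idx = nbrs.kneighbors(X)

    K = cp.Variable((n, n), PSD=True)
    constraints = [cp.sum(K) == 0]  # center the embedding at the origin
    for i in range(n):
        for j in idx[i, 1:]:  # idx[i, 0] is the point itself
            d2 = float(np.sum((X[i] - X[j]) ** 2))
            # Preserve the squared distance between neighbors:
            # K_ii + K_jj - 2 K_ij = ||x_i - x_j||^2
            constraints.append(K[i, i] + K[j, j] - 2 * K[i, j] == d2)

    # Maximize variance in feature space, i.e., trace(K).
    prob = cp.Problem(cp.Maximize(cp.trace(K)), constraints)
    prob.solve(solver=cp.SCS)

    # Eigendecompose the learned kernel and keep the leading components.
    w, V = np.linalg.eigh(K.value)
    order = np.argsort(w)[::-1][:n_components]
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))
```

As a usage sketch, calling mvu_embedding on points sampled from a "Swiss roll" surface should return 2-D coordinates in which the manifold is unrolled, since the program stretches the point set as far as the local distance constraints allow.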
Kilian Q. Weinberger, Fei Sha, Lawrence K. Saul
Type: Conference
Year: 2004
Where: ICML