Continuous nonlinear dimensionality reduction by kernel Eigenmaps

We equate nonlinear dimensionality reduction (NLDR) to graph embedding with side information about the vertices, and derive a solution to either problem in the form of a kernel-based mixture of affine maps from the ambient space to the target space. Unlike most spectral NLDR methods, the central eigenproblem can be made relatively small, and the result is a continuous mapping defined over the entire space, not just the datapoints. The method is demonstrated by visualizing the distribution of word usages (as a proxy for word meanings) in a sample of the machine learning literature.

1 Background: Graph embeddings

Consider a connected graph with weighted undirected edges specified by edge matrix W. Let W_ij be the positive edge weight between connected vertices i and j, zero otherwise. Let D = diag(W1), where 1 is the vector of all ones, be the diagonal matrix whose entries are the cumulative edge weights into each vertex, i.e. the row sums of W. The following points are well known or easily derived in spectral graph theory [Fiedler, 1975; Chung, 1997]:
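The background above sets up the standard spectral-embedding machinery that the paper builds on. As a minimal sketch of that setup (a plain Laplacian-eigenmaps embedding on a toy graph, not Brand's kernel mixture of affine maps), the snippet below assumes a small hand-built edge matrix W, forms D = diag(W1) and the Laplacian L = D - W, and solves the generalized eigenproblem for vertex coordinates:

```python
import numpy as np
from scipy.linalg import eigh

# Symmetric edge matrix W for a small connected graph; W[i, j] > 0 is the
# edge weight between connected vertices i and j, zero otherwise.
# (Toy example, not data from the paper.)
W = np.array([[0.0, 1.0, 0.5, 0.0],
              [1.0, 0.0, 1.0, 0.0],
              [0.5, 1.0, 0.0, 2.0],
              [0.0, 0.0, 2.0, 0.0]])

# D = diag(W1): diagonal matrix of cumulative edge weights into each vertex
# (row sums of W, since W is symmetric).
D = np.diag(W.sum(axis=1))

# Graph Laplacian L = D - W; its spectrum drives spectral embeddings.
L = D - W

# Solve the generalized eigenproblem L v = lambda D v. The smallest
# eigenvalue is 0 with a constant eigenvector; the next eigenvectors
# give low-dimensional coordinates for the vertices.
eigvals, eigvecs = eigh(L, D)
embedding = eigvecs[:, 1:3]  # 2-D embedding, skipping the trivial mode
print(embedding)
```

Note that this eigenproblem is defined only at the graph vertices; the paper's contribution is a kernel-based formulation whose central eigenproblem can be made smaller and whose resulting map extends continuously to the whole ambient space.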
Matthew Brand
Type: Conference
Year: 2003
Where: IJCAI
Authors: Matthew Brand