
We equate nonlinear dimensionality reduction (NLDR) to graph embedding with side information about the vertices, and derive a solution to either problem in the form of a kernel-based mixture of affine maps from the ambient space to the target space. Unlike most spectral NLDR methods, the central eigenproblem can be made relatively small, and the result is a continuous mapping defined over the entire space, not just the datapoints. A demonstration is made of visualizing the distribution of word usages (as a proxy for word meanings) in a sample of the machine learning literature.

1 Background: Graph embeddings

Consider a connected graph with weighted undirected edges specified by edge matrix W. Let W_ij be the positive edge weight between connected vertices i and j, zero otherwise. Let D = diag(W1) be a diagonal matrix where D_ii is the cumulative edge weight into vertex i. The following points are well known or easily derived in spectral graph theory [Fiedler, 1975; Chung, 1997]:
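The definitions above can be sketched numerically. The toy edge matrix below is an assumption for illustration, not data from the paper; it just shows D = diag(W1) and the resulting graph Laplacian D - W that spectral methods operate on.

```python
import numpy as np

# Toy weighted undirected graph on 4 vertices: a symmetric edge matrix W
# with W[i, j] > 0 for connected vertices i, j and 0 otherwise.
W = np.array([
    [0.0, 1.0, 0.5, 0.0],
    [1.0, 0.0, 0.0, 2.0],
    [0.5, 0.0, 0.0, 1.0],
    [0.0, 2.0, 1.0, 0.0],
])

# D = diag(W1): each D[i, i] accumulates the edge weights into vertex i.
D = np.diag(W @ np.ones(len(W)))

# The (unnormalized) graph Laplacian of spectral graph theory.
L = D - W

print(np.diag(D))           # cumulative edge weights per vertex
print(L @ np.ones(len(W)))  # the Laplacian annihilates the constant vector
```

Because each row of L sums to zero, the constant vector is always an eigenvector with eigenvalue 0; for a connected graph it is the only one, which is why embeddings are read off the next-smallest eigenvectors.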

Added | 31 Oct 2010
Updated | 31 Oct 2010
Type | Conference
Year | 2003
Where | IJCAI
Authors | Matthew Brand
