Sciweavers

55 search results - page 1 / 11
Search: On Relevant Dimensions in Kernel Feature Spaces
JMLR 2008
On Relevant Dimensions in Kernel Feature Spaces
We show that the relevant information of a supervised learning problem is contained up to negligible error in a finite number of leading kernel PCA components if the kernel matches ...
Mikio L. Braun, Joachim M. Buhmann, Klaus-Robert M...
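As context for the snippet above, here is a minimal NumPy sketch of kernel PCA that projects data onto its leading components; the RBF kernel, the gamma value, and the helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then a Gaussian kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def leading_kpca_components(X, n_components=10, gamma=1.0):
    """Project the training points onto the leading kernel PCA directions."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix in feature space.
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Eigendecomposition; keep the largest eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    order = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Training-point projections onto the leading components.
    return eigvecs * np.sqrt(np.clip(eigvals, 0, None))

# Example: if the kernel fits the task, most label-relevant structure
# should be captured by the first few components.
X = np.random.randn(200, 5)
Z = leading_kpca_components(X, n_components=5, gamma=0.5)
print(Z.shape)  # (200, 5)
```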
ICML 2007 (IEEE)
Regression on manifolds using kernel dimension reduction
We study the problem of discovering a manifold that best preserves information relevant to a nonlinear regression. Solving this problem involves extending and uniting two threads ...
Jens Nilsson, Fei Sha, Michael I. Jordan
JMLR 2010
Approximate Tree Kernels
Convolution kernels for trees provide simple means for learning with tree-structured data. The computation time of tree kernels is quadratic in the size of the trees, since all pairs ...
Konrad Rieck, Tammo Krueger, Ulf Brefeld, Klaus-Ro...
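To make the quadratic cost mentioned in the snippet concrete, here is a small Python sketch of a convolution tree kernel in the spirit of the Collins-Duffy parse-tree kernel (the exact kernel, not the approximation the paper proposes): a fragment-match count is summed over every pair of nodes from the two trees. The Node class and the example trees are illustrative.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)

def nodes(t: Node):
    # All nodes of a tree in preorder.
    yield t
    for c in t.children:
        yield from nodes(c)

def production(n: Node):
    return (n.label, tuple(c.label for c in n.children))

def delta(a: Node, b: Node) -> int:
    # Number of matching tree fragments rooted at a and b.
    if production(a) != production(b):
        return 0
    if not a.children:
        return 1
    count = 1
    for ca, cb in zip(a.children, b.children):
        count *= 1 + delta(ca, cb)
    return count

def tree_kernel(t1: Node, t2: Node) -> int:
    # The exact kernel compares every pair of nodes from the two trees,
    # which is where the quadratic cost in the tree sizes comes from.
    return sum(delta(a, b) for a in nodes(t1) for b in nodes(t2))

# Tiny example: two trees sharing the production S -> (A, B).
t1 = Node("S", [Node("A"), Node("B")])
t2 = Node("S", [Node("A"), Node("B", [Node("C")])])
print(tree_kernel(t1, t2))  # counts shared fragments
```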
ICIP 2003 (IEEE)
Kernel indexing for relevance feedback image retrieval
Relevance feedback is an attractive approach to developing flexible metrics for content-based retrieval in image and video databases. Large image databases require an index structure ...
Jing Peng, Douglas R. Heisterkamp
ICML 2010 (IEEE)
Projection Penalties: Dimension Reduction without Loss
Dimension reduction is popular for learning predictive models in high-dimensional spaces. It can highlight the relevant part of the feature space and avoid the curse of dimensionality ...
Yi Zhang 0010, Jeff Schneider
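Reading the title, the idea appears to be to penalize the part of the model that falls outside a dimension-reduction subspace rather than to restrict the model to that subspace; the ridge-style Python sketch below follows that reading as an assumption about the formulation, not the paper's exact method. The PCA subspace, lam, and all variable names are chosen purely for illustration.

```python
import numpy as np

def projection_penalty_ridge(X, y, n_components=2, lam=10.0, eps=1e-8):
    """Least squares with a penalty on the part of w outside a PCA subspace.

    Plain dimension reduction would force w into the reduced space; here we
    only penalize its component orthogonal to that space, so information
    outside the subspace is discouraged but not discarded outright.
    """
    # Top principal directions of the centered inputs.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T               # d x k basis of the subspace
    P = V @ V.T                           # projector onto the subspace
    d = X.shape[1]
    # Closed form for: argmin_w ||Xc w - yc||^2 + lam * ||(I - P) w||^2
    A = Xc.T @ Xc + lam * (np.eye(d) - P) + eps * np.eye(d)
    return np.linalg.solve(A, Xc.T @ (y - y.mean()))

# Usage example with synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=100)
w = projection_penalty_ridge(X, y, n_components=3, lam=5.0)
print(w.round(2))
```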