Sciweavers

27 search results (page 4 of 6) for "Kernel Partial Least Squares is Universally Consistent"

ADCM 2008
Learning and approximation by Gaussians on Riemannian manifolds
Learning function relations or understanding structures of data lying in manifolds embedded in high-dimensional Euclidean spaces is an important topic in learning theory. In this ...
Gui-Bo Ye, Ding-Xuan Zhou

ICML 2006, IEEE
Kernel Predictive Linear Gaussian models for nonlinear stochastic dynamical systems
The recent Predictive Linear Gaussian model (or PLG) improves upon traditional linear dynamical system models by using a predictive representation of state, which makes consistent...
David Wingate, Satinder P. Singh

ICML 2003, IEEE
Kernel PLS-SVC for Linear and Nonlinear Classification
A new method for classification is proposed. This is based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space followed by a ...
Roman Rosipal, Leonard J. Trejo, Bryan Matthews
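
As a rough illustration of the pipeline described in this entry (and not the authors' implementation), the sketch below extracts latent scores with a NIPALS-style kernel PLS recursion on a centered Gram matrix and then fits a plain least-squares classifier on those scores. The helper names (rbf_kernel, kernel_pls_scores), the RBF kernel choice, the toy data, and all parameter values are assumptions made for the example.

```python
# A minimal sketch, assuming an RBF kernel and a NIPALS-style kernel PLS
# recursion; helper names and parameter values are illustrative only.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) Gram matrix for the rows of X.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def center_kernel(K):
    # Center the Gram matrix, i.e. implicitly center the data in feature space.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_pls_scores(K, Y, n_components, n_iter=200, tol=1e-10):
    # Extract orthonormal latent score vectors T from a centered Gram matrix K
    # and a response matrix Y, deflating both after each extracted component.
    K, Y = K.copy(), Y.astype(float).copy()
    n = K.shape[0]
    T = np.zeros((n, n_components))
    for a in range(n_components):
        u = Y[:, [0]]
        for _ in range(n_iter):
            t = K @ u
            t /= np.linalg.norm(t)
            u_new = Y @ (Y.T @ t)
            u_new /= np.linalg.norm(u_new)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        T[:, [a]] = t
        P = np.eye(n) - t @ t.T          # projector orthogonal to the new score
        K, Y = P @ K @ P, Y - t @ (t.T @ Y)
    return T

# Toy binary problem: scores from 3 kernel PLS components, then least-squares
# classification on the reduced (orthonormal) representation.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] * X[:, 1] > 0).astype(float)
K = center_kernel(rbf_kernel(X, gamma=0.1))
T = kernel_pls_scores(K, y[:, None] - y.mean(), n_components=3)
w, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
pred = (T @ w + y.mean() > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```

Scoring new test points would additionally require the cross-kernel between test and training data and the dual coefficients of the fitted components; the sketch omits that step for brevity.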

ICML 2007, IEEE
Kernelizing PLS, degrees of freedom, and efficient model selection
Kernelizing partial least squares (PLS), an algorithm that has been particularly popular in chemometrics, leads to kernel PLS, which has several interesting properties, including ...
Mikio L. Braun, Nicole Krämer

ICML 2007, IEEE
Dimensionality reduction and generalization
In this paper we investigate the regularization property of Kernel Principal Component Analysis (KPCA), by studying its application as a preprocessing step to supervised learning ...
Sofia Mosci, Lorenzo Rosasco, Alessandro Verri
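
To make the preprocessing role of KPCA concrete, here is a small sketch (not the paper's experimental code) using scikit-learn: kernel PCA reduces the data to a few nonlinear components, and an almost unregularized ridge regressor is then fit on those components, so the number of retained components acts as the effective regularization parameter. The synthetic data, the RBF kernel choice, and all parameter values are assumptions made for illustration.

```python
# A minimal sketch, assuming scikit-learn: kernel PCA as a preprocessing step,
# followed by (nearly) unregularized ridge regression on the reduced features.
from sklearn.datasets import make_regression          # synthetic data, illustrative only
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = make_regression(n_samples=300, n_features=20, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Retaining only the leading kernel principal components before regression is
# what supplies the regularization; alpha is kept tiny so the ridge step is
# essentially plain least squares on the KPCA scores.
model = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=0.05),
    Ridge(alpha=1e-6),
)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))
```

Varying n_components (e.g. by cross-validation) then plays the role that the regularization parameter plays in kernel ridge regression.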