
Learning Rates of Least-Square Regularized Regression

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert space measured by covering numbers. When the kernel is C^∞ and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is m^{−ζ} with ζ arbitrarily close to 1, regardless of the variance of the bounded probability distribution.

Short Title: Least-square Regularized Regression

Keywords and Phrases: learning theory, reproducing kernel Hilbert space, regularization error, covering number, regularization scheme.

AMS Subject Classification Numbers: 68T05, 62J02.
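For readers unfamiliar with the scheme, the following is a minimal sketch (not taken from the paper) of the regularized least-squares algorithm over an RKHS that the abstract analyzes: given a sample z = {(x_i, y_i)}_{i=1}^m and a Mercer kernel K, minimize the empirical squared error plus a penalty λ‖f‖²_K, which by the representer theorem reduces to solving a linear system in the kernel matrix. The Gaussian kernel, the sample size, and the regularization parameter below are illustrative assumptions, not choices made in the paper.

```python
# Minimal sketch of regularized least-squares regression in an RKHS:
#   f_z = argmin_{f in H_K} (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
# By the representer theorem, f_z(x) = sum_j c_j K(x, x_j) with
#   c = (K + lam * m * I)^{-1} y.
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Example C-infinity kernel; the Gaussian choice here is an assumption for illustration.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def regularized_least_squares(X, y, lam, kernel=gaussian_kernel):
    m = len(y)
    K = kernel(X, X)
    # Solve (K + lam*m*I) c = y for the expansion coefficients.
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    def f_z(X_new):
        # Evaluate the estimator at new points.
        return kernel(X_new, X) @ c
    return f_z

# Usage: fit on m = 100 noisy samples of a smooth regression function.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=100)
f_z = regularized_least_squares(X, y, lam=1e-3)
print(f_z(np.array([[0.5]])))  # roughly sin(pi * 0.5) = 1
```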
Qiang Wu, Yiming Ying, Ding-Xuan Zhou
Added 12 Dec 2010
Updated 12 Dec 2010
Type Journal
Year 2006
Where FOCM
Authors Qiang Wu, Yiming Ying, Ding-Xuan Zhou