ESANN 2003
Approximately unbiased estimation of conditional variance in heteroscedastic kernel ridge regression

In this paper we extend a form of kernel ridge regression for data characterised by a heteroscedastic noise process (introduced in Foxall et al. [1]) in order to provide approximately unbiased estimates of the conditional variance of the target distribution. This is achieved by using the leave-one-out cross-validation estimate of the conditional mean when fitting the model of the conditional variance. The elimination of this bias is demonstrated on a synthetic dataset where the true conditional variance is known. It is well known that minimisation of a sum-of-squares error (SSE) metric corresponds to maximum likelihood estimation of the parameters of a regression model, where the target data are assumed to be realisations of some deterministic process that have been corrupted by additive Gaussian noise with constant variance (i.e. a homoscedastic noise process) (e.g. Bishop [2]). Several kernel learning methods based on the minimisation of a regularised sum-of-squares have been...
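The core idea can be sketched in a few lines: fit a kernel ridge model of the conditional mean, obtain its leave-one-out predictions cheaply via the hat-matrix identity, and fit a second kernel model to the squared LOO residuals rather than the in-sample residuals (which systematically underestimate the noise). This is only an illustrative sketch under assumed choices, not the authors' exact algorithm: the RBF kernel, its width, the regularisation constants, and the squared-residual variance target are all assumptions made here for demonstration.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel on squared Euclidean distances (illustrative choice)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def krr_loo(K, y, lam):
    """Kernel ridge regression: return in-sample fitted values and
    leave-one-out predictions via the hat-matrix identity
    y_i - yhat_loo_i = (y_i - yhat_i) / (1 - H_ii)."""
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))  # hat (smoother) matrix
    y_hat = H @ y
    y_loo = y - (y - y_hat) / (1.0 - np.diag(H))
    return y_hat, y_loo

# Synthetic heteroscedastic data: smooth mean, input-dependent noise level
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(200, 1)), axis=0)
true_std = 0.1 + 0.2 * (1.0 + np.sin(X[:, 0]))      # true conditional std
y = np.sinc(X[:, 0]) + true_std * rng.normal(size=200)

K = rbf_kernel(X, X, gamma=1.0)
mu_hat, mu_loo = krr_loo(K, y, lam=0.1)

# Variance targets: in-sample residuals are biased low because the model
# has already "seen" each point; LOO residuals remove that optimism.
z_naive = (y - mu_hat) ** 2
z_loo = (y - mu_loo) ** 2

# Second kernel ridge model fitted to the (approximately unbiased) LOO target
var_hat, _ = krr_loo(K, z_loo, lam=1.0)

print(z_naive.mean() <= z_loo.mean())  # prints True: LOO residuals are larger
```

Because the diagonal entries of the hat matrix lie in [0, 1) for a PSD kernel with positive regularisation, each LOO residual is at least as large in magnitude as the corresponding in-sample residual, which is exactly the downward bias the paper's construction avoids. A log-variance parameterisation (ensuring positivity of the variance estimate) would be a natural refinement not shown here.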
Added: 31 Oct 2010
Updated: 31 Oct 2010
Type: Conference
Year: 2003
Where: ESANN
Authors: Gavin C. Cawley, Nicola L. C. Talbot, Robert J. Foxall, Stephen R. Dorling, Danilo P. Mandic