IJCNN 2006, IEEE

Greedy forward selection algorithms to Sparse Gaussian Process Regression

Abstract— This paper considers the basis vector selection issue involved in forward selection algorithms for sparse Gaussian Process Regression (GPR). First, we re-examine a previous basis vector selection criterion proposed by Smola and Bartlett [20], referred to as loss-smola, and give new formulae that implement this criterion under the full-greedy strategy more efficiently, in O(n^2 k_max) time instead of the original O(n^2 k_max^2), where n is the number of training examples and k_max ≪ n is the maximum allowed number of selected basis vectors. Second, in order to make the algorithm scale linearly in n, which is quite preferable for large datasets, we present an approximate version, loss-sun, of the loss-smola criterion. We compare the full-greedy algorithms induced by the loss-sun and loss-smola criteria, respectively, on several medium-scale datasets. In contrast to loss-smola, the advantage of the loss-sun criterion is that it leads to an algorithm which scales as O(nk^2 ...
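The abstract describes greedy forward selection of basis vectors for sparse GPR. A minimal, naive sketch of that idea is below; it uses a subset-of-regressors training loss and refits from scratch for every candidate, whereas the paper's loss-smola / loss-sun criteria score candidates with cheap incremental updates to reach O(n^2 k_max) or O(n k_max^2) cost. The kernel, noise level, and loss form here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # RBF (squared-exponential) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_forward_selection(X, y, k_max, noise=0.1, gamma=1.0):
    """Toy greedy forward selection of basis vectors for sparse GPR.

    At each step, add the candidate whose inclusion minimizes the
    regularized subset-of-regressors loss
        ||y - K_nS w||^2 + noise * w^T K_SS w.
    This naive version refits every candidate (expensive); the paper's
    criteria achieve the same greedy choice via rank-one updates.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    selected, remaining = [], list(range(n))
    best_w = None
    for _ in range(k_max):
        best_i, best_loss = None, np.inf
        for i in remaining:
            S = selected + [i]
            K_nS = K[:, S]
            K_SS = K[np.ix_(S, S)]
            # Regularized least squares for the candidate basis set.
            A = K_nS.T @ K_nS + noise * K_SS + 1e-10 * np.eye(len(S))
            w = np.linalg.solve(A, K_nS.T @ y)
            r = y - K_nS @ w
            loss = r @ r + noise * w @ K_SS @ w
            if loss < best_loss:
                best_i, best_loss, best_w = i, loss, w
        selected.append(best_i)
        remaining.remove(best_i)
    return selected, best_w
```

Usage: `S, w = greedy_forward_selection(X, y, k_max=5)` returns the selected basis-vector indices and weights; predictions at the training inputs are `rbf_kernel(X, X[S]) @ w`.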
Ping Sun, Xin Yao
Added 11 Jun 2010
Updated 11 Jun 2010
Type Conference
Year 2006
Where IJCNN
Authors Ping Sun, Xin Yao