Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion

In order to obtain good learning results in supervised learning, it is important to choose model parameters appropriately. Model selection is usually carried out by preparing a finite set of model candidates, estimating a generalization error for each candidate, and choosing the best one. Increasing the number of candidates in this procedure may improve the optimization quality, but it also increases the computational cost. In this paper, we focus on a generalization error estimator called the regularized subspace information criterion and derive an analytic form of the optimal model parameter over a set of infinitely many model candidates. This allows us to maximize the optimization quality while keeping the computational cost moderate.

Keywords: supervised learning, generalization error, model selection, regularized subspace information criterion
Shun Gokita, Masashi Sugiyama, Keisuke Sakurai
Type: Journal
Year: 2007
Where: IEICET
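To make the finite-candidate model selection procedure described in the abstract concrete, here is a minimal sketch for ridge regression. It prepares a grid of ridge-parameter candidates, estimates a generalization error for each, and picks the best. Note the hedges: closed-form leave-one-out error is used here purely as a stand-in estimator, not the regularized subspace information criterion from the paper, and the function names (ridge_fit, loo_error), the candidate grid, and the synthetic data are illustrative assumptions rather than anything taken from the paper.

```python
# Sketch of finite-candidate model selection for ridge regression.
# NOTE: leave-one-out error is a stand-in generalization-error estimator;
# the paper's regularized subspace information criterion and its analytic
# optimum over infinitely many candidates are NOT implemented here.
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge-regression weights: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def loo_error(X, y, lam):
    """Closed-form leave-one-out squared error for ridge regression."""
    d = X.shape[1]
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)  # hat matrix
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

# Synthetic data, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ rng.normal(size=5) + 0.3 * rng.normal(size=50)

# Finite set of ridge-parameter candidates: a denser grid can improve the
# selection quality, but the cost grows with the number of candidates.
candidates = np.logspace(-4, 2, 30)
errors = [loo_error(X, y, lam) for lam in candidates]
best = candidates[int(np.argmin(errors))]
w = ridge_fit(X, y, best)
print(f"selected lambda = {best:.4g}")
```

The grid search above is exactly the trade-off the paper addresses: its contribution is an analytic expression for the optimal parameter under the regularized subspace information criterion, which removes the need to enumerate candidates at all.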