IJCNN 2006, IEEE

Dynamic Hyperparameter Scaling Method for LVQ Algorithms

Abstract: We propose a new annealing method for the hyperparameters of several recent Learning Vector Quantization (LVQ) algorithms. We first analyze the relationship between the values assigned to the hyperparameters, the online learning process, and the structure of the resulting classifier. Motivated by these results, we then suggest an annealing method in which each hyperparameter is initially set to a large value and is then slowly decreased during learning. We apply the annealing method to the LVQ 2.1, SLVQ-LR, and RSLVQ methods, and we compare the generalization performance achieved with the new annealing method against that of standard hyperparameter selection using 10-fold cross-validation. Benchmark results are provided for the letter and pendigits datasets from the UCI Machine Learning Repository. The new selection method provides equally good or, for some data sets, even superior results compared to standard selection methods. More importantly, however, the number of learning trials for di...
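The schedule described above (start each hyperparameter at a large value, then slowly decrease it during learning) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the exponential decay form, the function name `annealed_value`, and the use of an RSLVQ-style softness parameter sigma are assumptions made here for the example.

```python
def annealed_value(start, end, step, total_steps):
    """Exponentially interpolate a hyperparameter from `start` down to `end`.

    At step 0 the value equals `start`; by `total_steps` it has decayed to
    `end`. The exponential form is an assumed schedule, chosen only to
    illustrate the "large initial value, slow decrease" idea.
    """
    frac = min(step / total_steps, 1.0)
    return start * (end / start) ** frac

# Toy usage: anneal a (hypothetical) RSLVQ softness sigma over 1000
# online-learning updates, from a large initial value to a small final one.
sigma_start, sigma_end, T = 10.0, 0.1, 1000
schedule = [annealed_value(sigma_start, sigma_end, t, T) for t in range(T + 1)]
```

In an online learning loop, each prototype update at step `t` would simply use `schedule[t]` in place of a fixed, cross-validated hyperparameter value.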
Sambu Seo, Klaus Obermayer