ICANN 2005, Springer

The LCCP for Optimizing Kernel Parameters for SVM

Abstract. Tuning hyper-parameters is a necessary step for improving the performance of learning algorithms. For Support Vector Machine classifiers, adjusting the kernel parameters can drastically increase recognition accuracy. Typically, cross-validation is performed by exhaustively sweeping the parameter space; the complexity of such a grid search is exponential in the number of optimized parameters. Recently, a gradient descent approach was introduced in [1] that drastically reduces the number of steps in the search for optimal parameters. In this paper, we define the LCCP (Log Convex Concave Procedure), an optimization scheme derived from the CCCP (Convex ConCave Procedure), for optimizing kernel parameters by minimizing the radius-margin bound. To apply the LCCP, we prove, for a particular choice of kernel, that the radius is log convex and the margin is log concave. The LCCP is more efficient than the gradient descent technique since it ensures that the radius-margin bound decreases monotonically ...
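The LCCP specializes the generic CCCP iteration, which minimizes the sum of a convex and a concave function by repeatedly linearizing the concave part and minimizing the resulting convex surrogate; since the surrogate majorizes the objective and touches it at the current point, the objective can only decrease, which is the monotone-decrease property the abstract refers to. The following is a minimal sketch of that generic iteration on a toy one-dimensional difference-of-convex function; the function, starting point, and iteration count are illustrative assumptions, not the paper's radius-margin objective.

```python
import numpy as np

# Generic CCCP iteration on a toy 1-D objective (illustrative only, not the
# paper's radius-margin bound): f(x) = u(x) + v(x) with
#   u(x) = x**4  (convex)   and   v(x) = -x**2  (concave).
# At each step the concave part v is linearized at the current iterate x,
# and the convex surrogate u(y) + v'(x)*y = y**4 - 2*x*y is minimized in
# closed form (4*y**3 - 2*x = 0  =>  y = cbrt(x / 2)).
# Because the surrogate majorizes f and equals it at x, f decreases monotonically.

def f(x):
    return x**4 - x**2

def cccp(x0, n_iter=15):
    x = x0
    for t in range(n_iter):
        x = np.cbrt(x / 2.0)  # exact minimizer of the convex surrogate
        print(f"iter {t:2d}  x = {x:+.6f}  f(x) = {f(x):+.6f}")
    return x

# Starting from x0 = 2.0, the iterates converge to 1/sqrt(2) and f(x)
# decreases monotonically toward its minimum value of -1/4.
cccp(x0=2.0)
```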
Added 27 Jun 2010
Updated 16 Aug 2010
Type Conference
Year 2005
Where ICANN
Authors Sabri Boughorbel, Jean-Philippe Tarel, Nozha Boujemaa
URL http://perso.lcpc.fr/tarel.jean-philippe/publis/icann05a.html