Maximum Likelihood Model Selection for 1-Norm Soft Margin SVMs with Multiple Parameters

Adapting the hyperparameters of support vector machines (SVMs) is a challenging model selection problem, especially when flexible kernels are to be adapted and data are scarce. We present a coherent framework for regularized model selection of 1-norm soft margin SVMs for binary classification. We propose to use gradient ascent on a likelihood function of the hyperparameters. The likelihood function is based on logistic regression for robustly estimating the class-conditional probabilities and can be computed efficiently. Overfitting is an important issue in SVM model selection and can be addressed in our framework by incorporating suitable prior distributions over the hyperparameters. We show empirically that gradient-based optimization of the likelihood function is able to adapt multiple kernel parameters and leads to better models than four competing state-of-the-art methods.
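The likelihood criterion described in the abstract can be illustrated with a small sketch: fit a logistic (Platt-style) sigmoid to the SVM decision values on held-out data and use the resulting log-likelihood as the model-selection score. This is only a minimal, self-contained approximation of the idea; the paper itself differentiates such a likelihood with respect to the hyperparameters and ascends its gradient, which is not shown here. All function names and constants below are hypothetical, not from the paper.

```python
import math

def platt_log_likelihood(f, y, steps=5000, lr=0.1):
    """Fit p(y=1|f) = sigmoid(a*f + b) to SVM decision values f and
    labels y in {0, 1} by gradient descent on the cross-entropy, and
    return the fitted log-likelihood. A larger (closer to zero) value
    indicates hyperparameters whose decision values separate the
    classes more reliably. (Illustrative sketch only; step count and
    learning rate are arbitrary choices, not values from the paper.)"""
    a, b = 1.0, 0.0
    n = len(f)
    for _ in range(steps):
        p = [1.0 / (1.0 + math.exp(-(a * fi + b))) for fi in f]
        # Gradient of the mean cross-entropy w.r.t. a and b.
        ga = sum((pi - yi) * fi for pi, yi, fi in zip(p, y, f)) / n
        gb = sum(pi - yi for pi, yi in zip(p, y)) / n
        a -= lr * ga
        b -= lr * gb
    # Log-likelihood under the fitted sigmoid, clipped for stability.
    ll = 0.0
    for fi, yi in zip(f, y):
        pi = 1.0 / (1.0 + math.exp(-(a * fi + b)))
        pi = min(max(pi, 1e-12), 1.0 - 1e-12)
        ll += yi * math.log(pi) + (1.0 - yi) * math.log(1.0 - pi)
    return ll

# Decision values well aligned with the labels score higher than
# decision values that carry no information about the labels.
ll_good = platt_log_likelihood([-2.0, -1.0, 1.0, 2.0], [0, 0, 1, 1])
ll_bad = platt_log_likelihood([1.0, -1.0, 1.0, -1.0], [0, 0, 1, 1])
```

In the paper's framework, a score of this kind (plus a prior over the hyperparameters to curb overfitting) is maximized over kernel parameters and the regularization constant, using gradients rather than the black-box evaluation sketched here.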
Tobias Glasmachers, Christian Igel
Added 29 Jan 2011
Updated 29 Jan 2011
Type Journal
Year 2010
Where PAMI