Margin and Radius Based Multiple Kernel Learning

A serious drawback of kernel methods, and Support Vector Machines (SVM) in particular, is the difficulty in choosing a suitable kernel function for a given dataset. One of the approaches proposed to address this problem is Multiple Kernel Learning (MKL), in which several kernels are combined adaptively for a given dataset. Many of the existing MKL methods use the SVM objective function and try to find a linear combination of basic kernels such that the separating margin between the classes is maximized. However, these methods ignore the fact that the theoretical error bound depends not only on the margin, but also on the radius of the smallest sphere that contains all the training instances. We present a novel MKL algorithm that optimizes the error bound taking into account both the margin and the radius. The empirical results show that the proposed method compares favorably with other state-of-the-art MKL methods.

Key words: Learning Kernel Combination, Support Vector Machines, convex o...
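
For orientation, the radius-margin bound the abstract refers to, and a margin-and-radius MKL objective, can be sketched in LaTeX as follows. The notation (ell, R, gamma, w, mu_k, K_k, C) is standard SVM/MKL notation assumed here; the soft-margin objective below is an illustrative sketch of this family of formulations, not necessarily the paper's exact one.

% Vapnik's radius--margin bound: for a hard-margin SVM trained on \ell points,
% with margin \gamma = 1/\lVert w \rVert and R the radius of the smallest
% sphere enclosing the training points in feature space,
\[
  \mathrm{err}_{\mathrm{LOO}} \;\le\;
  \frac{1}{\ell}\,\frac{R^{2}}{\gamma^{2}}
  \;=\; \frac{1}{\ell}\, R^{2}\,\lVert w \rVert^{2}.
\]
% Margin-only MKL minimizes \lVert w \rVert^{2} over kernel weights \mu;
% a margin-and-radius formulation keeps the kernel-dependent R^{2}(\mu)
% in the objective (illustrative soft-margin sketch):
\[
  \min_{\mu \ge 0,\; \sum_k \mu_k = 1}\;
  \min_{w,\, b,\, \xi \ge 0}\;
  \tfrac{1}{2}\, R^{2}(\mu)\,\lVert w \rVert^{2}
  \;+\; C \sum_{i=1}^{\ell} \xi_i
  \quad \text{s.t.}\quad
  y_i\bigl(\langle w, \phi_\mu(x_i)\rangle + b\bigr) \ge 1 - \xi_i,
\]
% where K(\mu) = \sum_k \mu_k K_k is the combined kernel, \phi_\mu its
% feature map, and R(\mu) the radius of the smallest sphere containing
% the mapped training instances under K(\mu).
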
Type: Conference
Year: 2009
Where: PKDD
Publisher: Springer
Authors: Huyen Do, Alexandros Kalousis, Adam Woznica, Melanie Hilario