
GECCO 2007, Springer

Controlling overfitting with multi-objective support vector machines

Recently, evolutionary computation has been successfully integrated into statistical learning methods. A Support Vector Machine (SVM) that uses evolution strategies for its optimization problem frequently delivers better results with respect to both the optimization criterion and the prediction accuracy. Moreover, evolutionary computation allows for efficient large-margin optimization over a large family of new kernel functions, namely non-positive semidefinite kernels such as the Epanechnikov kernel. For these kernel functions, evolutionary SVMs even outperform other learning methods like the Relevance Vector Machine. In this paper, we discuss another major advantage of evolutionary SVMs over traditional SVM solvers: we can explicitly optimize the inherent trade-off between training error and model complexity by embedding multi-objective optimization into the evolutionary SVM. This leads to three advantages: first, it is no longer necessary to tune the SVM parameter C, which weighs both...
Ingo Mierswa
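
To make the trade-off described in the abstract concrete, below is a minimal, hypothetical sketch (not the author's implementation and not tied to any library beyond NumPy) of the two objectives a multi-objective evolutionary SVM could evaluate per candidate dual solution, together with the non-positive-semidefinite Epanechnikov kernel mentioned above. The function names epanechnikov_kernel, svm_objectives, and dominates are illustrative assumptions, as is the random-candidate evaluation standing in for a real evolution strategy.

```python
# Hypothetical sketch: the two objectives a multi-objective evolutionary SVM
# could trade off, evaluated in the dual for candidate alpha vectors.
# All names are illustrative; this is not the paper's implementation.
import numpy as np

def epanechnikov_kernel(X, Z, sigma=1.0):
    """Epanechnikov-style kernel k(x, z) = max(0, 1 - ||x - z||^2 / sigma^2).
    It is not positive semidefinite, so standard QP solvers may fail on it,
    while an evolutionary search only needs the objective values."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.maximum(0.0, 1.0 - d2 / sigma ** 2)

def svm_objectives(alpha, K, y):
    """Return (training_error, complexity) for a candidate dual solution.
    training_error: total hinge loss of the induced decision function.
    complexity:     the regularizer ||w||^2 = (alpha*y)^T K (alpha*y).
    A single-objective SVM combines these two terms with the weight C;
    a multi-objective formulation keeps them separate."""
    f = K @ (alpha * y)                        # decision values on training data
    training_error = np.maximum(0.0, 1.0 - y * f).sum()
    complexity = (alpha * y) @ K @ (alpha * y)
    return training_error, complexity

def dominates(a, b):
    """Pareto dominance: a is no worse in both objectives and better in one."""
    return all(x <= z for x, z in zip(a, b)) and any(x < z for x, z in zip(a, b))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.sign(X[:, 0] + 0.3 * rng.normal(size=40))
    K = epanechnikov_kernel(X, X, sigma=2.0)

    # Evaluate a few random candidates; an evolution strategy would mutate and
    # select among such candidates, keeping the non-dominated (Pareto) front.
    candidates = [rng.uniform(0.0, 1.0, size=40) for _ in range(5)]
    scores = [svm_objectives(a, K, y) for a in candidates]
    front = [s for s in scores
             if not any(dominates(t, s) for t in scores if t is not s)]
    print("objective pairs:", [tuple(round(v, 2) for v in s) for s in scores])
    print("Pareto front:   ", [tuple(round(v, 2) for v in s) for s in front])
```

The point of keeping the two objective values separate is that the Pareto front itself replaces the manual choice of C: each point on the front corresponds to one particular weighting of training error against model complexity.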
Added: 16 Aug 2010
Updated: 16 Aug 2010
Type: Conference
Year: 2007
Where: GECCO
Authors: Ingo Mierswa