Robustness, Risk, and Regularization in Support Vector Machines

We consider two new formulations for classification problems in the spirit of support vector machines, based on robust optimization. Our formulations are designed to build in protection against noise and to control overfitting without being overly conservative. Our first formulation allows the noise across different samples to be correlated. We show that the standard norm-regularized support vector machine classifier is a solution to a special case of our first formulation, thus providing an explicit link between regularization and robustness in pattern classification. Our second formulation is based on a softer version of robust optimization called comprehensive robustness. We show that this formulation is equivalent to regularization by an arbitrary convex regularizer, thus extending our first equivalence result. Moreover, we explain how the connection of comprehensive robustness to convex risk measures can be used to design risk-measure-constrained classifiers with robustness to the ...
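As a rough illustration of the regularization-robustness link described in the abstract, the following sketch shows the general shape of such an equivalence; the specific uncertainty set \(\mathcal{N}\) below is an assumption chosen for illustration and need not match the paper's exact construction. Suppose the sample-wise perturbations \((\delta_1,\dots,\delta_n)\) share a coupled (hence correlated) budget,
\[
\mathcal{N} = \Bigl\{(\delta_1,\dots,\delta_n) : \sum_{i=1}^{n} \|\delta_i\| \le c\Bigr\}.
\]
Then the robust hinge-loss problem
\[
\min_{w,b}\ \max_{(\delta_1,\dots,\delta_n)\in\mathcal{N}}\ \sum_{i=1}^{n} \max\bigl(1 - y_i(\langle w,\, x_i - \delta_i\rangle + b),\, 0\bigr)
\]
has the same form of solution as the norm-regularized SVM
\[
\min_{w,b}\ c\,\|w\|_{*} + \sum_{i=1}^{n} \max\bigl(1 - y_i(\langle w, x_i\rangle + b),\, 0\bigr),
\]
where \(\|\cdot\|_{*}\) is the dual norm of \(\|\cdot\|\) (since \(\max_{\|\delta\|\le c}\langle w,\delta\rangle = c\,\|w\|_{*}\)). Taking \(\|\cdot\|\) to be the Euclidean norm recovers the standard \(\ell_2\)-regularized SVM as a special case.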
Huan Xu, Shie Mannor, Constantine Caramanis
Type: Journal
Year: 2008
Where: CoRR
Authors: Huan Xu, Shie Mannor, Constantine Caramanis