AAAI 2006

Robust Support Vector Machine Training via Convex Outlier Ablation

One of the well-known risks of large margin training methods, such as boosting and support vector machines (SVMs), is their sensitivity to outliers. This sensitivity is normally mitigated with a soft margin criterion, such as the hinge loss. In this paper, we present a more direct approach that explicitly incorporates outlier suppression in the training process. In particular, we show how outlier detection can be encoded in the large margin training principle of support vector machines. By expressing a convex relaxation of the joint training problem as a semidefinite program, one can use this approach to robustly train a support vector machine while suppressing outliers. We demonstrate that our approach can yield superior results to the standard soft margin approach in the presence of outliers.
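To make the idea concrete, the sketch below jointly trains a linear SVM while ablating suspected outliers. Note this is *not* the paper's convex semidefinite relaxation: it is a simple alternating heuristic, assumed here only to illustrate the underlying principle of pairing large-margin training with explicit outlier suppression. Each example gets an indicator η_i ∈ {0, 1}; training alternates between subgradient steps on the η-weighted hinge-loss objective and resetting η to ablate the examples with the largest hinge loss. All names (`robust_svm_train`, `outlier_frac`) are hypothetical.

```python
import numpy as np

def robust_svm_train(X, y, C=1.0, outlier_frac=0.1, lr=0.01, epochs=200):
    """Toy robust linear SVM via alternating outlier ablation.

    Alternates between:
      (a) setting per-example indicators eta (1 = keep, 0 = ablate)
          to drop the examples with the largest hinge loss, and
      (b) a subgradient step on 0.5*||w||^2 + C * sum_i eta_i * hinge_i.

    Illustrative heuristic only -- NOT the paper's SDP relaxation.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    eta = np.ones(n)                    # outlier indicators: 1 keep, 0 ablate
    k = int(outlier_frac * n)           # number of examples to ablate
    for _ in range(epochs):
        margins = y * (X @ w + b)
        hinge = np.maximum(0.0, 1.0 - margins)
        if k > 0:
            eta = np.ones(n)
            eta[np.argsort(hinge)[-k:]] = 0.0   # ablate the worst offenders
        # subgradient over kept, margin-violating examples
        active = (hinge > 0.0) & (eta > 0.0)
        grad_w = w - C * (y[active, None] * X[active]).sum(axis=0)
        grad_b = -C * y[active].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b, eta
```

On well-separated synthetic data with a single mislabeled point, the heuristic learns a separator for the clean examples and sets the mislabeled point's indicator to zero, whereas a plain soft-margin objective would let that point drag the decision boundary.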
Linli Xu, Koby Crammer, Dale Schuurmans
Added 30 Oct 2010
Updated 30 Oct 2010
Type Conference
Year 2006
Where AAAI
Authors Linli Xu, Koby Crammer, Dale Schuurmans