On the generalization of soft margin algorithms

Generalization bounds depending on the margin of a classifier are a relatively recent development. They provide an explanation of the performance of state-of-the-art learning systems such as support vector machines (SVMs) [1] and AdaBoost [2]. The difficulty with these bounds has been either their lack of robustness or their looseness. The question of whether the generalization of a classifier can be more tightly bounded in terms of a robust measure of the distribution of margin values has remained open for some time. The paper answers this open question in the affirmative and, furthermore, the analysis leads to bounds that motivate the previously heuristic soft margin SVM algorithms and justify the use of the quadratic loss in neural network training algorithms. The results are extended to give bounds on the probability of failing to achieve a target accuracy in regression prediction, with a statistical analysis of ridge regression and Gaussian processes as a special case. ...
John Shawe-Taylor, Nello Cristianini
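
As a point of reference for the claim that the bounds motivate soft margin SVMs, here is a minimal sketch of the standard soft margin primal (the usual textbook formulation, not quoted from the paper; the trade-off parameter C, the slack variables \xi_i, and the training pairs (x_i, y_i), i = 1, ..., m, are assumed notation):

\[
\min_{w,\, b,\, \xi} \;\; \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{m} \xi_i
\qquad \text{subject to} \qquad
y_i\bigl(\langle w, x_i \rangle + b\bigr) \ge 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1, \dots, m.
\]

Each slack \xi_i records how far example i falls short of the target margin, so the objective penalizes the whole margin distribution rather than only the minimum margin; a generalization bound stated in terms of such a robust margin measure is what would turn this previously heuristic penalty into a principled one.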
Added 23 Dec 2010
Updated 23 Dec 2010
Type Journal
Year 2002
Where TIT
Authors John Shawe-Taylor, Nello Cristianini