Sciweavers

565 search results for "A Generalized Quadratic Loss for Support Vector Machines"
ECAI 2004 (Springer)
A Generalized Quadratic Loss for Support Vector Machines
The standard SVM formulation for binary classification is based on the hinge loss function, where errors are assumed to be uncorrelated. As a result, local information in the featu...
Filippo Portera, Alessandro Sperduti
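The contrast the abstract above draws can be sketched numerically: the standard hinge loss penalizes each margin violation linearly and independently, while a quadratic variant squares it. This is a minimal illustration using the squared hinge loss; the paper's generalized quadratic loss additionally correlates the errors of nearby patterns, which is not modeled here.

```python
import numpy as np

def hinge_loss(y, f):
    """Standard SVM hinge loss: each error is penalized independently."""
    return np.maximum(0.0, 1.0 - y * f)

def squared_hinge_loss(y, f):
    """Simplest quadratic variant of the hinge loss (illustrative only;
    a generalized quadratic loss may also couple errors across patterns)."""
    return np.maximum(0.0, 1.0 - y * f) ** 2

y = np.array([1, 1, -1, -1])         # true labels
f = np.array([0.5, 2.0, -0.3, 1.0])  # decision-function values
print(hinge_loss(y, f))          # [0.5 0.  0.7 2. ]
print(squared_hinge_loss(y, f))  # [0.25 0.   0.49 4.  ]
```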
INFORMATICALT 2011
A Quadratic Loss Multi-Class SVM for which a Radius-Margin Bound Applies
To set the values of the hyperparameters of a support vector machine (SVM), the method of choice is cross-validation. Several upper bounds on the leave-one-out error of the pattern...
Yann Guermeur, Emmanuel Monfrini
ALT 2000 (Springer)
On the Noise Model of Support Vector Machines Regression
Support Vector Machines Regression (SVMR) is a learning technique where the goodness of fit is measured not by the usual quadratic loss function (the mean square error),...
Massimiliano Pontil, Sayan Mukherjee, Federico Gir...
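The loss the abstract contrasts with the mean square error is, in standard SVMR, the epsilon-insensitive loss: residuals inside an eps-wide tube cost nothing, and larger residuals grow linearly. A minimal sketch of the two losses (the `eps` value is an arbitrary choice for illustration):

```python
import numpy as np

def squared_loss(y, f):
    """Usual quadratic loss (mean square error, per sample)."""
    return (y - f) ** 2

def eps_insensitive_loss(y, f, eps=0.1):
    """Standard SVR loss: residuals within the eps-tube are free,
    larger residuals are penalized linearly."""
    return np.maximum(0.0, np.abs(y - f) - eps)

y = np.array([1.00, 1.00, 1.00])   # targets
f = np.array([0.95, 1.30, 0.50])   # predictions
print(squared_loss(y, f))
print(eps_insensitive_loss(y, f))  # [0.  0.2 0.4]
```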
NIPS 2008
Support Vector Machines with a Reject Option
We consider the problem of binary classification where the classifier may abstain instead of classifying each observation. The Bayes decision rule for this setup, known as Chow's...
Yves Grandvalet, Alain Rakotomamonjy, Joseph Keshe...
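The Bayes rule with a reject option that the abstract refers to (Chow's rule) is easy to state: with cost 0 for a correct label, 1 for an error, and d for rejecting (0 <= d < 1/2), one rejects whenever the larger class posterior falls below 1 - d. A minimal sketch, with an arbitrary illustrative rejection cost:

```python
def chow_decision(posterior_pos, cost_reject=0.2):
    """Chow's rule for binary classification with a reject option.

    Expected cost of classifying is 1 - max(p, 1-p); rejecting costs
    cost_reject, so reject exactly when max(p, 1-p) < 1 - cost_reject.
    """
    p_max = max(posterior_pos, 1.0 - posterior_pos)
    if p_max < 1.0 - cost_reject:
        return "reject"
    return "+1" if posterior_pos >= 0.5 else "-1"

print(chow_decision(0.95))  # "+1"   (confident positive)
print(chow_decision(0.55))  # "reject" (0.55 < 0.8)
print(chow_decision(0.02))  # "-1"   (confident negative)
```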
ALT 2004 (Springer)
Convergence of a Generalized Gradient Selection Approach for the Decomposition Method
The decomposition method is currently one of the major methods for solving the convex quadratic optimization problems associated with support vector machines. For a special c...
Nikolas List
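The working-set selection step at the heart of such decomposition methods can be sketched as the classical maximal-violating-pair rule (as used in LIBSVM-style solvers): given the gradient of the dual objective, pick the most violating index from the "up" set and the least from the "down" set. This is a minimal sketch of that single step, not the paper's generalized selection framework or a full solver.

```python
import numpy as np

def select_working_pair(alpha, grad, y, C, tol=1e-6):
    """Maximal-violating-pair working-set selection for the SVM dual
    min 0.5*a'Qa - e'a  s.t.  y'a = 0, 0 <= a <= C.

    grad is the gradient Q*alpha - e of the dual objective.
    Returns (i, j) to optimize next, or None if the KKT conditions
    hold to tolerance (i.e., the current alpha is optimal).
    """
    up = ((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0))
    low = ((y == -1) & (alpha < C)) | ((y == 1) & (alpha > 0))
    v = -y * grad                     # violation scores
    i = np.where(up)[0][np.argmax(v[up])]
    j = np.where(low)[0][np.argmin(v[low])]
    if v[i] <= v[j] + tol:            # no violating pair left
        return None
    return int(i), int(j)

# At alpha = 0 the gradient is -e, so the rule pairs the first
# positive and first negative example:
y = np.array([1, -1, 1, -1])
print(select_working_pair(np.zeros(4), -np.ones(4), y, C=1.0))  # (0, 1)
```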