ICML
2007

Pegasos: Primal Estimated sub-GrAdient SOlver for SVM

We describe and analyze a simple and effective iterative algorithm for solving the optimization problem cast by Support Vector Machines (SVM). Our method alternates between stochastic gradient descent steps and projection steps. We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε). In contrast, previous analyses of stochastic gradient descent methods require Ω(1/ε²) iterations. As in previously devised SVM solvers, the number of iterations also scales linearly with 1/λ, where λ is the regularization parameter of SVM. For a linear kernel, the total run-time of our method is Õ(d/(λε)), where d is a bound on the number of non-zero features in each example. Since the run-time does not depend directly on the size of the training set, the resulting algorithm is especially suited for learning from large datasets. Our approach can seamlessly be adapted to employ non-linear kernels while working solely on the primal objective function. We demonstrate the effic...
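The iteration the abstract describes can be sketched in a few lines of NumPy: at step t, pick one example at random, take a sub-gradient step with step size 1/(λt) on the regularized hinge loss, then project w back onto the ball of radius 1/√λ. The function name `pegasos` and all parameter defaults below are illustrative, not from the paper.

```python
import numpy as np

def pegasos(X, y, lam=0.1, T=1000, seed=0):
    """Stochastic sub-gradient descent on the primal SVM objective,
    with a projection onto the ball of radius 1/sqrt(lam).
    X: (n, d) feature matrix; y: (n,) labels in {-1, +1}.
    A minimal sketch of the scheme the abstract describes."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)            # draw one training example uniformly
        eta = 1.0 / (lam * t)          # step size 1/(lambda * t)
        margin = y[i] * (w @ X[i])     # margin at the current iterate
        w *= (1.0 - eta * lam)         # gradient step on the regularizer
        if margin < 1:                 # hinge loss active: sub-gradient step
            w += eta * y[i] * X[i]
        # projection step: keep w inside the ball of radius 1/sqrt(lambda)
        radius = 1.0 / np.sqrt(lam)
        norm = np.linalg.norm(w)
        if norm > radius:
            w *= radius / norm
    return w
```

Because each iteration touches a single example, the cost per step is O(d) and the total run-time is independent of the training-set size, which is the property the abstract highlights for large datasets.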
Shai Shalev-Shwartz, Yoram Singer, Nathan Srebro
Added 17 Nov 2009
Updated 17 Nov 2009
Type Conference
Year 2007
Where ICML