This paper considers the least-squares online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without explicit regularization. We present a novel capacity i...
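As a concrete illustration of the setting this abstract describes, here is a minimal sketch of unregularized online gradient descent for least squares in an RKHS. The Gaussian kernel and the eta_t = eta0/sqrt(t) schedule are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel; the kernel choice is illustrative only."""
    return np.exp(-np.sum((np.asarray(x) - np.asarray(z)) ** 2) / (2 * sigma ** 2))

def kernel_online_least_squares(stream, eta0=0.5, sigma=1.0):
    """Unregularized online gradient descent for least squares in an RKHS.

    The hypothesis is f_t(x) = sum_i alpha_i * K(x_i, x). The RKHS gradient
    of the squared loss (f(x_t) - y_t)^2 / 2 is (f(x_t) - y_t) * K(x_t, .),
    so each round appends one expansion coefficient. The step-size schedule
    eta_t = eta0 / sqrt(t) is an assumption, not the paper's.
    """
    xs, alphas = [], []
    for t, (x, y) in enumerate(stream, start=1):
        f_x = sum(a * gaussian_kernel(xi, x, sigma) for xi, a in zip(xs, alphas))
        xs.append(x)
        alphas.append(-(eta0 / np.sqrt(t)) * (f_x - y))
    return xs, alphas
```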
We study a general online convex optimization problem. We have a convex set S and an unknown sequence of cost functions c1, c2, ..., and in each period, we choose a feasible po...
Abraham Flaxman, Adam Tauman Kalai, H. Brendan McMahan
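The abstract above sets up the online convex optimization protocol. The sketch below shows projected online gradient descent, one standard algorithm for this protocol, not necessarily the method of the paper itself; the ball-shaped feasible set S and the step-size schedule are assumptions.

```python
import numpy as np

def project_to_ball(x, radius=1.0):
    """Euclidean projection onto {x : ||x||_2 <= radius} (the assumed set S)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def online_gradient_descent(gradient_oracles, dim, eta0=0.1, radius=1.0):
    """Projected online gradient descent for the protocol in the abstract.

    Each period we commit to a point x_t, then receive feedback on the cost
    c_t (here, its gradient at x_t), take a step, and project back onto S.
    The schedule eta_t = eta0 / sqrt(t) is illustrative.
    """
    x = np.zeros(dim)
    plays = []
    for t, grad_c in enumerate(gradient_oracles, start=1):
        plays.append(x.copy())        # choose x_t before seeing c_t
        g = grad_c(x)                 # feedback for period t
        x = project_to_ball(x - (eta0 / np.sqrt(t)) * g, radius)
    return plays
```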
We present a new method for regularized convex optimization and analyze it under both online and stochastic optimization settings. In addition to unifying previously known first-or...
John Duchi, Shai Shalev-Shwartz, Yoram Singer, Ambuj Tewari
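Here is a minimal sketch of one composite (proximal) gradient step in the spirit of the regularized methods this abstract describes: the loss term is linearized while the regularizer, here an assumed l1 penalty, is handled exactly through its proximal operator.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1: closed-form soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def composite_step(w, grad_loss, eta, lam):
    """One composite step: linearize the loss, keep the regularizer exact.

    Equivalent to  w+ = argmin_u  <grad_loss(w), u> + ||u - w||^2 / (2*eta)
                                  + lam * ||u||_1.
    """
    return soft_threshold(w - eta * grad_loss(w), eta * lam)
```

Because the l1 term is never linearized, iterates produced this way are exactly sparse, which a plain subgradient step on the full objective would not achieve.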
This paper presents an online support vector machine (SVM) that uses the stochastic meta-descent (SMD) algorithm to adapt its step size automatically. We formulate the online lear...
S. V. N. Vishwanathan, Nicol N. Schraudolph, Alex J. Smola
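Full SMD adapts per-parameter step sizes using a Hessian-vector product; the sketch below is a heavily simplified parametric version that drops the curvature term and the paper's kernel expansion, kept only to convey the multiplicative gain-adaptation idea.

```python
import numpy as np

def smd_sgd(grad_fn, w0, steps, eta0=0.1, mu=0.1, lam=0.9):
    """SGD with SMD-style multiplicative step-size adaptation (simplified).

    v traces how the parameters depend on the (log) step sizes; eta grows
    when the new gradient anti-correlates with v (consistent progress) and
    shrinks otherwise, floored at a factor 1/2 per step. Full SMD also
    carries a Hessian-vector term in the v update, dropped here.
    """
    w = np.asarray(w0, dtype=float)
    eta = np.full_like(w, eta0)
    v = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        eta *= np.maximum(0.5, 1.0 - mu * g * v)  # multiplicative gain update
        v = lam * v - eta * g                     # decayed step-size trace
        w = w - eta * g
    return w, eta
```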
Stochastic gradient descent (SGD) uses approximate gradients estimated from subsets of the training data and updates the parameters in an online fashion. This learning framework i...
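A minimal sketch of minibatch SGD as this abstract describes it: approximate gradients estimated from random subsets of the training data, with parameters updated immediately after each estimate. The least-squares loss and the batching scheme are illustrative assumptions.

```python
import numpy as np

def minibatch_sgd(X, y, eta=0.01, batch_size=32, epochs=5, seed=0):
    """Minibatch SGD for least-squares regression (the loss is illustrative).

    Each update estimates the full-data gradient from a random subset and
    applies it in an online fashion.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), max(1, n // batch_size)):
            Xb, yb = X[idx], y[idx]
            w -= eta * Xb.T @ (Xb @ w - yb) / len(idx)  # minibatch gradient step
    return w
```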