Accelerated training of conditional random fields with stochastic gradient methods

We apply Stochastic Meta-Descent (SMD), a stochastic gradient optimization method with gain vector adaptation, to the training of Conditional Random Fields (CRFs). On several large data sets, the resulting optimizer converges to the same quality of solution over an order of magnitude faster than limited-memory BFGS, the leading method reported to date. We report results for both exact and inexact inference techniques.
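As a rough sketch of the gain-vector adaptation described above, the Python snippet below applies an SMD-style update (multiplicative per-parameter gains driven by a Hessian-vector product) to a toy L2-regularized logistic regression rather than to a CRF. The toy data, hyperparameter values (mu, lam, alpha), and the sign convention (v tracks dw/dlog eta) are illustrative assumptions and are not taken from the paper.

import numpy as np

# Sketch of SGD with SMD-style per-parameter gain adaptation on a toy
# L2-regularized logistic regression (not a CRF). All hyperparameters and
# data below are assumptions chosen for illustration only.

rng = np.random.default_rng(0)
n, d = 1000, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)
alpha = 0.01  # L2 regularization strength (stands in for a Gaussian prior)

def grad_and_hvp(w, v, xb, yb):
    """Stochastic gradient and Hessian-vector product of the regularized loss."""
    p = 1.0 / (1.0 + np.exp(-(xb @ w)))
    g = xb.T @ (p - yb) / len(yb) + alpha * w
    # Hessian is X^T diag(p(1-p)) X / m + alpha*I, applied to v.
    hv = xb.T @ (p * (1.0 - p) * (xb @ v)) / len(yb) + alpha * v
    return g, hv

w = np.zeros(d)
eta = np.full(d, 0.1)   # per-parameter gains, adapted online
v = np.zeros(d)         # tracks d w / d log(eta) with exponential forgetting
mu, lam = 0.05, 0.99    # meta-learning rate and decay factor (assumed values)
batch = 32

for t in range(2000):
    idx = rng.integers(0, n, size=batch)
    g, hv = grad_and_hvp(w, v, X[idx], y[idx])
    # Multiplicative gain update, safeguarded from below by the factor 1/2:
    # correlated successive gradients grow the gain, oscillation shrinks it.
    eta *= np.maximum(0.5, 1.0 - mu * g * v)
    w -= eta * g
    v = lam * v - eta * (g + lam * hv)

full_loss = np.logaddexp(0.0, -(2 * y - 1) * (X @ w)).mean() + 0.5 * alpha * (w @ w)
print("final regularized loss:", full_loss)

In the paper the same kind of gain-adapted stochastic step is applied to the CRF training objective, with the required gradients computed by exact or approximate inference.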
Added: 17 Nov 2009
Updated: 17 Nov 2009
Type: Conference
Year: 2006
Where: ICML
Authors: S. V. N. Vishwanathan, Nicol N. Schraudolph, Mark W. Schmidt, Kevin P. Murphy