On the Convergence Properties of Contrastive Divergence

Contrastive Divergence (CD) is a popular method for estimating the parameters of Markov Random Fields (MRFs) by rapidly approximating an intractable term in the gradient of the log probability. Despite CD's empirical success, little is known about its theoretical convergence properties. In this paper, we analyze the CD1 update rule for Restricted Boltzmann Machines (RBMs) with binary variables. We show that this update is not the gradient of any function, and we construct a counterintuitive "regularization function" that causes CD learning to cycle indefinitely. Nonetheless, we show that the regularized CD update has a fixed point for a large class of regularization functions, using Brouwer's fixed-point theorem.
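For readers unfamiliar with the update rule the abstract analyzes, the following is a minimal NumPy sketch of one CD1 step for a binary RBM. The function name `cd1_update` and the learning rate are illustrative choices, not from the paper; the statistics follow the standard CD1 recipe (data statistics minus statistics after a single Gibbs step).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1):
    """One CD1 step for a binary RBM.

    v0 : (n, nv) batch of binary visible vectors
    W  : (nv, nh) weight matrix
    b  : (nv,) visible biases,  c : (nh,) hidden biases
    """
    # Positive phase: hidden probabilities given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)

    # One Gibbs step: reconstruct visibles, then recompute hidden probabilities.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)

    # CD1 update: data statistics minus one-step reconstruction statistics.
    n = v0.shape[0]
    dW = (v0.T @ ph0 - v1.T @ ph1) / n
    db = (v0 - v1).mean(axis=0)
    dc = (ph0 - ph1).mean(axis=0)
    return W + lr * dW, b + lr * db, c + lr * dc
```

Note that `dW` is not the gradient of the log-likelihood (nor, as the paper shows, of any function): the negative-phase sample `v1` comes from a single Gibbs step rather than from the model distribution.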
Ilya Sutskever, Tijmen Tieleman
Added 19 May 2011
Updated 19 May 2011
Type Journal
Year 2010
Where JMLR
Authors Ilya Sutskever, Tijmen Tieleman