Learning with Blocks: Composite Likelihood and Contrastive Divergence

Composite likelihood methods provide a wide spectrum of computationally efficient techniques for statistical tasks such as parameter estimation and model selection. In this paper, we present a formal connection between the optimization of composite likelihoods and the well-known contrastive divergence algorithm. In particular, we show that composite likelihoods can be stochastically optimized by performing a variant of contrastive divergence with random-scan blocked Gibbs sampling. By using higher-order composite likelihoods, our proposed learning framework makes it possible to trade off computation time for increased accuracy. Furthermore, one can choose composite likelihood blocks that match the model's dependence structure, making the optimization of higher-order composite likelihoods computationally efficient. We empirically analyze the performance of blocked contrastive divergence on various models, including visible Boltzmann machines, conditional random fields, and exponential random graph models.
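To make the core idea concrete, here is a minimal sketch of contrastive divergence with random-scan blocked Gibbs sampling for a fully visible Boltzmann machine, one of the models the abstract mentions. This is an illustration under stated assumptions, not the paper's implementation: the function names (resample_block, cd_step), the CD-1 setting (a single blocked Gibbs step per update), and the exact enumeration of block configurations are all choices made here for clarity.

import numpy as np

def energy(x, W, b):
    """Energy of a fully visible Boltzmann machine: E(x) = -0.5 x'Wx - b'x, x in {0,1}^n."""
    return -0.5 * x @ W @ x - b @ x

def resample_block(x, W, b, block, rng):
    """Exactly resample x[block] given all other variables (one blocked Gibbs step).

    Enumerates all 2^k block configurations, so the block size k must stay small.
    """
    k = len(block)
    x_new = x.copy()
    # All 2^k binary configurations of the block, one per row
    configs = ((np.arange(2**k)[:, None] >> np.arange(k)) & 1).astype(float)
    logp = np.empty(2**k)
    for i, c in enumerate(configs):
        x_new[block] = c
        logp[i] = -energy(x_new, W, b)
    # Normalize the conditional distribution over block configurations (stable softmax)
    p = np.exp(logp - logp.max())
    p /= p.sum()
    x_new[block] = configs[rng.choice(2**k, p=p)]
    return x_new

def cd_step(X, W, b, block_size=2, lr=0.01, rng=None):
    """One stochastic gradient step of blocked CD-1 on a batch of binary data X."""
    if rng is None:
        rng = np.random.default_rng()
    n = W.shape[0]
    dW = np.zeros_like(W)
    db = np.zeros_like(b)
    for x in X:
        # Random-scan: pick a block uniformly at random for each data point
        block = rng.choice(n, size=block_size, replace=False)
        x_neg = resample_block(x, W, b, block, rng)
        # CD gradient: data statistics minus one-step "negative" statistics
        dW += np.outer(x, x) - np.outer(x_neg, x_neg)
        db += x - x_neg
    np.fill_diagonal(dW, 0.0)  # no self-connections
    W += lr * dW / len(X)
    b += lr * db / len(X)
    return W, b

In this sketch, increasing block_size corresponds to using higher-order composite likelihood blocks: each step costs more (the 2^k enumeration) but conditions on less, which mirrors the computation-for-accuracy trade-off the abstract describes.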
Type: Journal
Year: 2010
Where: JMLR
Authors: Arthur Asuncion, Qiang Liu, Alexander T. Ihler, Padhraic Smyth