Sciweavers

15 search results
JMLR
2010
Learning with Blocks: Composite Likelihood and Contrastive Divergence
Composite likelihood methods provide a wide spectrum of computationally efficient techniques for statistical tasks such as parameter estimation and model selection. In this paper,...
Arthur Asuncion, Qiang Liu, Alexander T. Ihler, Pa...
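Composite likelihood methods like those surveyed in this entry include pseudo-likelihood as their best-known special case: the full likelihood is replaced by a product of tractable conditionals. A minimal illustrative sketch for a binary (±1) Ising model, not taken from the paper — the model, parameter names, and function are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pseudo_log_likelihood(J, h, X):
    """Average pseudo-log-likelihood sum_i log p(x_i | x_{-i}) of +/-1
    spin configurations X (n_samples, n_spins) under an Ising model with
    symmetric couplings J (zero diagonal) and external fields h."""
    # Local field felt by each spin given all of its neighbours.
    local = X @ J + h  # shape (n_samples, n_spins)
    # For x_i in {-1, +1}: p(x_i | x_{-i}) = sigmoid(2 * x_i * local_i),
    # a tractable per-site conditional; no partition function is needed.
    return np.log(sigmoid(2.0 * X * local)).sum(axis=1).mean()
```

Each conditional is computable in closed form, which is exactly what makes composite-likelihood objectives cheap relative to maximum likelihood.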
ICML
2008
IEEE
Training restricted Boltzmann machines using approximations to the likelihood gradient
A new algorithm for training Restricted Boltzmann Machines is introduced. The algorithm, named Persistent Contrastive Divergence, is different from the standard Contrastive Diverg...
Tijmen Tieleman
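The key difference named in the abstract is that Persistent Contrastive Divergence keeps the negative-phase Gibbs chain alive across updates instead of restarting it at the data. A minimal numpy sketch of that idea for a binary RBM — illustrative only, with hyperparameters and structure assumed rather than taken from Tieleman's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class RBM:
    def __init__(self, n_vis, n_hid):
        self.W = 0.01 * rng.standard_normal((n_vis, n_hid))
        self.b = np.zeros(n_vis)  # visible biases
        self.c = np.zeros(n_hid)  # hidden biases

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

def pcd_step(rbm, v_data, fantasy_v, lr=0.05, k=1):
    """One PCD update: the negative phase continues the persistent
    'fantasy' chain rather than restarting Gibbs sampling at the data."""
    # Positive phase: hidden activation probabilities under the data.
    ph_data, _ = rbm.sample_h(v_data)
    # Negative phase: advance the persistent chain by k Gibbs steps.
    v = fantasy_v
    for _ in range(k):
        _, h = rbm.sample_h(v)
        _, v = rbm.sample_v(h)
    ph_model, _ = rbm.sample_h(v)
    n = v_data.shape[0]
    # Approximate likelihood gradient: data statistics minus model statistics.
    rbm.W += lr * (v_data.T @ ph_data - v.T @ ph_model) / n
    rbm.b += lr * (v_data.mean(0) - v.mean(0))
    rbm.c += lr * (ph_data.mean(0) - ph_model.mean(0))
    return v  # updated persistent chain state, reused next step
```

Because the chain persists, the model samples can wander far from the training data between updates, which is what gives PCD a better approximation to the likelihood gradient than CD-k restarted at the data.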
ICML
2010
IEEE
Particle Filtered MCMC-MLE with Connections to Contrastive Divergence
Learning undirected graphical models such as Markov random fields is an important machine learning task with applications in many domains. Since it is usually intractable to learn...
Arthur Asuncion, Qiang Liu, Alexander T. Ihler, Pa...
NIPS
2003
Wormholes Improve Contrastive Divergence
In models that define probabilities via energies, maximum likelihood learning typically involves using Markov Chain Monte Carlo to sample from the model’s distribution. If the ...
Geoffrey E. Hinton, Max Welling, Andriy Mnih
ICML
2010
IEEE
Non-Local Contrastive Objectives
Pseudo-likelihood and contrastive divergence are two well-known examples of contrastive methods. These algorithms trade off the probability of the correct label with the probabili...
David Vickrey, Cliff Chiung-Yu Lin, Daphne Koller