JMLR
2010

Noise-contrastive estimation: A new estimation principle for unnormalized statistical models

We present a new estimation principle for parameterized statistical models. The idea is to perform nonlinear logistic regression to discriminate between the observed data and artificially generated noise, using the model's log-density function in the regression nonlinearity. We show that this leads to a consistent (convergent) estimator of the parameters, and we analyze the asymptotic variance. In particular, the method is shown to work directly for unnormalized models, i.e. models whose density function does not integrate to one; the normalization constant can be estimated just like any other parameter. For a tractable ICA model, we compare the method with other estimation methods that can be used to learn unnormalized models, including score matching, contrastive divergence, and maximum likelihood where the normalization constant is estimated with importance sampling. Simulations show that noise-contrastive estimation offers the best trade-off between computational and statistic...
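The abstract's recipe can be sketched numerically: draw noise from a known distribution, form the log-ratio between the unnormalized model and the noise density, and fit the parameters (including a free constant standing in for the negative log-normalizer) by maximizing the logistic-regression objective. The following is a minimal illustrative sketch, not code from the paper: the 1D Gaussian model and Gaussian noise are arbitrary choices, and the optimizer is a generic SciPy routine rather than the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import log_expit  # numerically stable log-sigmoid

rng = np.random.default_rng(0)

# Observed data: standard Gaussian (true sigma = 1).
x = rng.normal(0.0, 1.0, size=5000)
# Artificial noise: a wider Gaussian with a known, normalized density.
sigma_n = 2.0
y = rng.normal(0.0, sigma_n, size=5000)

def log_noise(u):
    return -0.5 * (u / sigma_n) ** 2 - np.log(sigma_n * np.sqrt(2 * np.pi))

def log_model(u, theta):
    # Unnormalized Gaussian: theta = (log_sigma, c), where c plays the
    # role of the (negative log) normalization constant, estimated freely.
    log_sigma, c = theta
    return -0.5 * (u * np.exp(-log_sigma)) ** 2 + c

def nce_loss(theta):
    # G(u) = log p_model(u; theta) - log p_noise(u); classify data vs noise
    # with sigmoid(G). Maximize mean log sigmoid(G(x)) + mean log sigmoid(-G(y)),
    # i.e. minimize the negative.
    g_x = log_model(x, theta) - log_noise(x)
    g_y = log_model(y, theta) - log_noise(y)
    return -(np.mean(log_expit(g_x)) + np.mean(log_expit(-g_y)))

res = minimize(nce_loss, x0=np.zeros(2), method="Nelder-Mead")
log_sigma_hat, c_hat = res.x
# For a standard Gaussian the true log-normalizer term is
# c = -log(sqrt(2*pi)) ≈ -0.919, so c_hat should land near that.
print(log_sigma_hat, c_hat)
```

Note how the normalization constant is never computed analytically: `c` is simply another coordinate of the parameter vector, recovered because the logistic-regression objective is only maximized when the model density matches the data density in absolute scale, not just in shape.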
Michael Gutmann, Aapo Hyvärinen
Added 19 May 2011
Updated 19 May 2011
Type Journal
Year 2010
Where JMLR
Authors Michael Gutmann, Aapo Hyvärinen