Sciweavers

NIPS 2008

Relative Performance Guarantees for Approximate Inference in Latent Dirichlet Allocation

Hierarchical probabilistic modeling of discrete data has emerged as a powerful tool for text analysis. Posterior inference in such models is intractable, and practitioners rely on approximate posterior inference methods such as variational inference or Gibbs sampling. There has been much research on designing better approximations, but there is as yet little theoretical understanding of which of the available techniques are appropriate, and in which data analysis settings. In this paper we provide the beginnings of such understanding. We analyze the improvement that the recently proposed collapsed variational inference (CVB) provides over mean field variational inference (VB) in latent Dirichlet allocation. We prove that the difference in the tightness of the bound on the likelihood of a document decreases as O(1/k) + (log m)/m, where k is the number of topics in the model and m is the number of words in a document. As a consequence, the advantage of CVB over VB is lost for long documents b...
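The rate quoted in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's analysis: it evaluates a hypothetical gap function 1/k + (log m)/m (the paper gives only the asymptotic order, so the constants here are assumed to be 1) and shows how the predicted CVB-over-VB advantage shrinks as the document length m grows.

```python
import math

def bound_gap_rate(k: int, m: int) -> float:
    """Illustrative rate O(1/k) + (log m)/m from the abstract.

    Hypothetical constants: the paper states only the asymptotic
    order of the gap between the CVB and VB likelihood bounds,
    so both terms are taken with constant 1 for illustration.
    """
    return 1.0 / k + math.log(m) / m

# For a fixed number of topics, the gap rate decays with document length:
for m in (10, 100, 1000, 10000):
    print(f"m = {m:>5}: gap rate ~ {bound_gap_rate(k=50, m=m):.4f}")
```

Under this sketch the (log m)/m term dominates for short documents and vanishes as m grows, matching the abstract's conclusion that the advantage of CVB over VB is lost for long documents.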
Indraneel Mukherjee, David M. Blei
Added 29 Oct 2010
Updated 29 Oct 2010
Type Conference
Year 2008
Where NIPS
Authors Indraneel Mukherjee, David M. Blei