High-temperature Expansions for Learning Models of Nonnegative Data

Recent work has exploited the boundedness of data in the unsupervised learning of new types of generative model. For nonnegative data, it was recently shown that the maximum-entropy generative model is a Nonnegative Boltzmann Distribution, not a Gaussian distribution, when the model is constrained to match the first- and second-order statistics of the data. Learning for practical-sized problems is made difficult by the need to compute expectations under the model distribution. The computational cost of Markov chain Monte Carlo methods and the low fidelity of naive mean-field techniques have led to increasing interest in advanced mean-field theories and variational methods. Here I present a second-order mean-field approximation for the Nonnegative Boltzmann Machine model, obtained using a "high-temperature" expansion. The theory is tested on learning a bimodal 2-dimensional model, a high-dimensional translationally invariant distribution, and a generative model for handwritten digits.
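For context on the abstract's central claim, the Nonnegative Boltzmann Distribution takes, in the related literature, the exponential-family form sketched below; the symbols $E$, $A$, $\mathbf{b}$, and $Z$ follow that literature's conventions and are not taken from this listing:

  p(\mathbf{x}) \;=\; \frac{1}{Z}\, e^{-E(\mathbf{x})} \quad \text{for } \mathbf{x} \ge 0, \qquad p(\mathbf{x}) = 0 \ \text{otherwise},

  E(\mathbf{x}) \;=\; \tfrac{1}{2}\, \mathbf{x}^{\top} A\, \mathbf{x} \;-\; \mathbf{b}^{\top} \mathbf{x}.

Unlike in a Gaussian, $A$ need not be positive definite for the normalizer $Z$ to remain finite on the nonnegative orthant, which is what permits multimodal distributions such as the bimodal example mentioned in the abstract. Learning amounts to matching $\langle \mathbf{x} \rangle$ and $\langle \mathbf{x}\mathbf{x}^{\top} \rangle$ under this model to the data statistics, and it is these model expectations that the high-temperature expansion approximates.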
Type: Conference
Year: 2000
Venue: NIPS
Authors: Oliver B. Downs