NIPS
2000

On Reversing Jensen's Inequality

Jensen's inequality is a powerful mathematical tool and one of the workhorses of statistical learning. Its applications there include the EM algorithm, Bayesian estimation, and Bayesian inference. Jensen's inequality yields simple lower bounds on otherwise intractable quantities such as products of sums and latent log-likelihoods. This simplification then permits operations like integration and maximization. Quite often (e.g., in discriminative learning) upper bounds are needed as well. We derive and prove an efficient analytic inequality that provides such variational upper bounds. This inequality holds for latent variable mixtures of exponential family distributions and thus spans a wide range of contemporary statistical models. We also discuss applications of the upper bounds, including maximum conditional likelihood, large margin discriminative models, and conditional Bayesian inference. Convergence, efficiency, and prediction results are shown.
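The lower bound the abstract refers to can be sketched numerically. The snippet below is a minimal illustration (not the paper's method; all variable names are ours) of the standard EM-style application of Jensen's inequality: for positive terms a_i and any distribution q over components, log Σᵢ aᵢ = log Σᵢ qᵢ (aᵢ/qᵢ) ≥ Σᵢ qᵢ log(aᵢ/qᵢ), with equality when qᵢ ∝ aᵢ.

```python
import numpy as np

# Jensen's inequality for the concave log: log E[X] >= E[log X].
# For a mixture likelihood term sum_i a_i (e.g., a_i = w_i * p_i(x)),
# any distribution q over components gives the EM-style lower bound:
#   log sum_i a_i >= sum_i q_i * log(a_i / q_i)

rng = np.random.default_rng(0)
a = rng.random(5) + 0.1          # positive mixture terms (illustrative)
q = rng.random(5) + 0.1
q /= q.sum()                     # an arbitrary distribution over components

log_sum = np.log(a.sum())                  # the "intractable" quantity
lower_bound = np.sum(q * np.log(a / q))    # Jensen lower bound

assert lower_bound <= log_sum + 1e-12

# The bound is tight when q_i is the exact posterior responsibility a_i / sum(a):
q_star = a / a.sum()
tight_bound = np.sum(q_star * np.log(a / q_star))
assert abs(tight_bound - log_sum) < 1e-9
```

This is precisely the direction of bounding the paper takes as given; its contribution is the reverse (upper) bound, which has no such elementary one-line derivation.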
Tony Jebara, Alex Pentland