Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network

We discuss Bayesian methods for learning Bayesian networks when data sets are incomplete. In particular, we examine asymptotic approximations for the marginal likelihood of incomplete data given a Bayesian network. We consider the Laplace approximation and the less accurate but more efficient BIC/MDL approximation. We also consider approximations proposed by Draper (1993) and by Cheeseman and Stutz (CS, 1995). These approximations are as efficient as BIC/MDL, but their accuracy has not been studied in any depth. We compare the accuracy of these approximations under the assumption that the Laplace approximation is the most accurate. In experiments using synthetic data generated from discrete naive-Bayes models with a hidden root node, we find that (1) the BIC/MDL measure is the least accurate, having a bias in favor of simple models, and (2) the Draper and CS measures are the most accurate.
David Maxwell Chickering, David Heckerman
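
For reference, the approximations compared in the abstract take roughly the following standard forms. This is a sketch in conventional notation, not necessarily the exact expressions used in the paper: $\hat\theta$ is the MAP parameter configuration, $d$ the number of free parameters, $N$ the number of cases, $A$ the negative Hessian of $\log p(D \mid \theta, S)\,p(\theta \mid S)$ at $\hat\theta$, and $D'$ a completed version of the data set.

\[
\begin{aligned}
\text{Laplace:} \qquad \log p(D \mid S) &\approx \log p(D \mid \hat\theta, S) + \log p(\hat\theta \mid S) + \tfrac{d}{2}\log 2\pi - \tfrac{1}{2}\log\lvert A\rvert \\
\text{BIC/MDL:} \qquad \log p(D \mid S) &\approx \log p(D \mid \hat\theta, S) - \tfrac{d}{2}\log N \\
\text{Cheeseman--Stutz:} \qquad \log p(D \mid S) &\approx \log p(D' \mid S) - \log p(D' \mid \hat\theta, S) + \log p(D \mid \hat\theta, S)
\end{aligned}
\]

The BIC/MDL form drops the terms that do not grow with $N$, which is consistent with the bias toward simple models reported above; the Cheeseman-Stutz form starts from the complete-data marginal likelihood $p(D' \mid S)$, which has a closed form for discrete networks with Dirichlet priors, and corrects it by the ratio of incomplete- to complete-data likelihoods at $\hat\theta$.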
Type: Conference
Year: 1996
Where: UAI