ICML 1996

Discretizing Continuous Attributes While Learning Bayesian Networks

We introduce a method for learning Bayesian networks that handles the discretization of continuous variables as an integral part of the learning process. The main ingredient in this method is a new metric based on the Minimal Description Length principle for choosing the threshold values for the discretization while learning the Bayesian network structure. This score balances the complexity of the learned discretization and the learned network structure against how well they model the training data. This ensures that the discretization of each variable introduces just enough intervals to capture its interaction with adjacent variables in the network. We formally derive the new metric, study its main properties, and propose an iterative algorithm for learning a discretization policy. Finally, we illustrate its behavior in applications to supervised learning.
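To make the idea concrete, here is a minimal, self-contained sketch of MDL-style threshold selection for a single continuous variable. It is not the paper's exact metric: a single class variable stands in for the variable's neighbors in the Bayesian network, and the coding costs (half a log2 n bit per free multinomial parameter, log2 n bits per threshold) are illustrative assumptions, as are the helper names `mdl_score` and `greedy_thresholds`.

```python
import math
from collections import Counter, defaultdict

def interval_of(x, thresholds):
    """Index of the interval (0..len(thresholds)) that x falls into."""
    return sum(x > t for t in thresholds)

def mdl_score(xs, ys, thresholds, n_classes):
    """MDL-style score in bits (lower is better) for discretizing xs with
    the given sorted thresholds when the discretized variable predicts ys.
    Data cost: conditional log-loss of the classes within each interval.
    Model cost: log2(n)/2 bits per free multinomial parameter plus
    log2(n) bits per threshold.  An illustrative stand-in for the
    paper's metric, not its exact form."""
    n = len(xs)
    k = len(thresholds) + 1  # number of intervals
    counts = defaultdict(Counter)
    for x, y in zip(xs, ys):
        counts[interval_of(x, thresholds)][y] += 1
    data_cost = 0.0
    for cell in counts.values():
        n_i = sum(cell.values())
        data_cost -= sum(c * math.log2(c / n_i) for c in cell.values())
    model_cost = (k * (n_classes - 1) / 2) * math.log2(n) \
        + len(thresholds) * math.log2(n)
    return data_cost + model_cost

def greedy_thresholds(xs, ys, candidates, n_classes):
    """Repeatedly add any candidate threshold that lowers the MDL score;
    stop when no single addition improves it."""
    chosen, best = [], mdl_score(xs, ys, [], n_classes)
    improved = True
    while improved:
        improved = False
        for t in candidates:
            if t in chosen:
                continue
            trial = sorted(chosen + [t])
            s = mdl_score(xs, ys, trial, n_classes)
            if s < best:
                chosen, best, improved = trial, s, True
    return chosen, best
```

Because the model cost grows with every added threshold, the search keeps only cut points whose intervals actually separate the classes, which mirrors the abstract's claim that each variable gets "just enough intervals" to capture its interactions. In the paper itself the score is defined over the variable's neighbors in the network, and discretization policies are re-optimized iteratively as the structure changes.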
Moisés Goldszmidt, Nir Friedman
Type Conference
Year 1996
Where ICML
Authors Moisés Goldszmidt, Nir Friedman