ECML 2005, Springer

A Comparison of Approaches for Learning Probability Trees

Probability trees (or Probability Estimation Trees, PETs) are decision trees with probability distributions in the leaves. Several alternative approaches for learning probability trees have been proposed, but no thorough comparison of these approaches exists. In this paper we experimentally compare the main approaches using the relational decision tree learner Tilde (on both non-relational and relational datasets). In addition to the main existing approaches, we also consider a novel variant of an existing approach based on the Bayesian Information Criterion (BIC). Our main conclusion is that, overall, trees built using the C4.5 approach or the C4.4 approach (C4.5 without post-pruning) have the best predictive performance. If the number of classes is low, however, BIC performs equally well. An additional advantage of BIC is that its trees are considerably smaller than those built with the C4.5 or C4.4 approach.
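The idea of a probability estimation tree can be sketched with a short example: an unpruned decision tree whose leaves are read out as (Laplace-corrected) class distributions, roughly in the spirit of the C4.4 approach mentioned above. This is a minimal illustration only; scikit-learn, the iris dataset, and the Laplace correction are assumptions made for the sketch, not the paper's Tilde-based setup.

```python
# Minimal sketch of a probability estimation tree (PET): an unpruned
# decision tree whose leaves are read out as Laplace-corrected class
# distributions, loosely C4.4-style. scikit-learn and the iris data
# are illustrative assumptions; the paper itself uses the relational
# learner Tilde.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
n_classes = len(np.unique(y))

# Grow a full tree without post-pruning (a C4.4-like setting).
pet = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)

def laplace_leaf_proba(model, X_train, y_train, X_query, n_classes):
    """P(c | leaf) = (n_c + 1) / (n + n_classes), computed from the
    training examples that fall into the same leaf as each query."""
    train_leaves = model.apply(X_train)   # leaf index per training example
    query_leaves = model.apply(X_query)   # leaf index per query example
    probs = np.empty((len(X_query), n_classes))
    for i, leaf in enumerate(query_leaves):
        counts = np.bincount(y_train[train_leaves == leaf],
                             minlength=n_classes)
        probs[i] = (counts + 1.0) / (counts.sum() + n_classes)
    return probs

print(laplace_leaf_proba(pet, X, y, X[:3], n_classes))
```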
Added: 27 Jun 2010
Updated: 27 Jun 2010
Type: Conference
Year: 2005
Where: ECML
Authors: Daan Fierens, Jan Ramon, Hendrik Blockeel, Maurice Bruynooghe