ECML 2004, Springer

Conditional Independence Trees

It has been observed that traditional decision trees produce poor probability estimates. In many applications, however, a probability estimation tree (PET) with accurate probability estimates is desirable. Some researchers ascribe the poor probability estimates of decision trees to the decision tree learning algorithms. In our observation, however, the representation also plays an important role. Although the representation of decision trees is fully expressive in theory, it is often impractical to learn such a representation with accurate probability estimates from limited training data. In this paper, we extend decision trees to represent a joint distribution and conditional independence, called conditional independence trees (CITrees), which are a more suitable model for PETs. We propose a novel algorithm for learning CITrees, and our experiments show that the CITree algorithm significantly outperforms C4.5 and naive Bayes in classification accuracy.
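To make the contrast in the abstract concrete, the following is a minimal sketch of the two kinds of leaf estimates: a plain PET leaf that returns one Laplace-corrected class frequency for every instance reaching it, versus a CITree-style leaf that additionally applies naive Bayes over the attributes not tested on the path, under a local conditional-independence assumption. The toy weather data, the attribute names, and the single split on `outlook` are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

# Toy training set: (outlook, windy) -> play. Purely illustrative.
data = [
    ("sunny", "yes", "no"),
    ("sunny", "no",  "yes"),
    ("sunny", "no",  "yes"),
    ("rain",  "yes", "no"),
    ("rain",  "no",  "yes"),
]

def leaf_estimate(rows, classes=("yes", "no")):
    """Plain PET leaf: Laplace-corrected class frequency at the leaf.
    Every instance reaching this leaf gets the same estimate."""
    counts = Counter(r[-1] for r in rows)
    n, k = len(rows), len(classes)
    return {c: (counts[c] + 1) / (n + k) for c in classes}

def citree_leaf_estimate(rows, x_windy, classes=("yes", "no")):
    """CITree-style leaf: naive Bayes over the attribute(s) not on the
    path (here only `windy`), assumed conditionally independent given
    the class within the leaf's local distribution."""
    post = {}
    for c in classes:
        in_c = [r for r in rows if r[-1] == c]
        prior = (len(in_c) + 1) / (len(rows) + len(classes))
        # Laplace-corrected likelihood P(windy = x_windy | class = c).
        like = (sum(r[1] == x_windy for r in in_c) + 1) / (len(in_c) + 2)
        post[c] = prior * like
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}

# Leaf reached by the (assumed) path test outlook == "sunny":
sunny = [r for r in data if r[0] == "sunny"]
print(leaf_estimate(sunny))                 # same estimate for all sunny days
print(citree_leaf_estimate(sunny, "no"))    # sharper estimate using `windy`
print(citree_leaf_estimate(sunny, "yes"))
```

The point of the sketch is that the plain leaf assigns the same probabilities to all sunny instances, while the CITree-style leaf differentiates them by the untested attribute, which is how a tree of the same depth can yield finer-grained probability estimates.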
Harry Zhang, Jiang Su
Type: Conference
Year: 2004
Where: ECML
Authors: Harry Zhang, Jiang Su