Pruning Classification Rules with Reference Vector Selection Methods

Attempts to extract logical rules from data often lead to large sets of classification rules that need to be pruned. Training two classifiers, the C4.5 decision tree and the Non-Nested Generalized Exemplars (NNGE) covering algorithm, on datasets that have first been reduced with the EkP instance compressor yields a statistically significant reduction in the number of derived rules with no significant degradation of results. Similar results have been observed with other popular instance filters used for data pruning. The numerical experiments presented here illustrate that simpler and more interesting sets of rules can be extracted from filtered datasets. This enables a better understanding of knowledge structures when data is explored with algorithms that tend to induce a large number of classification rules.
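
The pipeline described in the abstract (filter the training data with an instance selector, then induce rules and compare rule counts and accuracy) can be sketched in a few lines. The following is a minimal illustration only, not the authors' method: it assumes scikit-learn's DecisionTreeClassifier as a stand-in for C4.5, a simple edited-nearest-neighbour filter in place of the EkP instance compressor (neither EkP nor NNGE is assumed to be available as a library), and the number of tree leaves as a rough proxy for the number of induced rules.

# Minimal sketch under the stated assumptions: DecisionTreeClassifier stands in
# for C4.5, a simple edited-nearest-neighbour (ENN) filter stands in for the EkP
# instance compressor, and tree leaves approximate the number of rules.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

def enn_filter(X, y, k=3):
    # Keep only instances whose class label agrees with the majority label
    # of their k nearest neighbours (the point itself is excluded).
    knn = KNeighborsClassifier(n_neighbors=k + 1).fit(X, y)
    neigh = knn.kneighbors(X, return_distance=False)[:, 1:]  # drop self
    majority = np.array([np.bincount(y[idx]).argmax() for idx in neigh])
    keep = majority == y
    return X[keep], y[keep]

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, (Xf, yf) in {"full data": (X_tr, y_tr),
                       "filtered data": enn_filter(X_tr, y_tr)}.items():
    tree = DecisionTreeClassifier(random_state=0).fit(Xf, yf)
    acc = accuracy_score(y_te, tree.predict(X_te))
    print(f"{name}: {tree.get_n_leaves()} leaves (rules), test accuracy {acc:.3f}")

Comparing the leaf counts and test accuracies of the two runs mirrors the comparison reported in the paper: fewer rules from the filtered data, ideally without a significant loss of accuracy.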
Type Conference
Year 2010
Where ICAISC (Springer)
Authors Karol Grudzinski, Marek Grochowski, Wlodzislaw Duch