Sciweavers

Search results for "Complexity measures and decision tree complexity: a survey" (510 results, page 7 of 102)
ILP 2004 (Springer)
First Order Random Forests with Complex Aggregates
Random forest induction is a bagging method that randomly samples the feature set at each node in a decision tree. In propositional learning, the method has been shown to work well...
Celine Vens, Anneleen Van Assche, Hendrik Blockeel...
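
As a rough illustration of the node-level feature sampling the abstract describes, here is a minimal sketch (not taken from the paper; the Gini impurity criterion, the sqrt-sized feature subset, and the toy data are assumptions):

```python
import math
import random

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(rows, labels, n_features, rng):
    """Search for the best threshold split, but only among a random
    subset of the features, as random forest induction does at each node."""
    k = max(1, int(math.sqrt(n_features)))      # common choice of subset size
    candidates = rng.sample(range(n_features), k)
    best = None
    for f in candidates:
        for t in sorted({row[f] for row in rows}):
            left = [y for row, y in zip(rows, labels) if row[f] <= t]
            right = [y for row, y in zip(rows, labels) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best  # (weighted impurity, feature index, threshold)

rng = random.Random(0)
X = [[0.2, 1.0, 3.5], [0.7, 0.9, 1.2], [0.8, 0.1, 2.2], [0.3, 0.4, 0.5]]
y = [0, 1, 1, 0]
print(best_split(X, y, n_features=3, rng=rng))
```
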
DAM 2002
Optimal arrangement of data in a tree directory
We define the decision problem data arrangement, which involves arranging the vertices of a graph G at the leaves of a d-ary tree so that a weighted sum of the distances between p...
Malwina J. Luczak, Steven D. Noble
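
A small sketch of the arrangement cost being minimized, under the assumption (not confirmed by the truncated abstract) that the objective sums, over the weighted edges of G, the tree distance between the leaves that hold the edge's endpoints; the function names and example graph are hypothetical:

```python
def tree_distance(i, j, d, height):
    """Distance between leaves i and j of a complete d-ary tree of the given
    height: 2 * (height - depth of their lowest common ancestor)."""
    if i == j:
        return 0
    def root_path(x):
        digits = []
        for _ in range(height):
            digits.append(x % d)
            x //= d
        return digits[::-1]
    pi, pj = root_path(i), root_path(j)
    common = 0
    while common < height and pi[common] == pj[common]:
        common += 1
    return 2 * (height - common)

def arrangement_cost(edges, placement, d):
    """Cost of placing graph vertices at the leaves of a d-ary tree:
    sum over edges (u, v, w) of w times the leaf-to-leaf tree distance."""
    n_leaves = max(placement.values()) + 1
    height = 1
    while d ** height < n_leaves:
        height += 1
    return sum(w * tree_distance(placement[u], placement[v], d, height)
               for u, v, w in edges)

# Hypothetical 4-vertex path graph placed at the 4 leaves of a binary tree.
edges = [("a", "b", 2.0), ("b", "c", 1.0), ("c", "d", 3.0)]
placement = {"a": 0, "b": 1, "c": 2, "d": 3}
print(arrangement_cost(edges, placement, d=2))   # 2*2 + 1*4 + 3*2 = 14.0
```
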
ISCIS 2009 (Springer)
Calculating the VC-dimension of decision trees
We propose an exhaustive search algorithm that calculates the VC-dimension of univariate decision trees with binary features. The VC-dimension of the univariate decision tree wi...
Ozlem Aslan, Olcay Taner Yildiz, Ethem Alpaydin
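
The kind of exhaustive VC-dimension search the abstract refers to can be illustrated with a generic brute-force checker; this sketch uses a toy hypothesis class of single-feature tests (decision stumps), not the paper's tree classes:

```python
from itertools import combinations, product

def shatters(hypotheses, points):
    """True if the hypothesis set realizes every 0/1 labeling of `points`."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    """Exhaustive search: the VC-dimension is the size of the largest
    subset of the domain that the hypothesis set shatters."""
    vc = 0
    for m in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, m)):
            vc = m
        else:
            break
    return vc

# Hypothetical tiny class: depth-1 tests over 3 binary features,
# i.e. h(x) = x[i] or its negation.
domain = list(product([0, 1], repeat=3))
stumps = [lambda x, i=i, s=s: s ^ x[i] for i in range(3) for s in (0, 1)]
print(vc_dimension(stumps, domain))   # 2 for this toy class
```
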
KDD 1995 (ACM)
Decision Tree Induction: How Effective is the Greedy Heuristic?
Most existing decision tree systems use a greedy approach to induce trees -- locally optimal splits are induced at every node of the tree. Although the greedy approach is suboptimal, it is ...
Sreerama K. Murthy, Steven Salzberg
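
The suboptimality of purely local splits can be seen on a tiny XOR-style example (an illustration, not from the paper): no single-feature split reduces the Gini impurity, yet a depth-two tree fits the data exactly.

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

# y = x0 XOR x1: every first split looks equally useless to a greedy criterion,
# although splitting on x0 and then on x1 classifies the data perfectly.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]

def split_impurity(feature):
    left = [label for x, label in zip(X, y) if x[feature] == 0]
    right = [label for x, label in zip(X, y) if x[feature] == 1]
    return (len(left) * gini(left) + len(right) * gini(right)) / len(y)

print(gini(y))                               # 0.5 before any split
print(split_impurity(0), split_impurity(1))  # 0.5 0.5: no greedy gain either way
```
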
COCO 1995 (Springer)
Towards Average-Case Complexity Analysis of NP Optimization Problems
For the worst-case complexity measure, if P = NP, then P = OptP, i.e., all NP optimization problems are polynomial-time solvable. On the other hand, it is not clear whether a simi...
Rainer Schuler, Osamu Watanabe