Sciweavers

86 search results for "Visualizing Bagged Decision Trees"
JMLR 2002
Efficient Algorithms for Decision Tree Cross-validation
Cross-validation is a useful and generally applicable technique often employed in machine learning, including decision tree induction. An important disadvantage of straightforward...
Hendrik Blockeel, Jan Struyf
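
The straightforward approach the abstract alludes to re-induces the tree from scratch for every fold; the snippet below is a minimal sketch of that naive baseline, not the paper's efficient algorithms. It assumes scikit-learn, an illustrative dataset, and an arbitrary depth grid.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Naive 10-fold cross-validation: one full tree induction per fold and per setting.
for depth in (2, 4, 8, None):
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
    scores = cross_val_score(tree, X, y, cv=10)
    print(f"max_depth={depth}: mean CV accuracy {scores.mean():.3f}")
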
IDA 2007 (Springer)
Combining Bagging and Random Subspaces to Create Better Ensembles
Random forests are one of the best performing methods for constructing ensembles. They derive their strength from two aspects: using random subsamples of the training data (as in b...
Pance Panov, Saso Dzeroski
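
A minimal sketch of the combination described above, using scikit-learn's BaggingClassifier as a stand-in: each ensemble member is trained on a bootstrap sample of the training data (bagging) and on a random subset of the features (random subspaces). The dataset, ensemble size, and feature fraction are illustrative assumptions, not the paper's exact construction.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

ensemble = BaggingClassifier(
    DecisionTreeClassifier(),  # fully grown trees, as in random forests
    n_estimators=50,
    bootstrap=True,            # bagging: bootstrap sample of the training data
    max_features=0.5,          # random subspaces: random half of the features per member
    random_state=0,
)
print("5-fold CV accuracy:", round(cross_val_score(ensemble, X, y, cv=5).mean(), 3))
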
ECAI 2008 (Springer)
MTForest: Ensemble Decision Trees based on Multi-Task Learning
Many ensemble methods, such as Bagging, Boosting, and Random Forest, have been proposed and widely used in real-world applications. Some of them are better than others on noisefre...
Qing Wang, Liang Zhang, Mingmin Chi, Jiankui Guo
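
For reference, a minimal scikit-learn sketch instantiating the three standard ensembles the snippet names (Bagging, Boosting via AdaBoost, and Random Forest) on an illustrative dataset; MTForest itself is not implemented here.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

ensembles = {
    "Bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0),
    "Boosting (AdaBoost)": AdaBoostClassifier(n_estimators=50, random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=50, random_state=0),
}
for name, model in ensembles.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy {acc:.3f}")
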
PR 2010
Out-of-bag estimation of the optimal sample size in bagging
The performance of m-out-of-n bagging with and without replacement in terms of the sampling ratio (m/n) is analyzed. Standard bagging uses resampling with replacement to generate ...
Gonzalo Martínez-Muñoz, Alberto Suá...
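
A minimal sketch of the out-of-bag idea in the title: compare candidate sampling ratios m/n by the out-of-bag accuracy of the resulting bagged ensemble. It assumes scikit-learn, sampling with replacement, and an illustrative dataset and ratio grid; it is not the paper's analysis.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

for ratio in (0.2, 0.4, 0.6, 0.8, 1.0):     # candidate m/n values
    bag = BaggingClassifier(
        DecisionTreeClassifier(),
        n_estimators=100,
        max_samples=ratio,                  # each member sees m = ratio * n points
        bootstrap=True,                     # sampling with replacement
        oob_score=True,                     # evaluate on the points left out of each bag
        random_state=0,
    )
    bag.fit(X, y)
    print(f"m/n = {ratio:.1f}: OOB accuracy {bag.oob_score_:.3f}")
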
KDD 1999 (ACM)
Visual Classification: An Interactive Approach to Decision Tree Construction
Because they satisfy the basic requirements of accuracy and understandability of a classifier, decision tree classifiers have become very popular. Instead of constructing the decision tree ...
Mihael Ankerst, Christian Elsen, Martin Ester, Han...
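
Not the interactive construction described above, but a minimal sketch of the automatic baseline it starts from: fit a small tree and render it so the chosen splits can be inspected visually. It assumes scikit-learn, matplotlib, and an illustrative dataset and depth.

import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, plot_tree

data = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(data.data, data.target)

# Render the induced splits so a user can inspect (and question) them visually.
plot_tree(tree, feature_names=data.feature_names,
          class_names=list(data.target_names), filled=True)
plt.show()
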