ECAI
2008
Springer

MTForest: Ensemble Decision Trees based on Multi-Task Learning

Many ensemble methods, such as Bagging, Boosting, Random Forest, etc., have been proposed and widely used in real-world applications. Some of them perform better on noise-free data while others perform better on noisy data. In reality, however, ensemble methods that consistently achieve good performance both with and without noise are more desirable. In this paper, we propose a new method, namely MTForest, that ensembles decision tree learners by enumerating each input attribute as an extra task. This introduces different additional inductive biases, generating diverse yet accurate component decision trees in the ensemble. The experimental results show that in situations without classification noise, MTForest is comparable to Boosting and Random Forest and significantly better than Bagging, while in situations with classification noise, MTForest is significantly better than Boosting and Random Forest and slightly better than Bagging. So MTFore...
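The abstract's key idea is that each input attribute, treated as an auxiliary prediction target alongside the class label, biases its component tree differently, and the diverse components then vote. The following is a minimal illustrative sketch of that idea, not the authors' implementation: each component here is simplified to a one-level decision stump whose split is chosen to jointly minimize Gini impurity over the class label and one auxiliary task (an input attribute), with one component per attribute and a majority vote at prediction time.

```python
# Hypothetical sketch of the MTForest idea (NOT the paper's code):
# one component learner per input attribute, where that attribute serves
# as an extra task that shifts the inductive bias of the component.

from collections import Counter

def gini(labels):
    """Gini impurity of a list of discrete labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_impurity(rows, targets, feat, thresh):
    """Weighted impurity of a split, summed over ALL targets
    (the class label plus the auxiliary task)."""
    left = [i for i, r in enumerate(rows) if r[feat] <= thresh]
    right = [i for i, r in enumerate(rows) if r[feat] > thresh]
    n = len(rows)
    total = 0.0
    for t in targets:
        for side in (left, right):
            total += len(side) / n * gini([t[i] for i in side])
    return total

def fit_stump(rows, y, aux):
    """One-level tree: split chosen to jointly reduce impurity of the
    class label y AND the auxiliary task aux (multi-task criterion)."""
    best = None
    for feat in range(len(rows[0])):
        # exclude the max value so both branches are non-empty
        for thresh in sorted({r[feat] for r in rows})[:-1]:
            imp = split_impurity(rows, [y, aux], feat, thresh)
            if best is None or imp < best[0]:
                best = (imp, feat, thresh)
    _, feat, thresh = best
    left_idx = [i for i, r in enumerate(rows) if r[feat] <= thresh]
    right_idx = [i for i, r in enumerate(rows) if r[feat] > thresh]
    left_lab = Counter(y[i] for i in left_idx).most_common(1)[0][0]
    right_lab = Counter(y[i] for i in right_idx).most_common(1)[0][0]
    return lambda r: left_lab if r[feat] <= thresh else right_lab

def mtforest_predict(rows, y, x):
    """Train one stump per attribute (that attribute as the extra task)
    and return the majority vote on example x."""
    votes = []
    for j in range(len(rows[0])):
        aux = [r[j] for r in rows]
        votes.append(fit_stump(rows, y, aux)(x))
    return Counter(votes).most_common(1)[0][0]
```

Here the joint impurity criterion plays the role of the paper's multi-task bias: components built with different auxiliary tasks can prefer different splits, which is what makes the ensemble diverse without resampling the training data.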
Qing Wang, Liang Zhang, Mingmin Chi, Jiankui Guo
Added 19 Oct 2010
Updated 19 Oct 2010
Type Conference
Year 2008
Where ECAI