Adaptive Boosting with Leader based Learners for Classification of Large Handwritten Data

Boosting is a general method for improving the accuracy of a learning algorithm. AdaBoost, short for Adaptive Boosting, repeatedly applies a weak or base learning algorithm to find a corresponding weak hypothesis, adapting to the error rates of the individual weak hypotheses. A large, complex handwritten dataset is under study. Repeated application of a weak learner to such a large dataset requires a great deal of processing time. In view of this, instead of using the entire training data for learning, we propose to use only prototypes. Further, in the current work, the base learner is a nearest-neighbour classifier that employs prototypes generated by the "leader" clustering algorithm. The leader algorithm is a single-pass algorithm and is linear in both time and computational complexity. The prototype set alone is used as training data. In the process of developing the algorithm, domain knowledge of the handwritten data under study is made...
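The abstract only outlines the scheme: select prototypes with a single-pass leader clustering, then run AdaBoost with a 1-nearest-neighbour classifier on those prototypes as the weak learner. A minimal Python sketch of that general idea is shown below; the function names (leader_clustering, adaboost_with_leaders), the distance threshold, the weighted-resampling step, and the {-1, +1} label convention are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def leader_clustering(X, threshold):
    """Single-pass leader clustering: a point joins the first leader within
    `threshold` distance, otherwise it becomes a new leader itself."""
    leaders = []                      # indices of prototype points
    for i, x in enumerate(X):
        for j in leaders:
            if np.linalg.norm(x - X[j]) <= threshold:
                break
        else:
            leaders.append(i)         # no existing leader is close enough
    return np.array(leaders)

def one_nn_predict(protos_X, protos_y, X):
    """1-nearest-neighbour prediction against the prototype set."""
    d = np.linalg.norm(X[:, None, :] - protos_X[None, :, :], axis=2)
    return protos_y[np.argmin(d, axis=1)]

def adaboost_with_leaders(X, y, threshold, n_rounds=10, seed=0):
    """AdaBoost-style loop whose weak learner is a 1-NN classifier built on
    leader prototypes drawn from a weighted resample of the training data.
    Assumes labels y are in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)           # per-example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        # weighted resample so high-weight examples shape the prototypes
        idx = rng.choice(n, size=n, replace=True, p=w)
        Xs, ys = X[idx], y[idx]
        proto_idx = leader_clustering(Xs, threshold)
        protos_X, protos_y = Xs[proto_idx], ys[proto_idx]
        pred = one_nn_predict(protos_X, protos_y, X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)  # misclassified examples gain weight
        w /= w.sum()
        learners.append((protos_X, protos_y))
        alphas.append(alpha)

    def predict(Xq):
        votes = sum(a * one_nn_predict(px, py, Xq)
                    for a, (px, py) in zip(alphas, learners))
        return np.sign(votes)
    return predict
```

Because each round trains only on the prototype set rather than the full data, the cost of the nearest-neighbour weak learner scales with the (much smaller) number of leaders, which is the point the abstract makes about handling a large handwritten dataset.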
Type Conference
Year 2004
Where HIS
Authors T. Ravindra Babu, M. Narasimha Murty, Vijay K. Agrawal