ESWA
2006

An effective refinement strategy for KNN text classifier

Due to the exponential growth of documents on the Internet and the emergent need to organize them, the automated categorization of documents into predefined labels has received ever-increasing attention in recent years. A wide range of supervised learning algorithms has been introduced to deal with text classification. Among all these classifiers, K-Nearest Neighbors (KNN) is widely used in the text-categorization community because of its simplicity and efficiency. However, KNN still suffers from inductive biases or model misfits that result from its assumptions, such as the presumption that training data are evenly distributed among all categories. In this paper, we propose a new refinement strategy, which we call DragPushing, for the KNN classifier. Experiments on three benchmark evaluation collections show that DragPushing achieves a significant improvement in the performance of the KNN classifier. © 2005 Elsevier Ltd. All rights reserved.
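The abstract does not spell out the baseline classifier's mechanics, so as context, here is a minimal sketch of the kind of KNN text classifier the paper refines: documents are turned into term-frequency vectors, compared by cosine similarity, and a test document takes the majority label of its k most similar training documents. All names (`vectorize`, `knn_classify`) and the toy whitespace tokenizer are illustrative assumptions, not the paper's implementation, and this sketch does not include the DragPushing refinement itself.

```python
from collections import Counter
import math

def vectorize(text):
    # Toy bag-of-words term-frequency vector (lowercase, whitespace tokenizer).
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse Counter vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(train, doc, k=3):
    # train: list of (text, label) pairs.
    # Rank training documents by similarity to `doc`, then take the
    # majority label among the k nearest neighbors.
    v = vectorize(doc)
    sims = sorted(((cosine(vectorize(t), v), lbl) for t, lbl in train),
                  reverse=True)
    top_labels = [lbl for _, lbl in sims[:k]]
    return Counter(top_labels).most_common(1)[0][0]
```

Note that this plain majority vote is exactly where the skewed-training-data problem the abstract mentions shows up: categories with many training documents tend to dominate the neighbor set.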
Songbo Tan
Type Journal