ICPR 2008, IEEE

Prototype learning with margin-based conditional log-likelihood loss

The classification performance of nearest-prototype classifiers relies largely on the prototype learning algorithm, such as learning vector quantization (LVQ) or minimum classification error (MCE) training. This paper proposes a new prototype learning algorithm based on minimizing a conditional log-likelihood loss (CLL), called log-likelihood of margin (LOGM). A regularization term is added to avoid over-fitting during training. The CLL loss in LOGM is a convex function of the margin and thus converges better than the MCE algorithm. Our empirical study on a large suite of benchmark datasets demonstrates that the proposed algorithm yields higher accuracies than MCE, generalized LVQ (GLVQ), and the soft nearest prototype classifier (SNPC).
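The loss described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the margin is the squared distance to the nearest rival-class prototype minus the squared distance to the nearest same-class prototype, uses a logistic-style convex loss of that margin, and adds an L2 penalty on the prototypes; the scale parameter `gamma` and the regularization weight `lam` are hypothetical names.

```python
import numpy as np

def logm_loss(x, y, prototypes, proto_labels, gamma=1.0, lam=0.01):
    """Sketch of a margin-based conditional log-likelihood loss for one
    sample. Assumed margin form: squared distance to the nearest rival
    prototype minus squared distance to the nearest genuine prototype."""
    d = np.sum((prototypes - x) ** 2, axis=1)   # squared Euclidean distances
    same = proto_labels == y
    d_genuine = d[same].min()                   # nearest same-class prototype
    d_rival = d[~same].min()                    # nearest other-class prototype
    margin = d_rival - d_genuine                # positive => correctly classified
    loss = np.log1p(np.exp(-gamma * margin))    # convex, CLL-style loss of margin
    reg = lam * np.sum(prototypes ** 2)         # L2 term against over-fitting
    return loss + reg
```

Because the loss is convex and differentiable in the margin, prototypes can be updated by plain gradient descent on this objective, which is the convergence advantage the abstract claims over the non-convex MCE criterion.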
Added 05 Nov 2009
Updated 06 Nov 2009
Type Conference
Year 2008
Where ICPR
Authors Cheng-Lin Liu, Xiaobo Jin, Xinwen Hou