ICDAR 2009 (IEEE)

Unsupervised Selection and Discriminative Estimation of Orthogonal Gaussian Mixture Models for Handwritten Digit Recognition

Determining the appropriate number of components is an important problem in finite mixture modeling for pattern classification. This paper applies an unsupervised clustering method, AutoClass, to the training of Orthogonal Gaussian Mixture Models (OGMMs): the number of components in the OGMM of each class is selected by AutoClass. In this way, the OGMM structures of different classes need not be identical, as they are in the usual modeling scheme, so the dissimilarity between the data distributions of different classes can be described more precisely. After model selection is completed, a discriminative learning framework for Bayesian classifiers called Max-Min posterior Pseudo-probabilities (MMP) is employed to estimate the component parameters of each class's OGMM. We apply the proposed OGMM learning approach to handwritten digit recognition. Experimental results on the MNIST database show the effectiveness of our approach.
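The pipeline the abstract describes, per-class component selection followed by parameter estimation and maximum-likelihood classification, can be sketched roughly as follows. This is a minimal illustration under substitute assumptions, not the authors' method: BIC-based selection stands in for AutoClass, plain EM stands in for MMP discriminative estimation, and a diagonal-covariance GMM stands in for a true OGMM (which applies an orthogonal transform per component). All function names are hypothetical.

```python
# Hypothetical sketch: per-class diagonal-covariance GMMs whose component
# count is chosen independently per class (here by BIC, as a stand-in for
# AutoClass), then classification by maximum class-conditional likelihood.
# The paper's MMP discriminative estimation is NOT implemented; standard
# EM is used instead.
import numpy as np

def fit_diag_gmm(X, k, iters=50, seed=0):
    """Fit a k-component diagonal-covariance GMM to X with EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]          # init means from data
    var = np.tile(X.var(axis=0) + 1e-6, (k, 1))      # shared initial variances
    pi = np.full(k, 1.0 / k)                         # uniform mixing weights
    for _ in range(iters):
        # E-step: log responsibilities under each diagonal Gaussian
        logp = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                        + np.log(2 * np.pi * var)).sum(-1)
                + np.log(pi + 1e-300))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ (X ** 2)) / nk[:, None] - mu ** 2 + 1e-6
    return pi, mu, var

def log_likelihood(X, params):
    """Per-sample log-likelihood under a diagonal GMM (logsumexp over comps)."""
    pi, mu, var = params
    logp = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                    + np.log(2 * np.pi * var)).sum(-1)
            + np.log(pi + 1e-300))
    m = logp.max(axis=1, keepdims=True)
    return m.squeeze(1) + np.log(np.exp(logp - m).sum(axis=1))

def select_and_fit(X, k_max=4):
    """Pick the component count per class by BIC (stand-in for AutoClass)."""
    n, d = X.shape
    best = None
    for k in range(1, k_max + 1):
        params = fit_diag_gmm(X, k)
        ll = log_likelihood(X, params).sum()
        n_params = (k - 1) + 2 * k * d               # weights + means + vars
        bic = -2 * ll + n_params * np.log(n)         # lower is better
        if best is None or bic < best[0]:
            best = (bic, params)
    return best[1]
```

Because `select_and_fit` is run separately on each class's training data, different classes can end up with different numbers of components, which is the structural flexibility the abstract emphasizes; a new sample is then assigned to the class whose fitted model gives it the highest log-likelihood.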
Added: 21 May 2010
Updated: 21 May 2010
Type: Conference
Year: 2009
Where: ICDAR
Authors: Xuefeng Chen, Xiabi Liu, Yunde Jia