
KDD 2006 (ACM)

New EM derived from Kullback-Leibler divergence

We introduce a new EM framework in which it is possible to optimize not only the model parameters but also the number of model components. A key feature of our approach is that we use nonparametric density estimation to improve parametric density estimation within the EM framework. While the classical EM algorithm estimates model parameters empirically from the data points themselves, we estimate them from nonparametric density estimates. Many applications require optimal adjustment of the number of model components. We present experimental results in two domains. One is polygonal approximation of laser range data, an active research topic in robot navigation. The other is grouping of edge pixels into contour boundaries, which remains an unsolved problem in computer vision.

Categories and Subject Descriptors: I.5 [Pattern Recognition]: General
General Terms: Algorithms, Performance, Experimentation
Keywords: EM, Expectation Maximization, Kullback-Leibler divergenc...
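For context, the classical baseline the abstract contrasts against can be sketched as follows. This is a minimal, generic EM implementation for a 1-D Gaussian mixture with a fixed number of components, where the E- and M-steps use the raw data points directly; it is not the paper's KDE-based variant, and the function name, quantile-based initialization, and parameterization are illustrative choices, not taken from the paper.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50):
    """Classical EM for a 1-D Gaussian mixture with k components.

    Parameters are re-estimated empirically from the data points
    themselves on every M-step -- the behavior the paper's
    nonparametric variant is designed to improve on.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Deterministic init: spread the means across the data's quantiles,
    # share the overall variance, start with uniform mixture weights.
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        d2 = (x[:, None] - mu[None, :]) ** 2
        log_p = -0.5 * (np.log(2 * np.pi * var) + d2 / var) + np.log(w)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        # directly from the responsibility-weighted data points.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * d2).sum(axis=0) / nk
    return w, mu, var
```

Note that k is fixed throughout: this baseline has no mechanism for adjusting the number of components, which is exactly the limitation the abstract's framework addresses.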
Longin Jan Latecki, Marc Sobel, Rolf Lakämper
Added 30 Nov 2009
Updated 30 Nov 2009
Type Conference