Empirical divergence maximization for quantizer design: An analysis of approximation error

Empirical divergence maximization is an estimation method, similar to empirical risk minimization, in which the Kullback-Leibler divergence is maximized over a class of functions that induce probability distributions. We use this method as a design strategy for quantizers whose output will ultimately be used to make a decision about the quantizer's input. We derive the estimator's approximation error decay rate as a function of the resolution of a class of partitions known as recursive dyadic partitions. Coupled with earlier results, this shows that the estimator can converge to the theoretically optimal solution as fast as n^{-1}, where n is the number of training samples. The estimator is also capable of producing good approximations of optimal solutions that existing techniques cannot approximate.
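
Concretely, the procedure the abstract describes can be sketched in a few lines. What follows is a minimal, hypothetical Python illustration, not the paper's algorithm: it greedily grows a recursive dyadic partition of [0, 1), splitting a cell at its midpoint whenever the resulting gain in empirical Kullback-Leibler divergence exceeds a fixed penalty. The function names, the midpoint-splitting rule, and the additive penalty are all assumptions introduced here for illustration.

import numpy as np

def cell_divergence(p_mass, q_mass, eps=1e-12):
    # Contribution of a single cell to the empirical KL divergence D(P || Q).
    if p_mass <= 0.0:
        return 0.0
    return p_mass * np.log(p_mass / max(q_mass, eps))

def edm_partition(xp, xq, lo=0.0, hi=1.0, depth=0, max_depth=6, penalty=0.01):
    # Greedily grow a recursive dyadic partition of [lo, hi).
    # xp, xq: training samples under the two hypotheses, scaled to [0, 1).
    # A cell is split at its midpoint when the empirical divergence of the
    # two children exceeds that of the parent by more than `penalty`
    # (an assumed stand-in for the paper's complexity control).
    p_mass = float(np.mean((xp >= lo) & (xp < hi)))
    q_mass = float(np.mean((xq >= lo) & (xq < hi)))
    if depth >= max_depth:
        return [(lo, hi)]
    mid = 0.5 * (lo + hi)
    lp = float(np.mean((xp >= lo) & (xp < mid)))
    lq = float(np.mean((xq >= lo) & (xq < mid)))
    gain = (cell_divergence(lp, lq)
            + cell_divergence(p_mass - lp, q_mass - lq)
            - cell_divergence(p_mass, q_mass))
    if gain <= penalty:  # refinement not worth its complexity cost
        return [(lo, hi)]
    return (edm_partition(xp, xq, lo, mid, depth + 1, max_depth, penalty)
            + edm_partition(xp, xq, mid, hi, depth + 1, max_depth, penalty))

# Toy usage: two overlapping Beta distributions on (0, 1).
rng = np.random.default_rng(0)
xp = rng.beta(2, 5, size=2000)  # samples under hypothesis P
xq = rng.beta(5, 2, size=2000)  # samples under hypothesis Q
cells = edm_partition(xp, xq)
div = sum(cell_divergence(float(np.mean((xp >= a) & (xp < b))),
                          float(np.mean((xq >= a) & (xq < b))))
          for a, b in cells)
print(len(cells), "cells, empirical divergence =", round(div, 3), "nats")

Because the Kullback-Leibler divergence never decreases when a partition is refined, the penalty is what stops the recursion short of max_depth; it stands in for the trade-off between approximation error (partition resolution) and estimation error (finite n) that the paper analyzes.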
Type Conference
Year 2011
Where ICASSP (IEEE)
Authors Michael A. Lexa