
ALT 2002, Springer

Abstract. We consider a problem that is related to the "Universal Encoding Problem" from information theory. The basic goal is to find rules that map "partial information" about a distribution X over an m-letter alphabet into a guess X̂ for X such that the Kullback-Leibler divergence between X and X̂ is as small as possible. The cost associated with a rule is the maximal expected Kullback-Leibler divergence between X and X̂. First, we show that the cost associated with the well-known add-one rule equals ln(1 + (m − 1)/(n + 1)), thereby extending a result of Forster and Warmuth [3, 2] to m ≥ 3. Second, we derive an absolute (as opposed to asymptotic) lower bound on the smallest possible cost. Technically, this is done by determining (almost exactly) the Bayes error of the add-one rule with a uniform prior (where the asymptotics for n → ∞ was known before). Third, we hint to tools from approximation theory and support the conjecture that there exists a rule whose cost asympt...
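To make the objects in the abstract concrete, here is a minimal sketch of the add-one rule and the Kullback-Leibler divergence it is scored by. The counts and the "true" distribution below are hypothetical illustration data, not from the paper; the cost formula ln(1 + (m − 1)/(n + 1)) is included as a comment for reference.

```python
import math

def add_one_estimate(counts, m):
    """Add-one (Laplace) rule: p_hat_i = (n_i + 1) / (n + m),
    where n is the sample size and m the alphabet size."""
    n = sum(counts)
    return [(c + 1) / (n + m) for c in counts]

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: m = 3 letters, n = 4 observed draws.
counts = [3, 1, 0]
p_hat = add_one_estimate(counts, m=3)   # [4/7, 2/7, 1/7]

# A hypothetical "true" distribution X to score the guess against.
p_true = [0.7, 0.2, 0.1]
d = kl_divergence(p_true, p_hat)        # always >= 0

# The paper's cost of the add-one rule: ln(1 + (m - 1)/(n + 1)).
cost = math.log(1 + (3 - 1) / (4 + 1))
```

Note that the zero count for the third letter still receives positive mass under the add-one rule, which is what keeps the KL divergence finite for every possible X.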

Added:   15 Mar 2010
Updated: 15 Mar 2010
Type:    Conference
Year:    2002
Where:   ALT
Authors: Dietrich Braess, Jürgen Forster, Tomas Sauer, Hans-Ulrich Simon
