
CORR, 2011, Springer

Let X be a non-negative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γ · X), for a parameter γ ≥ 0. We identify a natural loss function such that:

- The derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean loss in estimating X based on Y, regardless of the distribution of X.
- When X ∼ P is estimated based on Y by a mismatched estimator that would have minimized the expected loss had X ∼ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy between P and Q.

For a continuous-time setting where X^T = {X_t, 0 ≤ t ≤ T} is a non-negative stochastic process and the conditional law of Y^T = {Y_t, 0 ≤ t ≤ T}, given X^T, is that of a non-homogeneous Poisson process with intensity function γ · X^T, under the same loss function:

- The minimum mean loss in causal filtering when γ = γ_0 is equal to the expected value of the minimum m...
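The two scalar identities stated in the abstract can be sketched compactly as follows. This is only a schematic rendering of the claims above: ℓ denotes the loss function the authors identify (its explicit form is not given in this abstract), Y_γ denotes the output of the Poisson(γ · X) channel, and X̂^Q_γ denotes the estimator that is optimal under ℓ when X ∼ Q.

```latex
% Derivative of mutual information equals the minimum mean loss,
% for any distribution of X:
\frac{d}{d\gamma}\, I(X; Y_\gamma)
  = \mathbb{E}\!\left[\,\ell\!\left(X,\hat{X}^{P}_{\gamma}(Y_\gamma)\right)\right],
  \qquad X \sim P .

% Integrated excess mean loss of the mismatched estimator
% equals the relative entropy between P and Q:
\int_{0}^{\infty}
  \left(
    \mathbb{E}_P\!\left[\ell\!\left(X,\hat{X}^{Q}_{\gamma}(Y_\gamma)\right)\right]
    - \mathbb{E}_P\!\left[\ell\!\left(X,\hat{X}^{P}_{\gamma}(Y_\gamma)\right)\right]
  \right) d\gamma
  = D(P\,\|\,Q).
```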


Added: 13 May 2011
Updated: 13 May 2011
Type: Journal
Year: 2011
Where: CORR
Authors: Rami Atar, Tsachy Weissman
