Sciweavers

9 search results - page 1 / 2
CORR 2011 (Springer)
Mutual Information, Relative Entropy, and Estimation in the Poisson Channel
Let X be a non-negative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γ · X), for a parameter γ ≥ 0. We identify a natural...
Rami Atar, Tsachy Weissman
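The channel model in this abstract is easy to simulate; a minimal sketch (the exponential input distribution and the parameter values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 3.0  # channel scaling parameter, gamma >= 0

# Non-negative input X (exponential is an arbitrary illustrative choice)
x = rng.exponential(scale=1.0, size=100_000)

# Poisson channel: Y | X = x  ~  Poisson(gamma * x)
y = rng.poisson(gamma * x)

# Sanity check on the simulation: E[Y] = gamma * E[X] by the tower rule
print(y.mean(), gamma * x.mean())
```

The tower-rule check at the end confirms the sampler matches the stated conditional law in expectation.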
TIT 2008
Mutual Information and Conditional Mean Estimation in Poisson Channels
Following the discovery of a fundamental connection between information measures and estimation measures in Gaussian channels, this paper explores the counterpart of thos...
Dongning Guo, Shlomo Shamai, Sergio Verdú
DCC 2008 (IEEE)
The Rate-Distortion Function of a Poisson Process with a Queueing Distortion Measure
This paper presents a proof of the rate-distortion function of a Poisson process with a queueing distortion measure that is in complete analogy with the proofs associated with the ...
Todd P. Coleman, Negar Kiyavash, Vijay G. Subraman...
TIT 2010
Mismatched estimation and relative entropy
A random variable with distribution P is observed in Gaussian noise and is estimated by a minimum mean-square estimator that assumes that the distribution is Q. This paper shows tha...
Sergio Verdú
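The mismatch setup in this abstract can be sketched with a Monte Carlo experiment; a minimal illustration (the two-point prior, the SNR value, and the closed-form conditional mean are illustrative assumptions, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(1)
snr = 2.0
n = 200_000

# True prior P: X = +1 with prob 0.5; mismatched prior Q assumes prob 0.8
p_plus, q_plus = 0.5, 0.8

x = rng.choice([1.0, -1.0], size=n, p=[p_plus, 1 - p_plus])
y = np.sqrt(snr) * x + rng.standard_normal(n)  # Gaussian-noise observation

def cond_mean(y, prior_plus):
    # E[X | Y = y] for a {+1, -1} prior under Gaussian noise:
    # posterior odds are proportional to prior * exp(+/- sqrt(snr) * y)
    w_plus = prior_plus * np.exp(np.sqrt(snr) * y)
    w_minus = (1 - prior_plus) * np.exp(-np.sqrt(snr) * y)
    return (w_plus - w_minus) / (w_plus + w_minus)

mse_matched = np.mean((x - cond_mean(y, p_plus)) ** 2)
mse_mismatched = np.mean((x - cond_mean(y, q_plus)) ** 2)
print(mse_matched, mse_mismatched)
```

As expected, the estimator built on the wrong prior Q incurs a strictly larger mean-square error than the matched estimator.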
CORR 2010 (Springer)
Concavity of Mutual Information Rate for Input-Restricted Finite-State Memoryless Channels at High SNR
We consider a finite-state memoryless channel with i.i.d. channel state and the input Markov process supported on a mixing finite-type constraint. We discuss the asymptotic behavio...
Guangyue Han, Brian H. Marcus