Sciweavers

57 search results - page 1 / 12
» On Characterization of Entropy Function via Information Inequalities
TIT 1998
On Characterization of Entropy Function via Information Inequalities
Given n discrete random variables Ω = {X_1, ..., X_n}, associated with any subset α of {1, 2, ..., n}, there is a joint entropy H(X_α) where X_α = {X_i : i ∈ α}. This can be viewed as a f...
Zhen Zhang, Raymond W. Yeung
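For context (a standard framing, not quoted from the truncated abstract above): the entropy function assigns to each subset α ⊆ {1, ..., n} the value H(X_α), and the basic Shannon-type information inequalities say this set function is nondecreasing and submodular, e.g.

  H(X_\alpha) + H(X_\beta) \;\ge\; H(X_{\alpha \cup \beta}) + H(X_{\alpha \cap \beta}).

Characterizing the entropy function via information inequalities asks whether inequalities of this kind are the only constraints such a set function must satisfy.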
CORR 2007, Springer
A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems...
Olivier Rioul
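For reference (the standard statement of the EPI, not taken from this abstract): for independent random vectors X and Y in R^n with densities,

  N(X + Y) \;\ge\; N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},

where h denotes differential entropy; as the title indicates, the proof given in this paper proceeds through properties of mutual information.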
JMLR 2010
Collaborative Filtering via Rating Concentration
While most popular collaborative filtering methods use low-rank matrix factorization and parametric density assumptions, this article proposes an approach based on distribution-fr...
Bert Huang, Tony Jebara
CAS 2008
A Novel Method for Measuring the Structural Information Content of Networks
In this paper we first present a novel approach to determine the structural information content (graph entropy) of a network represented by an undirected and connected graph. Such...
Matthias Dehmer
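As a sketch of the general construction behind such graph-entropy measures (the specific information functional used in the paper may differ): assign each vertex v of G = (V, E) a nonnegative value f(v), form the distribution p(v) = f(v) / \sum_{u \in V} f(u), and define

  I_f(G) = -\sum_{v \in V} p(v) \log p(v),

so that different choices of f (degree-based, distance-based, etc.) yield different measures of structural information content.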
MP 2010
Two row mixed-integer cuts via lifting
Recently, Andersen et al. [1], Borozan and Cornuéjols [7], and Cornuéjols and Margot [10] characterized extreme inequalities of a system of two rows with two free integer varia...
Santanu S. Dey, Laurence A. Wolsey
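The two-row model referred to here, sketched from the standard formulation used in the cited works rather than quoted from this abstract, is

  x = f + \sum_{j=1}^{k} r^j s_j, \qquad x \in \mathbb{Z}^2, \; s_j \ge 0,

with f \in \mathbb{Q}^2 \setminus \mathbb{Z}^2 and r^j \in \mathbb{Q}^2; valid inequalities take the form \sum_j \psi(r^j) s_j \ge 1, and lifting concerns how to extend \psi when some of the s_j are additionally required to be integral.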