Given n discrete random variables Ω = {X1, …, Xn}, associated with any subset α of {1, 2, …, n}, there is a joint entropy H(Xα) where Xα = {Xi : i ∈ α}. This can be viewed as a f...
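The notion above can be illustrated empirically: given joint samples of (X1, …, Xn), the joint entropy of any index subset α is the Shannon entropy of the restricted tuples. A minimal sketch (the function name `joint_entropy` and the two-fair-bits example are illustrative, not from the abstract):

```python
import itertools
from collections import Counter
from math import log2

def joint_entropy(samples, subset):
    """Empirical joint entropy H(X_alpha) of the variables indexed by
    `subset`, estimated from joint samples (tuples of outcomes)."""
    counts = Counter(tuple(s[i] for i in subset) for s in samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two independent fair bits X1, X2 (uniform joint distribution):
# H({X1}) = 1 bit, H({X1, X2}) = 2 bits.
samples = list(itertools.product([0, 1], repeat=2))
print(joint_entropy(samples, (0,)))    # 1.0
print(joint_entropy(samples, (0, 1)))  # 2.0
```

Note that H(Xα) is monotone in α: adding variables to the subset can only increase (or preserve) the joint entropy.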
While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon’s entropy power inequality (EPI) seems...
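For reference, the standard statement of Shannon's EPI for independent random vectors X and Y in R^n with densities is:

```latex
% Shannon's entropy power inequality: for independent X, Y in R^n
% with differential entropies h(X), h(Y),
N(X + Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
% with equality iff X and Y are Gaussian with proportional covariances.
```

Unlike most entropy inequalities, the EPI does not follow from submodularity-type properties of entropy alone, which is what makes it exceptional.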
While most popular collaborative filtering methods use low-rank matrix factorization and parametric density assumptions, this article proposes an approach based on distribution-fr...
In this paper we first present a novel approach to determine the structural information content (graph entropy) of a network represented by an undirected and connected graph. Such...
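The abstract's exact graph-entropy definition is not shown here; as a hedged illustration of the general idea, one simple, widely used notion of structural information is the Shannon entropy of a graph's degree distribution (the function `degree_entropy` below is illustrative, not the paper's measure):

```python
from collections import Counter
from math import log2

def degree_entropy(edges, n):
    """Shannon entropy of the degree distribution of an undirected graph
    on vertices 0..n-1 -- a simple degree-based structural-information
    measure (illustrative; not necessarily the paper's definition)."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    counts = Counter(deg)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Star K_{1,3}: one hub of degree 3, three leaves of degree 1 -> positive entropy.
print(degree_entropy([(0, 1), (0, 2), (0, 3)], 4))
# Cycle C_4: all degrees equal -> zero entropy (maximally regular graph).
print(degree_entropy([(0, 1), (1, 2), (2, 3), (3, 0)], 4))
```

Regular graphs minimize such degree-based measures, matching the intuition that a highly symmetric network carries little structural information.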
Recently, Andersen et al. [1], Borozan and Cornuéjols [7] and Cornuéjols and Margot [10] characterized extreme inequalities of a system of two rows with two free integer varia...