Sciweavers

3143 search results, page 14 of 629
Search query: Minimization of entropy functionals
IM
2006
Estimating Entropy and Entropy Norm on Data Streams
We consider the problem of computing information-theoretic functions such as entropy on a data stream, using sublinear space. Our first result deals with a measure we call the ...
Amit Chakrabarti, Khanh Do Ba, S. Muthukrishnan
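The quantity this entry's algorithm approximates is the empirical entropy of a stream. A minimal sketch of that quantity, computed exactly with linear-space counts (the paper's contribution is estimating it in sublinear space, which this sketch does not attempt):

```python
import math
from collections import Counter

def empirical_entropy(stream):
    """Exact empirical entropy H = -sum_i (m_i/m) * log2(m_i/m),
    where m_i is the frequency of item i and m is the stream length.
    Uses space linear in the number of distinct items."""
    counts = Counter(stream)
    m = sum(counts.values())
    return -sum((c / m) * math.log2(c / m) for c in counts.values())

# A uniform stream over 4 symbols has entropy log2(4) = 2 bits.
print(empirical_entropy(["a", "b", "c", "d"] * 25))  # 2.0
```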
CORR
2010
Springer
A note on concentration of submodular functions
We survey a few concentration inequalities for submodular and fractionally subadditive functions of independent random variables, implied by the entropy method for self-bounding f...
Jan Vondrák
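Submodularity, the property the surveyed concentration inequalities apply to, is the diminishing-returns condition f(A ∪ {x}) − f(A) ≥ f(B ∪ {x}) − f(B) for all A ⊆ B and x ∉ B. A brute-force check on a toy coverage function (a standard example of a submodular function; the specific sets below are illustrative, not from the paper):

```python
from itertools import chain, combinations

def powerset(ground):
    return chain.from_iterable(combinations(ground, r) for r in range(len(ground) + 1))

def is_submodular(f, ground):
    """Brute-force check of diminishing returns:
    f(A | {x}) - f(A) >= f(B | {x}) - f(B) for all A subset of B, x not in B."""
    for B in map(set, powerset(ground)):
        for A in map(set, powerset(B)):
            for x in ground - B:
                if f(A | {x}) - f(A) < f(B | {x}) - f(B):
                    return False
    return True

# Coverage function: each ground element covers a set of items;
# f(S) is the number of items covered. Coverage functions are submodular.
covers = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
f = lambda S: len(set().union(*(covers[i] for i in S))) if S else 0
print(is_submodular(f, {1, 2, 3}))  # True
```

By contrast, a supermodular function such as f(S) = |S|² fails the same check.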
ICML
2007
IEEE
Information-theoretic metric learning
In this paper, we present an information-theoretic approach to learning a Mahalanobis distance function. We formulate the problem as that of minimizing the differential relative e...
Jason V. Davis, Brian Kulis, Prateek Jain, Suvrit ...
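The object learned in this entry is a Mahalanobis distance, parameterized by a positive semidefinite matrix A. A minimal sketch of the distance form itself (the learning procedure, which minimizes a differential relative entropy between Gaussians, is not shown):

```python
def mahalanobis_sq(x, y, A):
    """Squared Mahalanobis distance d_A(x, y) = (x - y)^T A (x - y).
    A must be positive semidefinite for this to be a valid pseudo-metric."""
    d = [xi - yi for xi, yi in zip(x, y)]
    Ad = [sum(A[i][j] * d[j] for j in range(len(d))) for i in range(len(d))]
    return sum(di * adi for di, adi in zip(d, Ad))

# With A = I this reduces to squared Euclidean distance.
I = [[1, 0], [0, 1]]
print(mahalanobis_sq([0, 0], [3, 4], I))  # 25
```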
IWANN
2001
Springer
Pattern Repulsion Revisited
Marques and Almeida [9] recently proposed a nonlinear data separation technique based on the maximum entropy principle of Bell and Sejnowski. The idea behind it is a pattern repulsion...
Fabian J. Theis, Christoph Bauer, Carlos Garc...
TIT
1998
On Characterization of Entropy Function via Information Inequalities
Given n discrete random variables Ω = {X1, …, Xn}, associated with any subset α of {1, 2, …, n} there is a joint entropy H(Xα) where Xα = {Xi : i ∈ α}. This can be viewed as a f...
Zhen Zhang, Raymond W. Yeung
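The abstract's point is that joint entropy defines a set function α ↦ H(Xα) on the subsets of {1, …, n}. A minimal sketch computing that set function from a given joint distribution (the toy pmf of two independent fair bits is illustrative, not from the paper):

```python
import math
from itertools import chain, combinations
from collections import defaultdict

def joint_entropies(pmf, n):
    """Given a joint pmf over n-tuples (outcome -> probability), return the
    set function alpha -> H(X_alpha) for every subset alpha of {0, ..., n-1},
    where H(X_alpha) is the entropy of the marginal on the coordinates in alpha."""
    subsets = chain.from_iterable(combinations(range(n), r) for r in range(n + 1))
    H = {}
    for alpha in subsets:
        marg = defaultdict(float)
        for outcome, p in pmf.items():
            marg[tuple(outcome[i] for i in alpha)] += p
        H[alpha] = -sum(p * math.log2(p) for p in marg.values() if p > 0)
    return H

# Two independent fair bits: H({0}) = H({1}) = 1 bit, H({0,1}) = 2 bits.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
H = joint_entropies(pmf, 2)
print(H[(0,)], H[(0, 1)])  # 1.0 2.0
```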