
ICONIP 2004

Variational Information Maximization for Neural Coding

Abstract. Mutual Information (MI) is a long-studied measure of coding efficiency, and many attempts have been made to apply it to population coding. However, exact maximization of MI is computationally intractable, and most previous studies redefine the criterion in terms of approximations. Recently we described the properties of a simple lower bound on MI [2]. Here we describe the bound optimization procedure for learning population codes in a simple point neural model. We compare our approach with other techniques that maximize approximations of MI, focusing on a comparison with the Fisher Information criterion.
Felix V. Agakov, David Barber
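
The bound referenced in [2] is commonly stated as a decoder-based variational lower bound on MI; the following is a minimal sketch of that form, under the assumption that q(x|y) denotes a tractable variational decoder approximating the true posterior p(x|y) (notation introduced here for illustration, not taken from the page).

% Sketch of the variational lower bound on mutual information:
% replacing the intractable posterior p(x|y) with any decoder q(x|y)
% can only decrease the expected log-likelihood term.
\begin{align}
I(X;Y) &= H(X) + \big\langle \ln p(x \mid y) \big\rangle_{p(x,y)} \\
       &\ge H(X) + \big\langle \ln q(x \mid y) \big\rangle_{p(x,y)}.
\end{align}
% The gap is the expected KL divergence
% \langle \mathrm{KL}\big( p(x \mid y) \,\|\, q(x \mid y) \big) \rangle_{p(y)} \ge 0,
% so the bound is tight when q(x|y) = p(x|y); maximizing the right-hand side
% over encoder and decoder gives a tractable surrogate for MI maximization.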
Added: 31 Oct 2010
Updated: 31 Oct 2010
Type: Conference
Year: 2004
Where: ICONIP
Authors: Felix V. Agakov, David Barber