ICA 2004, Springer

ICA Using Kernel Entropy Estimation with NlogN Complexity

Abstract. Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDFs). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing non-parametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K²N).
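The O(N log N) scaling in the abstract comes from accelerating the kernel density estimate that underlies the entropy term. A standard way to achieve this is to bin the N samples onto a regular grid and convolve the bin counts with the kernel via the FFT, so the cost is O(N + B log B) for B grid bins instead of O(N²) for a naive pairwise kernel sum. The sketch below is an illustrative reconstruction of that general technique, not the authors' code; the function name, bin count, and bandwidth rule are assumptions.

```python
import numpy as np

def fast_kde_entropy(x, n_bins=512, bandwidth=None):
    """Plug-in differential entropy estimate H = -integral p log p,
    using a binned Gaussian kernel density estimate accelerated by the FFT.
    Binning costs O(N); the convolution costs O(n_bins log n_bins).
    Illustrative sketch only -- not the paper's exact estimator."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule of thumb for a Gaussian kernel (assumed choice)
        bandwidth = 1.06 * x.std() * n ** (-1.0 / 5.0)

    # Grid padded by 4 bandwidths so circular FFT wraparound is negligible
    lo, hi = x.min() - 4 * bandwidth, x.max() + 4 * bandwidth
    edges = np.linspace(lo, hi, n_bins + 1)
    dx = edges[1] - edges[0]
    counts, _ = np.histogram(x, bins=edges)

    # Gaussian kernel sampled on the grid, centred, then rotated so its
    # peak sits at index 0 for circular convolution
    grid = (np.arange(n_bins) - n_bins // 2) * dx
    kernel = np.exp(-0.5 * (grid / bandwidth) ** 2)
    kernel /= kernel.sum()
    kernel = np.fft.ifftshift(kernel)

    # FFT-based convolution: smoothed counts on the grid
    dens = np.fft.irfft(np.fft.rfft(counts) * np.fft.rfft(kernel), n=n_bins)

    # Normalize to a density and integrate -p log p over the grid
    pdf = np.clip(dens, 0.0, None) / (n * dx)
    mask = pdf > 0
    return -np.sum(pdf[mask] * np.log(pdf[mask])) * dx
```

For a standard normal source the true differential entropy is 0.5·log(2πe) ≈ 1.419, and the estimate above lands close to that for moderate sample sizes. In the ICA setting, an estimator of this kind would be evaluated once per source per iteration, giving the KN log N term; the K²N term comes from applying the K×K demixing matrix to the N samples.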
Added: 01 Jul 2010
Updated: 01 Jul 2010
Type: Conference
Year: 2004
Where: ICA
Authors: Sarit Shwartz, Michael Zibulevsky, Yoav Y. Schechner