ICASSP 2011, IEEE

Maximum marginal likelihood estimation for nonnegative dictionary learning

We describe an alternative to standard nonnegative matrix factorisation (NMF) for nonnegative dictionary learning. NMF with the Kullback-Leibler divergence can be seen as maximisation of the joint likelihood of the dictionary and the expansion coefficients under Poisson observation noise. This approach lacks optimality because the number of parameters (which include the expansion coefficients) grows with the number of observations. We therefore describe a variational EM algorithm for optimisation of the marginal likelihood, i.e., the likelihood of the dictionary where the expansion coefficients have been integrated out (given a conjugate Gamma prior). We compare the output of both maximum joint likelihood estimation (i.e., standard NMF) and maximum marginal likelihood estimation (MMLE) on real and synthetic data. The MMLE approach is shown to embed automatic model order selection, similar to automatic relevance determination.
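For context on the baseline being compared against, KL-divergence NMF (the maximum joint likelihood estimator under Poisson noise) is commonly fit with multiplicative updates. The sketch below illustrates that standard baseline only; the function name and parameters are illustrative, and this is not the paper's variational EM algorithm for MMLE:

```python
import numpy as np

def kl_nmf(V, K, n_iter=200, seed=0):
    """Multiplicative-update NMF minimising the generalised KL divergence
    D(V || WH) = sum(V*log(V/WH) - V + WH), i.e. maximum joint likelihood
    of (W, H) under Poisson observation noise. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    F, N = V.shape
    W = rng.random((F, K)) + 1e-3   # nonnegative dictionary
    H = rng.random((K, N)) + 1e-3   # nonnegative expansion coefficients
    eps = 1e-12                     # guard against division by zero
    for _ in range(n_iter):
        # Update dictionary: W <- W * ((V/WH) H^T) / (1 H^T)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1) + eps)
        # Update coefficients: H <- H * (W^T (V/WH)) / (W^T 1)
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
    return W, H
```

Note how the number of free parameters (the columns of H) grows with the number of observations N, which is the consistency issue the marginal likelihood approach avoids by integrating H out under a Gamma prior.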
Onur Dikmen, Cédric Févotte
Type: Journal
Year: 2011
Where: ICASSP
Authors: Onur Dikmen, Cédric Févotte