
CISS 2008, IEEE

Abstract—In this paper we present an information-theoretic estimator for the number of mutually disjoint sources in a linear mixing model. The approach follows the Minimum Description Length (MDL) prescription, and the criterion is roughly the sum of the negative normalized maximum log-likelihood and the logarithm of the number of sources. Preliminary numerical evidence supports this approach and compares favorably to both the Akaike (AIC) and Bayesian (BIC) Information Criteria.

I. THE MIXING MODEL AND SIGNALS

Consider the following mixing model:

x_d(t) = Σ_{l=1}^{L} a_{d,l} s_l(t) + n_d(t) , 1 ≤ d ≤ D , 1 ≤ t ≤ T    (1)

This model corresponds to an instantaneous linear mixing model with L sources and D sensors. We will frequently use the vector notation X(t) = (x_1(t), . . . , x_D(t))^T and the matrix A = (a_{d,l}). In this paper the following assumptions are made:
1) (H1) Noise signals (n_d)_{1≤d≤D} are Gaussian i.i.d. with zero mean and unknown variance σ²;
2) (H2) Source signals are unknown, but at every moment t at most ...
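To make the notation concrete, here is a minimal NumPy sketch of the mixing model in equation (1), together with generic shapes of the three model-order criteria named in the abstract. The dimensions D, L, T, the noise level sigma, and the function name `criteria` are illustrative choices, not values from the paper; the AIC/BIC lines are only textbook forms, and the MDL-style line follows the rough description in the abstract, not the paper's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
D, L, T = 4, 3, 1000  # sensors, sources, samples (illustrative values)

A = rng.standard_normal((D, L))          # mixing matrix (a_{d,l}), unknown in practice
S = rng.standard_normal((L, T))          # source signals s_l(t)
sigma = 0.1                              # noise standard deviation (hypothesis H1)
N = sigma * rng.standard_normal((D, T))  # Gaussian i.i.d. noise n_d(t)

X = A @ S + N                            # observed sensor signals x_d(t), per eq. (1)

def criteria(neg_loglik, k, T):
    """Generic information-criterion shapes for model order k (textbook forms)."""
    aic = 2.0 * neg_loglik + 2.0 * k
    bic = 2.0 * neg_loglik + k * np.log(T)
    # MDL-style form as described in the abstract: negative normalized
    # maximum log-likelihood plus the logarithm of the number of sources.
    mdl = neg_loglik / T + np.log(k)
    return aic, bic, mdl
```

In each case the estimated number of sources would be the candidate order k that minimizes the chosen criterion.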


Added: 29 May 2010
Updated: 29 May 2010
Type: Conference
Year: 2008
Where: CISS
Authors: Radu Balan
