Sciweavers

251 search results - page 24 / 51
» A Greedy EM Algorithm for Gaussian Mixture Learning
ML
2000
ACM
124 views · Machine Learning
15 years 1 month ago
Text Classification from Labeled and Unlabeled Documents using EM
This paper shows that the accuracy of learned text classifiers can be improved by augmenting a small number of labeled training documents with a large pool of unlabeled documents. ...
Kamal Nigam, Andrew McCallum, Sebastian Thrun, Tom...
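A minimal sketch of the core idea behind this entry, semi-supervised EM with a multinomial naive Bayes text model: labeled documents keep their labels fixed while unlabeled documents receive soft labels that are refit at each iteration. Function and variable names are illustrative, not the paper's code, and the document-term matrices are assumed to be plain word-count arrays.

# Illustrative sketch: EM over labeled + unlabeled word-count vectors
# with a multinomial naive Bayes classifier (not the authors' code).
import numpy as np

def nb_fit(X, post, alpha=1.0):
    """M-step: class priors and word probabilities from soft counts."""
    priors = post.sum(axis=0) + alpha
    priors /= priors.sum()
    word_counts = post.T @ X + alpha            # (n_classes, n_words)
    word_probs = word_counts / word_counts.sum(axis=1, keepdims=True)
    return priors, word_probs

def nb_posteriors(X, priors, word_probs):
    """E-step: P(class | document) under the naive Bayes model."""
    log_like = X @ np.log(word_probs).T + np.log(priors)
    log_like -= log_like.max(axis=1, keepdims=True)
    post = np.exp(log_like)
    return post / post.sum(axis=1, keepdims=True)

def semi_supervised_em(X_lab, y_lab, X_unlab, n_classes, n_iter=20):
    # Initialise from the labeled documents only.
    post_lab = np.eye(n_classes)[y_lab]
    priors, word_probs = nb_fit(X_lab, post_lab)
    X_all = np.vstack([X_lab, X_unlab])
    for _ in range(n_iter):
        # E-step: soft labels for unlabeled docs; labeled docs stay fixed.
        post_unlab = nb_posteriors(X_unlab, priors, word_probs)
        post_all = np.vstack([post_lab, post_unlab])
        # M-step: refit on the combined soft-labeled corpus.
        priors, word_probs = nb_fit(X_all, post_all)
    return priors, word_probs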
TASLP
2010
159 views
14 years 8 months ago
Under-Determined Reverberant Audio Source Separation Using a Full-Rank Spatial Covariance Model
This article addresses the modeling of reverberant recording environments in the context of under-determined convolutive blind source separation. We model the contribution of each ...
Ngoc Q. K. Duong, Emmanuel Vincent, Rémi Gr...
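A minimal sketch of the modeling idea in this entry, assuming the source variances v and full-rank spatial covariances R are already estimated (the paper estimates them with an EM algorithm): each source contributes v_j(n,f)·R_j(f) to the mixture covariance at a time-frequency bin, and source images are recovered with a multichannel Wiener filter.

# Illustrative sketch of separation under a full-rank spatial covariance
# model, with v and R assumed given (the paper estimates them by EM).
import numpy as np

def wiener_separate(X, v, R):
    """
    X : (F, N, I)     mixture STFT, I channels
    v : (J, F, N)     nonnegative source variances
    R : (J, F, I, I)  full-rank spatial covariance per source and frequency
    returns per-source image estimates, shape (J, F, N, I)
    """
    J, F, N = v.shape
    I = X.shape[-1]
    S = np.zeros((J, F, N, I), dtype=complex)
    for f in range(F):
        for n in range(N):
            # Mixture covariance: sum_j v_j(n, f) * R_j(f)
            Sigma_x = sum(v[j, f, n] * R[j, f] for j in range(J))
            Sigma_x_inv = np.linalg.inv(Sigma_x + 1e-9 * np.eye(I))
            for j in range(J):
                W = v[j, f, n] * R[j, f] @ Sigma_x_inv   # Wiener gain
                S[j, f, n] = W @ X[f, n]
    return S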
NIPS
2007
15 years 3 months ago
Using Deep Belief Nets to Learn Covariance Kernels for Gaussian Processes
We show how to use unlabeled data and a deep belief net (DBN) to learn a good covariance kernel for a Gaussian process. We first learn a deep generative model of the unlabeled da...
Ruslan Salakhutdinov, Geoffrey E. Hinton
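A minimal sketch of the idea in this entry: run standard Gaussian-process regression on features produced by a learned encoder instead of on the raw inputs. The encoder stands in for the DBN, and the RBF kernel over features is an assumed choice for illustration, not the paper's exact construction.

# Illustrative sketch: GP regression applied in a learned feature space.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(F_train, y_train, F_test, noise=1e-2):
    """Standard GP posterior mean/variance, computed on feature vectors."""
    K = rbf_kernel(F_train, F_train) + noise * np.eye(len(F_train))
    K_s = rbf_kernel(F_test, F_train)
    K_ss = rbf_kernel(F_test, F_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(K_ss) - (v ** 2).sum(axis=0)
    return mean, var

# F_train / F_test would be encoder(x) for some unsupervised feature map
# (the DBN in the paper); any fixed input-to-feature function works here.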
COLT
2005
Springer
15 years 7 months ago
On Spectral Learning of Mixtures of Distributions
We consider the problem of learning mixtures of distributions via spectral methods and derive a tight characterization of when such methods are useful. Specifically, given a mixt...
Dimitris Achlioptas, Frank McSherry
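A minimal sketch of a generic spectral approach to mixture learning, not this paper's specific algorithm: project the data onto the top-k singular directions, where well-separated component means remain separated, then cluster in that low-dimensional subspace. The k-means routine below is a bare-bones stand-in.

# Illustrative sketch: spectral projection followed by simple clustering.
import numpy as np

def spectral_project(X, k):
    """Project n x d data onto the span of the top-k right singular vectors."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T            # n x k projected data

def simple_kmeans(Z, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = Z[rng.choice(len(Z), k, replace=False)]
    for _ in range(n_iter):
        labels = ((Z[:, None] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        centers = np.array([Z[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels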
ICML
2007
ACM
16 years 2 months ago
Quadratically gated mixture of experts for incomplete data classification
We introduce quadratically gated mixture of experts (QGME), a statistical model for multi-class nonlinear classification. The QGME is formulated in the setting of incomplete data,...
Xuejun Liao, Hui Li, Lawrence Carin
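A minimal sketch of quadratic gating over linear experts, with parameters assumed to be already fitted; the paper learns everything jointly by EM and handles missing features analytically, which this sketch omits. Gating by Gaussian posteriors gives decision boundaries that are quadratic in the input.

# Illustrative sketch: prediction with quadratically gated linear experts
# (made-up, pre-fitted parameters; not the authors' estimation procedure).
import numpy as np

def gaussian_logpdf(x, mean, cov):
    d = len(mean)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

def qgme_predict(x, gate_means, gate_covs, gate_priors, expert_W, expert_b):
    """Gate responsibilities are Gaussian posteriors (quadratic in x);
    each expert is a linear classifier; outputs are mixed by the gate."""
    K = len(gate_priors)
    log_gate = np.array([np.log(gate_priors[k]) +
                         gaussian_logpdf(x, gate_means[k], gate_covs[k])
                         for k in range(K)])
    gate = np.exp(log_gate - log_gate.max())
    gate /= gate.sum()
    # Softmax outputs of each linear expert, mixed by the gating weights.
    logits = expert_W @ x + expert_b            # (K, n_classes)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    return gate @ probs                         # class probabilities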