Sciweavers

530 search results for "Dirichlet Process Mixtures of Generalized Linear Models"
SIGIR 2003 (ACM)
On an equivalence between PLSI and LDA
Latent Dirichlet Allocation (LDA) is a fully generative approach to language modelling which overcomes the inconsistent generative semantics of Probabilistic Latent Semantic Index...
Mark Girolami, Ata Kabán
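For a concrete picture of the topic model discussed in the entry above, here is a minimal LDA sketch using scikit-learn's LatentDirichletAllocation. The toy corpus and hyperparameters are assumptions made purely for illustration; this is not the PLSI/LDA equivalence analysis of the paper itself.

```python
# Minimal LDA topic-modeling sketch (illustrative only; the toy corpus and
# hyperparameters are assumptions, not taken from the paper).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "dirichlet process mixture models",
    "generalized linear models for regression",
    "topic models for text collections",
    "latent semantic indexing of documents",
]

# Bag-of-words counts: the input representation both PLSI and LDA operate on.
counts = CountVectorizer().fit_transform(docs)

# LDA places Dirichlet priors on the document-topic and topic-word distributions.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)   # per-document topic proportions
print(doc_topics.round(2))
```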
DAGM 2010 (Springer)
Gaussian Mixture Modeling with Gaussian Process Latent Variable Models
Density modeling is notoriously difficult for high dimensional data. One approach to the problem is to search for a lower dimensional manifold which captures the main characteristi...
Hannes Nickisch, Carl Edward Rasmussen
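The entry above builds a density model through a lower-dimensional latent space. A rough sketch of that idea follows, with PCA standing in for the GP-LVM purely to keep the example short; this substitution is an assumption, not the method of the paper.

```python
# Illustrative sketch: density modeling via a low-dimensional latent space.
# PCA stands in for the GP-LVM here for brevity (an assumption).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                         # toy high-dimensional data
X[:, :2] += rng.choice([-3.0, 3.0], size=(500, 2))     # two clusters in a 2-D subspace

latent = PCA(n_components=2).fit_transform(X)          # lower-dimensional manifold
gmm = GaussianMixture(n_components=2, covariance_type="full").fit(latent)

print(gmm.means_)          # cluster centres in the latent space
print(gmm.score(latent))   # average log-likelihood under the mixture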
ICIP 2007 (IEEE)
Faithful Shape Representation for 2D Gaussian Mixtures
It has been recently discovered that a faithful representation for the shape of some simple distributions can be constructed using invariant statistics [1, 2]. In this paper, we c...
Mireille Boutin, Mary I. Comer
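As a toy illustration of the object studied above, the sketch below samples a 2-D Gaussian mixture and computes plain central moments as a crude shape summary. The invariant statistics of the paper are not reproduced here; the moments are only a stand-in.

```python
# Toy sketch: sample a 2-D Gaussian mixture and summarise its shape with
# empirical second moments (a stand-in, not the paper's invariants).
import numpy as np

rng = np.random.default_rng(1)
weights = np.array([0.6, 0.4])
means = np.array([[0.0, 0.0], [3.0, 1.0]])
covs = np.array([np.eye(2), [[1.0, 0.5], [0.5, 1.0]]])

# Sample component labels, then draw from the chosen Gaussian.
labels = rng.choice(2, size=2000, p=weights)
samples = np.array([rng.multivariate_normal(means[k], covs[k]) for k in labels])

centered = samples - samples.mean(axis=0)
second_moments = centered.T @ centered / len(samples)   # empirical covariance
print(second_moments)
```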
ICASSP 2008 (IEEE)
Unsupervised language model adaptation via topic modeling based on named entity hypotheses
Language model (LM) adaptation is often achieved by combining a generic LM with a topic-specific model that is more relevant to the target document. Unlike previous work on unsup...
Yang Liu, Feifan Liu
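Combining a generic LM with a topic-specific LM is commonly done by linear interpolation. The sketch below shows that standard scheme on toy unigram models; the interpolation weight and probabilities are made up, and the paper's actual combination may differ.

```python
# Sketch of LM adaptation by linear interpolation of a generic and a
# topic-specific model (toy unigram probabilities; illustrative only).

def interpolate(p_generic, p_topic, lam=0.3):
    """Return the adapted distribution lam*topic + (1-lam)*generic."""
    vocab = set(p_generic) | set(p_topic)
    return {w: lam * p_topic.get(w, 0.0) + (1 - lam) * p_generic.get(w, 0.0)
            for w in vocab}

p_generic = {"the": 0.5, "bank": 0.1, "river": 0.1, "loan": 0.3}
p_topic   = {"the": 0.4, "bank": 0.3, "loan": 0.3}   # finance-topic model

p_adapted = interpolate(p_generic, p_topic)
print(sorted(p_adapted.items(), key=lambda kv: -kv[1]))
```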
JMLR 2010
Bayesian Generalized Kernel Models
We propose a fully Bayesian approach for generalized kernel models (GKMs), which are extensions of generalized linear models in the feature space induced by a reproducing kernel. ...
Zhihua Zhang, Guang Dai, Donghui Wang, Michael I. ...
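As a minimal sketch of the GLM-in-RKHS idea behind the entry above, the code below fits only the Gaussian-likelihood special case: kernel ridge regression, the MAP estimate under a Gaussian prior on the function values. The paper's fully Bayesian treatment and non-Gaussian likelihoods are not shown; the RBF kernel, noise level, and data are illustrative assumptions.

```python
# Gaussian-likelihood special case of a generalized kernel model:
# kernel ridge regression as a MAP estimate (illustrative sketch only).
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

K = rbf_kernel(X, X)
noise = 0.1 ** 2
alpha = np.linalg.solve(K + noise * np.eye(len(X)), y)   # dual coefficients

X_test = np.linspace(-3, 3, 5)[:, None]
f_test = rbf_kernel(X_test, X) @ alpha                    # posterior-mean prediction
print(f_test)
```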