Sciweavers

Search results for "Importance Sampling for Continuous Time Bayesian Networks"
ALT 2004, Springer
Relative Loss Bounds and Polynomial-Time Predictions for the k-lms-net Algorithm
We consider a two-layer network algorithm. The first layer consists of an uncountable number of linear units. Each linear unit is an LMS algorithm whose inputs are first “kerne...
Mark Herbster
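The snippet describes per-unit LMS updates applied to kernelized inputs. As background only, here is a minimal sketch of an online kernel-LMS learner (a generic illustration with an assumed RBF kernel and learning rate, not the k-lms-net algorithm analysed in the paper):

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian RBF kernel (assumed here purely for illustration)."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def kernel_lms(stream, eta=0.1, kernel=rbf_kernel):
    """Online kernelized LMS-style learner.

    Keeps the inputs seen so far and their dual coefficients; each step
    applies a gradient correction proportional to the prediction error.
    """
    centers, alphas = [], []
    for x, y in stream:
        # Current prediction: weighted sum of kernel evaluations.
        pred = sum(a * kernel(c, x) for a, c in zip(alphas, centers))
        # LMS update in the dual: store the scaled error as a new coefficient.
        centers.append(x)
        alphas.append(eta * (y - pred))
    return centers, alphas
```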
NIPS 2003
On the Concentration of Expectation and Approximate Inference in Layered Networks
We present an analysis of concentration-of-expectation phenomena in layered Bayesian networks that use generalized linear models as the local conditional probabilities. This frame...
XuanLong Nguyen, Michael I. Jordan
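For context, a generalized linear model used as a local conditional in a layered network is conventionally written in exponential-family form with a natural parameter that is linear in the parent layer (a standard textbook formulation, not a result from the paper):

$$p(x_i \mid \mathrm{pa}(x_i)) = h(x_i)\,\exp\!\big(\eta_i\, x_i - A(\eta_i)\big), \qquad \eta_i = \theta_i^{\top}\,\mathrm{pa}(x_i),$$

where pa(x_i) denotes the parents of x_i in the previous layer, A is the log-partition function, and θ_i are the unit's weights.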
UAI 2008
Hybrid Variational/Gibbs Collapsed Inference in Topic Models
Variational Bayesian inference and (collapsed) Gibbs sampling are two of the most important classes of inference algorithms for Bayesian networks. Both have their advantages and disadvant...
Max Welling, Yee Whye Teh, Bert Kappen
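The abstract contrasts variational inference with collapsed Gibbs sampling. For orientation, a bare-bones collapsed Gibbs update for an LDA-style topic model looks roughly like the following (a generic sketch with assumed hyperparameters alpha and beta, not the hybrid scheme proposed in the paper):

```python
import numpy as np

def collapsed_gibbs_sweep(z, docs, n_dk, n_kw, n_k, alpha, beta, V, rng):
    """One sweep of collapsed Gibbs sampling for an LDA-style topic model.

    z[d][i] is the topic of token i in document d; n_dk, n_kw, n_k are the
    document-topic, topic-word, and topic count tables kept in sync with z.
    """
    K = n_k.shape[0]
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k_old = z[d][i]
            # Remove the token's current assignment from the counts.
            n_dk[d, k_old] -= 1; n_kw[k_old, w] -= 1; n_k[k_old] -= 1
            # Conditional over topics with all other assignments fixed.
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            k_new = rng.choice(K, p=p / p.sum())
            # Add the token back under its newly sampled topic.
            n_dk[d, k_new] += 1; n_kw[k_new, w] += 1; n_k[k_new] += 1
            z[d][i] = k_new
```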
ESA 2010, Springer
Estimating the Average of a Lipschitz-Continuous Function from One Sample
We study the problem of estimating the average of a Lipschitz continuous function f defined over a metric space, by querying f at only a single point. More specifically, we explore...
Abhimanyu Das, David Kempe
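The estimation problem in this entry has a simple worst-case reading: by the Lipschitz property, the average of f differs from f(p) by at most L times the average distance from p, so querying a point with small average distance to the rest bounds the error. A toy sketch under that (assumed) reading, for a finite metric space given as a distance matrix, and not the estimators studied in the paper:

```python
import numpy as np

def one_sample_average_estimate(dist, f_query, L):
    """Estimate the average of an L-Lipschitz f on a finite metric space
    from a single query.

    dist is an n-by-n distance matrix; f_query(i) evaluates f at point i.
    Querying the point with the smallest mean distance to all points gives
    a worst-case error of at most L times that mean distance.
    """
    mean_dist = dist.mean(axis=1)
    p = int(np.argmin(mean_dist))        # most "central" point
    estimate = f_query(p)                # the single allowed query
    error_bound = L * mean_dist[p]       # Lipschitz worst-case guarantee
    return estimate, error_bound

# Example: points on a line with the usual metric, f(x) = x (Lipschitz constant 1).
pts = np.array([0.0, 1.0, 2.0, 10.0])
dist = np.abs(pts[:, None] - pts[None, :])
print(one_sample_average_estimate(dist, lambda i: pts[i], L=1.0))
```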
AMAI 2008, Springer
Bayesian learning of Bayesian networks with informative priors
This paper presents and evaluates an approach to Bayesian model averaging where the models are Bayesian nets (BNs). Prior distributions are defined using stochastic logic programs...
Nicos Angelopoulos, James Cussens
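Bayesian model averaging, as referenced in the abstract, combines the predictions of candidate models weighted by their posterior probabilities. A minimal, generic sketch follows (the model objects, priors, and likelihoods are placeholders, not the stochastic-logic-program priors over Bayesian-net structures used in the paper):

```python
import numpy as np

def model_average(models, priors, log_likelihoods, predict):
    """Generic Bayesian model averaging.

    models: list of candidate models (treated as opaque objects here).
    priors: prior probability of each model.
    log_likelihoods: log p(data | model) for each model.
    predict: function mapping a model to its prediction for a query.
    Returns the posterior-weighted average prediction.
    """
    log_post = np.log(priors) + np.array(log_likelihoods)
    log_post -= log_post.max()                 # stabilise the exponentiation
    weights = np.exp(log_post)
    weights /= weights.sum()                   # p(model | data)
    preds = np.array([predict(m) for m in models])
    return float(np.dot(weights, preds))
```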