We describe a framework for inducing probabilistic grammars from corpora of positive samples. First, samples are incorporated by adding ad-hoc rules to a working grammar; subseque...
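The incorporation step lends itself to a short illustration. The sketch below uses hypothetical names and adds one ad-hoc rule per distinct sample, expanding the start symbol with probability proportional to the sample's count; the subsequent generalization step referred to in the abstract is omitted.

```python
from collections import Counter

def incorporate_samples(samples):
    """Build a trivial working grammar: one ad-hoc rule S -> sample per
    distinct sample, with probability proportional to its count.
    (Illustrative sketch only; generalization of the rules is not shown.)"""
    counts = Counter(tuple(s.split()) for s in samples)
    total = sum(counts.values())
    # Each rule is a (lhs, rhs, probability) triple.
    return [("S", rhs, n / total) for rhs, n in counts.items()]

grammar = incorporate_samples(["the dog barks", "the cat sleeps", "the dog barks"])
for lhs, rhs, p in grammar:
    print(f"{lhs} -> {' '.join(rhs)}  [{p:.2f}]")
```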
This paper concerns learning and prediction with probabilistic models where the domain sizes of latent variables have no a priori upper bound. Current approaches represent prior d...
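The abstract is cut off before the paper's own representation is described, so the following is a generic illustration only, not the paper's method: a Chinese restaurant process is one standard prior over a latent variable whose domain has no fixed upper bound.

```python
import random

def crp_assignments(n, alpha=1.0, seed=0):
    """Chinese restaurant process: sample cluster labels for n items without
    fixing the number of clusters in advance. (Generic illustration of an
    unbounded latent domain; not necessarily the paper's representation.)"""
    rng = random.Random(seed)
    counts = []   # counts[k] = number of items already in cluster k
    labels = []
    for i in range(n):
        weights = counts + [alpha]      # existing clusters plus a new one
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        if k == len(counts):            # a new cluster was opened
            counts.append(0)
        counts[k] += 1
        labels.append(k)
    return labels

print(crp_assignments(10))
```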
This study investigates Bayes classification of online Arabic characters using histograms of tangent differences and Gibbs modeling of the class-conditional probability density fun...
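A minimal sketch of the pipeline as described, with hypothetical function names and made-up parameters: tangent angles are computed along the pen trajectory, their successive differences are histogrammed, and the Bayes decision picks the class whose Gibbs-style class-conditional model gives the highest posterior. The per-class energy parameters below stand in for quantities that would be learned from data.

```python
import numpy as np

def tangent_difference_histogram(xy, bins=8):
    """Histogram of successive tangent-angle differences along an online
    trajectory xy of shape (n_points, 2)."""
    d = np.diff(xy, axis=0)
    angles = np.arctan2(d[:, 1], d[:, 0])
    diffs = np.angle(np.exp(1j * np.diff(angles)))   # wrap into (-pi, pi]
    hist, _ = np.histogram(diffs, bins=bins, range=(-np.pi, np.pi), density=True)
    return hist

def bayes_classify(hist, lambdas, priors):
    """Gibbs-style class-conditional model p(h|c) proportional to
    exp(-lambda_c . h); returns the MAP class."""
    log_post = {c: np.log(priors[c]) - lambdas[c] @ hist for c in lambdas}
    return max(log_post, key=log_post.get)

# Toy usage with random parameters for two character classes.
rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(size=(50, 2)), axis=0)
h = tangent_difference_histogram(traj)
lambdas = {"alif": rng.normal(size=8), "ba": rng.normal(size=8)}
print(bayes_classify(h, lambdas, {"alif": 0.5, "ba": 0.5}))
```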
We propose to model the image differentials of astrophysical source maps with a Student's t-distribution and to use these models as priors in the Bayesian source separation method. We int...
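The prior itself can be written down directly. Below is a minimal sketch (assuming SciPy and made-up hyperparameter values, not values from the paper) that evaluates a Student's t log-prior on the horizontal and vertical pixel differences of a source map; a smoother map receives a higher log-prior.

```python
import numpy as np
from scipy.stats import t as student_t

def t_prior_logpdf(source_map, nu=3.0, scale=0.1):
    """Log-prior of a source map under independent Student's t models for its
    horizontal and vertical pixel differences (nu and scale are assumed)."""
    dx = np.diff(source_map, axis=1)
    dy = np.diff(source_map, axis=0)
    return (student_t.logpdf(dx, df=nu, scale=scale).sum()
            + student_t.logpdf(dy, df=nu, scale=scale).sum())

smooth = np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
noisy = smooth + np.random.default_rng(0).normal(scale=0.2, size=smooth.shape)
print(t_prior_logpdf(smooth), ">", t_prior_logpdf(noisy))
```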
We propose a simple generative, syntactic language model that conditions on overlapping windows of tree context (or treelets) in the same way that n-gram language models condition...
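By analogy with n-gram estimation, the idea can be sketched as relative-frequency estimates of a node's expansion given a small window of surrounding tree context. The toy code below is a simplification under assumed conventions (trees as nested tuples, a context window of the last two ancestor labels), not the paper's actual treelet definition.

```python
from collections import defaultdict

def rules(tree, ancestors=()):
    """Yield (context, rule) pairs from a tree given as (label, child, ...),
    where the context is the last two ancestor labels (a toy treelet window)."""
    label, children = tree[0], tree[1:]
    if children and isinstance(children[0], tuple):
        yield (ancestors[-2:], (label, tuple(c[0] for c in children)))
        for c in children:
            yield from rules(c, ancestors + (label,))

def train(trees):
    """Relative-frequency estimate of P(rule | treelet context)."""
    counts, totals = defaultdict(int), defaultdict(int)
    for t in trees:
        for ctx, rule in rules(t):
            counts[(ctx, rule)] += 1
            totals[ctx] += 1
    return {k: c / totals[k[0]] for k, c in counts.items()}

toy = ("S", ("NP", ("DT",), ("NN",)), ("VP", ("VB",)))
print(train([toy]))
```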