Adaptor grammars (Johnson et al., 2007b) are a non-parametric Bayesian extension of Probabilistic Context-Free Grammars (PCFGs) which in effect learn the probabilities of entire s...
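To make the contrast with ordinary PCFGs concrete, here is a minimal sketch, not Johnson et al.'s implementation: a toy PCFG sampler plus a Chinese-restaurant-process cache over one adapted nonterminal, so that previously generated expansions are reused with probability proportional to how often they have been produced before. The grammar, the adapted symbol "Word", and the concentration parameter alpha are illustrative assumptions; the cache stores yields rather than full trees for brevity.

import random
from collections import defaultdict

# Toy PCFG: each nonterminal maps to (right-hand side, probability) pairs.
# Grammar and probabilities are illustrative, not taken from the cited paper.
PCFG = {
    "Word": [(("Stem", "Suffix"), 1.0)],
    "Stem": [(("w",), 0.4), (("wa",), 0.3), (("wal",), 0.3)],
    "Suffix": [(("k",), 0.5), (("ks",), 0.25), (("ked",), 0.25)],
}

def sample_pcfg(symbol):
    """Sample a string from the toy PCFG by expanding rules top-down."""
    if symbol not in PCFG:                     # terminal symbol
        return symbol
    rules, probs = zip(*PCFG[symbol])
    rhs = random.choices(rules, weights=probs)[0]
    return "".join(sample_pcfg(s) for s in rhs)

# Adaptor-grammar-style caching for one adapted nonterminal: cached yields are
# reused with probability proportional to their past counts (CRP with
# concentration alpha); otherwise the base PCFG generates a fresh yield.
cache = defaultdict(int)
alpha = 1.0

def sample_adapted(symbol="Word"):
    total = sum(cache.values())
    if total > 0 and random.random() < total / (total + alpha):
        yields, counts = zip(*cache.items())
        y = random.choices(yields, weights=counts)[0]   # reuse a cached yield
    else:
        y = sample_pcfg(symbol)                         # generate and cache a new one
    cache[y] += 1
    return y

if __name__ == "__main__":
    random.seed(0)
    print([sample_adapted() for _ in range(10)])  # repeats become increasingly likely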
This article demonstrates the potential of using hierarchical Bayesian methods to relate models and data in the cognitive sciences. This is done using a worked example that consid...
coarse procedures or very abstract frames from an algorithmic point of view, because crucial issues such as the representation, evolution, storage, and learning process of conc...
Generative models of pattern individuality attempt to represent the distribution of observed quantitative features, e.g., by learning parameters from a database, and then use such...
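As a minimal sketch of this idea (under assumptions not in the abstract: a synthetic feature database and a single multivariate Gaussian as the generative model), the parameters of the feature distribution are estimated from the database and then used to score how typical a newly observed pattern is.

import numpy as np

# Hypothetical feature database: rows are observed quantitative feature vectors.
rng = np.random.default_rng(0)
database = rng.normal(loc=[2.0, -1.0], scale=[1.0, 0.5], size=(500, 2))

# Learn the generative model's parameters from the database (ML estimates).
mu = database.mean(axis=0)
cov = np.cov(database, rowvar=False)

def log_density(x, mu, cov):
    """Log density of a feature vector under the learned Gaussian model."""
    d = x - mu
    cov_inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ cov_inv @ d + logdet + len(x) * np.log(2 * np.pi))

# Use the learned distribution to score observed patterns.
print(log_density(np.array([2.1, -0.9]), mu, cov))  # typical of the database
print(log_density(np.array([8.0, 3.0]), mu, cov))   # unusual, hence more individualizing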
This paper presents an extensive evaluation, on artificial datasets, of EDY, an unsupervised algorithm for automatically synthesizing a Structured Hidden Markov Model (S-HMM) from ...