Student models for Intelligent Computer Assisted Language Learning (ICALL) have largely focused on the acquisition of grammatical structures. In this paper, we motivate a broader p...
Consistent with the ultimate goals of AGI, we can expect that deductive consequences of large and grammatically varied text bases would not be generated by sequential application...
The ability to compress sentences while preserving their grammaticality and most of their meaning has recently received much attention. Our work views sentence compression as an o...
We propose a new model of human concept learning that provides a rational analysis of learning feature-based concepts. This model is built upon Bayesian inference for a gramma...
Noah D. Goodman, Joshua B. Tenenbaum, Jacob Feldma...
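As a hedged illustration of the kind of model the abstract above describes, the sketch below scores rule-like concept hypotheses by a Bayesian posterior that trades a complexity prior against fit to labelled examples. The hypothesis space, the complexity penalty, and the noise parameter are all invented here for illustration; the paper's model instead defines the prior with a probabilistic grammar over a richer concept language.

```python
# Minimal sketch: Bayesian scoring of rule-like concept hypotheses.
# Hypothesis space, prior, and noise level are invented for illustration only.
import math
from itertools import combinations

# Toy training data: binary feature vectors with positive/negative labels.
data = [((1, 1, 0), True), ((1, 0, 0), True), ((0, 1, 1), False), ((0, 0, 1), False)]

def hypotheses(n_features):
    """Enumerate conjunctions of (feature == value) tests with 1 or 2 literals."""
    literals = [(f, v) for f in range(n_features) for v in (0, 1)]
    for size in (1, 2):
        for combo in combinations(literals, size):
            yield combo

def predicts(hyp, x):
    """A conjunction labels x positive iff every literal test holds."""
    return all(x[f] == v for f, v in hyp)

def log_posterior(hyp, data, noise=0.1):
    # Prior: longer rules are exponentially less probable (complexity penalty).
    log_prior = -len(hyp) * math.log(4.0)
    # Likelihood: each observed label agrees with the rule with probability 1 - noise.
    log_like = sum(math.log(1 - noise) if predicts(hyp, x) == y else math.log(noise)
                   for x, y in data)
    return log_prior + log_like

best = max(hypotheses(3), key=lambda h: log_posterior(h, data))
print(best, log_posterior(best, data))  # shortest rule consistent with the data wins
```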
We report grammar inference experiments on partially parsed sentences taken from the Wall Street Journal corpus using the inside-outside algorithm for stochastic context-free gram...
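The experiments above rely on the inside-outside algorithm for stochastic context-free grammars. As a hedged sketch of the quantity that algorithm is built on, the code below computes inside probabilities for a toy PCFG in Chomsky normal form; the grammar, rule probabilities, and sentence are invented, and the outside pass, rule-count re-estimation, and the bracketing constraints used in the paper are omitted.

```python
# Minimal sketch: inside probabilities for a toy PCFG in Chomsky normal form.
# beta[(i, j, A)] = P(A derives words[i:j]); summing over split points and
# binary rules is the dynamic program that inside-outside re-estimation uses.
from collections import defaultdict

# Invented grammar: binary rules P(A -> B C) and lexical rules P(A -> w).
binary = {("S", ("NP", "VP")): 1.0,
          ("NP", ("Det", "N")): 1.0,
          ("VP", ("V", "NP")): 1.0}
lexical = {("Det", "the"): 1.0,
           ("N", "dog"): 0.5, ("N", "cat"): 0.5,
           ("V", "saw"): 1.0}

def inside_probs(words):
    n = len(words)
    beta = defaultdict(float)
    # Base case: spans of length 1 are covered by lexical rules.
    for i, w in enumerate(words):
        for (A, word), p in lexical.items():
            if word == w:
                beta[(i, i + 1, A)] += p
    # Longer spans combine two adjacent sub-spans via binary rules.
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            for k in range(i + 1, j):
                for (A, (B, C)), p in binary.items():
                    beta[(i, j, A)] += p * beta[(i, k, B)] * beta[(k, j, C)]
    return beta

if __name__ == "__main__":
    sent = "the dog saw the cat".split()
    print(inside_probs(sent)[(0, len(sent), "S")])  # sentence probability under S
```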