
ASRU 2013

K-component recurrent neural network language models using curriculum learning

Conventional n-gram language models are known for their limited ability to capture long-distance dependencies and their brittleness with respect to within-domain variations. In this paper, we propose a k-component recurrent neural network language model trained with curriculum learning (CLKRNNLM) to address within-domain variations. On a Dutch-language corpus, we investigate three curriculum learning methods that exploit dedicated component models for specific sub-domains. In an oracle setting in which context information is known during testing, we experimentally test three hypotheses. The first is that domain-dedicated models perform better than general models on their specific domains. The second is that curriculum learning can be used to train recurrent neural network language models (RNNLMs) from general patterns to specific patterns. The third is that curriculum learning, used as an implicit weighting method to adjust the relative contributions of general and specifi...
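To make the curriculum idea in the abstract concrete, here is a minimal, hypothetical Python sketch (not the authors' code): a single model is trained first on general-domain text and then on progressively more specific sub-domain text, so that general patterns are learned before specific ones. The RNNLM internals are stubbed out; SimpleRNNLM, curriculum_train, and the toy corpora are illustrative names, not from the paper.

import random

class SimpleRNNLM:
    """Stand-in for a recurrent neural network language model."""
    def train_step(self, sentence):
        # In a real RNNLM, one gradient update on `sentence` would go here.
        pass

def curriculum_train(model, general_corpus, subdomain_corpora, epochs_per_stage=2):
    """Train from general patterns to specific patterns, one stage at a time."""
    stages = [general_corpus] + list(subdomain_corpora)  # general -> specific
    for stage_idx, corpus in enumerate(stages):
        for _ in range(epochs_per_stage):
            data = list(corpus)
            random.shuffle(data)  # shuffle within a stage, never across stages
            for sentence in data:
                model.train_step(sentence)
        print(f"finished curriculum stage {stage_idx}: {len(corpus)} sentences")

if __name__ == "__main__":
    # Toy usage: one general corpus followed by two sub-domain corpora.
    general = ["the cat sat down", "a dog ran home"]
    subdomains = [["stock prices rose sharply"], ["the patient recovered well"]]
    curriculum_train(SimpleRNNLM(), general, subdomains)

In the paper's k-component setup, each sub-domain would additionally get its own dedicated component model; the sketch above shows only the general-to-specific training order that curriculum learning imposes.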
Type: Workshop paper
Year: 2013
Where: ASRU (IEEE Workshop on Automatic Speech Recognition and Understanding)
Authors: Yangyang Shi, Martha Larson, Catholijn M. Jonker