ACL 2015

Recurrent Neural Network based Rule Sequence Model for Statistical Machine Translation

The inability to model long-distance dependencies has handicapped statistical machine translation (SMT) for years. In particular, the context-independence assumption makes it hard to capture dependencies between translation rules. In this paper, we introduce a novel recurrent neural network based rule sequence model that incorporates arbitrarily long contextual information when estimating the probabilities of rule sequences. Moreover, our model frees the translation model from keeping huge and redundant grammars, resulting in more efficient training and decoding. Experimental results show that our method achieves a 0.9-point BLEU gain over the baseline and a significant reduction in rule-table size for both phrase-based and hierarchical phrase-based systems.
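The core idea of the abstract, scoring a sequence of translation rules with a recurrent network so that the full history (rather than a fixed context window) conditions each rule probability, can be sketched as below. This is a minimal illustrative Elman-style RNN, not the authors' implementation; all sizes, weight names, and the function `rule_sequence_logprob` are assumptions.

```python
import numpy as np

# Illustrative sketch (assumed names/sizes, not the paper's code): an
# Elman RNN that scores a sequence of translation-rule IDs, analogous
# to an RNN language model over rules instead of words.

rng = np.random.default_rng(0)

VOCAB = 50    # number of distinct translation rules (assumed)
EMBED = 8     # rule-embedding size (assumed)
HIDDEN = 16   # recurrent-state size (assumed)

E = rng.normal(0, 0.1, (VOCAB, EMBED))       # rule embeddings
W_xh = rng.normal(0, 0.1, (EMBED, HIDDEN))   # input -> hidden
W_hh = rng.normal(0, 0.1, (HIDDEN, HIDDEN))  # hidden -> hidden
W_hy = rng.normal(0, 0.1, (HIDDEN, VOCAB))   # hidden -> rule scores

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def rule_sequence_logprob(rule_ids):
    """log P(r_1..r_n) = sum_t log P(r_t | r_<t); the entire rule
    history is compressed into the hidden state h, so the context
    length is unbounded rather than fixed."""
    h = np.zeros(HIDDEN)
    prev = np.zeros(EMBED)   # start-of-sequence input
    logp = 0.0
    for r in rule_ids:
        h = np.tanh(prev @ W_xh + h @ W_hh)
        p = softmax(h @ W_hy)
        logp += np.log(p[r])
        prev = E[r]          # feed the chosen rule back in
    return logp

score = rule_sequence_logprob([3, 17, 42, 5])
```

In a decoder, such a score would be one feature among the usual SMT features; the point illustrated here is only that the hidden state carries arbitrarily long rule context, which a fixed n-gram rule model cannot.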
Heng Yu, Xuan Zhu
Added: 13 Apr 2016
Updated: 13 Apr 2016
Type: Journal
Year: 2015
Where: ACL
Authors: Heng Yu, Xuan Zhu