The Fixed-Size Ordinally-Forgetting Encoding Method for Neural Network Language Models

In this paper, we propose the new fixed-size ordinally-forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE models the word order in a sequence using a simple ordinally-forgetting mechanism based on the positions of the words. In this work, we apply FOFE to feedforward neural network language models (FNN-LMs). Experimental results show that, without using any recurrent feedback, FOFE-based FNN-LMs can significantly outperform not only standard fixed-input FNN-LMs but also the popular recurrent neural network (RNN) LMs.
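To make the ordinally-forgetting mechanism concrete, below is a minimal NumPy sketch of the FOFE recursion, in which the code for a sequence is built as z_t = alpha * z_{t-1} + e_t, where e_t is the one-hot vector of the word at position t and alpha is a forgetting factor in (0, 1). The vocabulary size and the alpha value used here are illustrative choices, not values taken from the paper.

```python
import numpy as np

def fofe_encode(word_ids, vocab_size, alpha=0.7):
    """Encode a variable-length sequence of word ids into a single
    fixed-size vector of dimension vocab_size (the FOFE recursion
    z_t = alpha * z_{t-1} + e_t; alpha here is an illustrative value)."""
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z   # ordinally forget words seen earlier in the sequence
        z[w] += 1.0     # add the one-hot vector of the current word
    return z

# Word order is preserved: "A B C" and "C B A" receive different codes.
print(fofe_encode([0, 1, 2], vocab_size=3, alpha=0.5))  # [0.25 0.5  1.  ]
print(fofe_encode([2, 1, 0], vocab_size=3, alpha=0.5))  # [1.   0.5  0.25]
```

Because each earlier word is scaled by a further power of alpha, more recent words dominate the code while older words decay geometrically, which is how a fixed-size vector can (almost uniquely) represent a variable-length sequence.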
Type Conference
Year 2015
Where ACL
Authors Shiliang Zhang, Hui Jiang, Mingbin Xu, Junfeng Hou, Li-Rong Dai