NECO
2002

Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM

In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and "Long Short-Term Memory" (LSTM) recurrent nets on context-free and context-sensitive languages. Rodriguez (2001) examined the learning ability of simple recurrent nets (SRNs) (Elman, 1990) on simple context-free and context-sensitive languages (CFLs and CSLs). He trained his SRN on short training sequences of length not much greater than 10. He found that the SRN does not generalize well on significantly longer test sequences and is frequently unable even to store the training set. Similar results were recently reported by Bod
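The canonical benchmark languages in this line of work are a^n b^n (context-free) and a^n b^n c^n (context-sensitive). A minimal sketch of how such short training strings and longer test strings might be generated, assuming these languages and the rough length bound mentioned in the abstract (the paper's exact training setup may differ):

```python
# Hypothetical data sketch: generating strings from the canonical
# CFL a^n b^n and CSL a^n b^n c^n discussed in this literature.

def anbn(n):
    """Context-free string a^n b^n."""
    return "a" * n + "b" * n

def anbncn(n):
    """Context-sensitive string a^n b^n c^n."""
    return "a" * n + "b" * n + "c" * n

# Short training strings (length not much greater than 10, per the abstract).
train_cfl = [anbn(n) for n in range(1, 6)]      # lengths 2..10
train_csl = [anbncn(n) for n in range(1, 4)]    # lengths 3..9

# Significantly longer strings probe generalization at test time.
test_cfl = [anbn(n) for n in range(20, 23)]
```

Generalization is then measured by whether a net trained only on the short strings accepts or predicts the longer test strings correctly.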
Jürgen Schmidhuber, Felix A. Gers, Douglas Eck
Added 22 Dec 2010
Updated 22 Dec 2010
Type Journal
Year 2002
Where NECO
Authors Jürgen Schmidhuber, Felix A. Gers, Douglas Eck