How embedded memory in recurrent neural network architectures helps learning long-term temporal dependencies

Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks performs much better than conventional recurrent neural networks at learning certain simple long-term dependency problems. The intuitive explanation for this behavior is that the output memories of a NARX network can be manifested as jump-ahead connections in the time-unfolded network. These jump-ahead connections can propagate gradient information more efficiently, thus reducing the sensitivity of the network to long-term dependencies. This work gives empirical justification to our hypothesis that similar improvements in learning long-term dependencies can be achieved with other classes of recurrent neural network architectures simply by increasing the order of the embedded memory. In particular, we explore the impact of learning simple long-term dependency problems on three classes of recurrent...
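The abstract's core intuition (output memories turning into jump-ahead connections once the network is unfolded in time) can be illustrated with a minimal sketch of a NARX-style recurrence. This is not code from the paper; all names and sizes (W_in, W_fb, w_out, n_y, the hidden width) are illustrative assumptions. The point is only that the output y_t is fed back through n_y delay taps, so a gradient can cross n_y time steps through a single connection.

    # Minimal NARX-style recurrence sketch (illustrative, not the paper's code).
    # y_t depends on the current input and the n_y most recent outputs, so the
    # time-unfolded graph contains connections spanning n_y steps at once.
    import numpy as np

    rng = np.random.default_rng(0)

    n_in, n_hidden, n_y = 2, 8, 4        # n_y = order of the embedded output memory
    W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
    W_fb = rng.normal(scale=0.1, size=(n_hidden, n_y))  # taps over past outputs
    w_out = rng.normal(scale=0.1, size=n_hidden)

    def narx_step(x_t, y_hist):
        """One step: y_t = f(x_t, y_{t-1}, ..., y_{t-n_y})."""
        h = np.tanh(W_in @ x_t + W_fb @ y_hist)
        y_t = np.tanh(w_out @ h)
        # Shift the output-memory buffer, newest output first.
        y_hist = np.concatenate(([y_t], y_hist[:-1]))
        return y_t, y_hist

    y_hist = np.zeros(n_y)               # delayed outputs y_{t-1} ... y_{t-n_y}
    for t in range(20):
        x_t = rng.normal(size=n_in)
        y_t, y_hist = narx_step(x_t, y_hist)

Increasing n_y lengthens the span of these jump-ahead paths, which is the mechanism the paper generalizes to other architectures by raising the order of the embedded memory.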
Added 22 Dec 2010
Updated 22 Dec 2010
Type Journal
Year 1998
Where NN
Publisher Springer
Authors Tsungnan Lin, Bill G. Horne, C. Lee Giles