Abstract. We apply Long Short-Term Memory (LSTM) recurrent neural networks to a large corpus of unprompted speech: the German part of the VERBMOBIL corpus. Training first on a fra...
Nicole Beringer, Alex Graves, Florian Schiel, J&uu...
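The entry above applies LSTM networks to speech. As a hedged illustration of the core mechanism only (not the paper's actual model; all sizes and weights here are arbitrary), a single LSTM step can be sketched in numpy: gates decide what the cell state forgets, stores, and emits.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 4, 8  # hypothetical sizes, for illustration only

# One weight matrix per gate, acting on the concatenation [input; previous hidden]
def init():
    return rng.standard_normal((n_hid, n_in + n_hid)) * 0.1

W_f, W_i, W_o, W_g = init(), init(), init(), init()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    """One LSTM step: gates control forgetting, writing, and reading the cell."""
    z = np.concatenate([x, h])
    f = sigmoid(W_f @ z)   # forget gate
    i = sigmoid(W_i @ z)   # input gate
    o = sigmoid(W_o @ z)   # output gate
    g = np.tanh(W_g @ z)   # candidate cell update
    c = f * c + i * g      # cell state carries long-range memory
    h = o * np.tanh(c)     # hidden state / output
    return h, c

h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):         # run over a short random input sequence
    h, c = lstm_step(rng.standard_normal(n_in), h, c)
print(h.shape)  # (8,)
```

The additive cell update `c = f * c + i * g` is what lets gradients survive over long input sequences, which is the property the abstract relies on for speech.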
Existing Recurrent Neural Networks (RNNs) are limited in their ability to model dynamical systems with nonlinearities and hidden internal states. Here we use our general framework...
Recurrent neural networks are theoretically capable of learning complex temporal sequences, but training them through gradient descent is too slow and unstable for practical use i...
An efficient training method for block-diagonal recurrent neural networks is proposed. The method modifies the RPROP algorithm, originally developed for static models, in order to...
Paris A. Mastorocostas, Dimitris N. Varsamis, Cons...
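The entry above builds on RPROP, which adapts a separate step size per weight from gradient sign changes and ignores gradient magnitude. As a hedged sketch of plain RPROP on a toy static problem (not the paper's block-diagonal recurrent variant; the quadratic objective and all constants are illustrative):

```python
import numpy as np

# Toy objective: minimize f(w) = sum((w - target)^2); only grad signs are used
target = np.array([1.5, -2.0, 0.5])

def grad(w):
    return 2.0 * (w - target)

w = np.zeros(3)
step = np.full(3, 0.1)            # individual step size per weight
prev_g = np.zeros(3)
eta_plus, eta_minus = 1.2, 0.5    # standard RPROP growth/shrink factors
for _ in range(100):
    g = grad(w)
    same = g * prev_g             # >0: sign kept, <0: sign flipped
    step = np.where(same > 0, np.minimum(step * eta_plus, 50.0),
           np.where(same < 0, np.maximum(step * eta_minus, 1e-6), step))
    w -= np.sign(g) * step        # move by the adapted step, against the sign
    prev_g = g
print(np.round(w, 2))
```

Steps grow while the gradient sign is stable and shrink after an overshoot, which is why RPROP converges without any learning-rate tuning on problems like this.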
— Neural networks are used in a wide range of fields including signal and image processing, modeling and control, and pattern recognition. Some of the most common types of neural ...
Raveesh Kiran, Sandhya R. Jetti, Ganesh K. Venayag...
— A solution to the slow convergence of most learning rules for Recurrent Neural Networks (RNN) has been proposed under the terms Liquid State Machines (LSM) and Echo State Netw...
David Verstraeten, Benjamin Schrauwen, Dirk Stroob...
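The entry above concerns Echo State Networks, where the recurrent weights stay fixed and random and only a linear readout is trained. A minimal hedged sketch of that idea (not the paper's setup; reservoir size, spectral radius, the sine task, and the ridge constant are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100  # hypothetical sizes

# Fixed random input and reservoir weights -- never trained
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave
t = np.arange(400)
u = np.sin(0.2 * t)
X = run_reservoir(u[:-1])
y = u[1:]

washout = 50  # discard the initial transient
A = X[washout:]
# Only this linear readout is trained (ridge regression in closed form)
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ y[washout:])

pred = X @ W_out
err = np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2))
print(f"readout RMSE: {err:.4f}")
```

Because training reduces to one linear regression, this sidesteps the slow gradient-based RNN training that the snippet describes LSM/ESN as a solution to.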
— This study proposes to generalize Hebbian learning by identifying and synchronizing the dynamical regimes of individual nodes in a recurrent network. The connection weights are ...
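The entry above generalizes Hebbian learning. For context, plain Hebbian updating (`dw = lr * y * x`) grows weights without bound; Oja's rule adds a decay term that normalizes them. A hedged toy sketch of that baseline (not the study's synchronization-based rule; the data and learning rate are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data with one dominant direction (column scales 3.0 > 1.0 > 0.3)
X = rng.standard_normal((2000, 3)) @ np.diag([3.0, 1.0, 0.3])

w = rng.standard_normal(3) * 0.01  # single linear neuron
lr = 0.005
for x in X:
    y = w @ x
    w += lr * y * (x - y * w)      # Oja's rule: Hebbian term minus decay

# w settles near unit norm, aligned with the dominant data direction
print(np.round(np.abs(w), 2))
```

The decay term `-lr * y**2 * w` is what keeps the norm bounded, and the converged weight vector approximates the data's first principal component.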
Abstract. Recurrent neural networks (RNNs) have proved effective at one dimensional sequence learning tasks, such as speech and online handwriting recognition. Some of the properti...
A neural network controller for improved fuel efficiency of the Toyota Prius hybrid electric vehicle is proposed. A new method to detect and mitigate a battery fault is also pres...
In this paper we present neuro-evolution of neural network controllers for mobile agents in a simulated environment. The controller is obtained through evolution of hypercube encod...