Sciweavers

4079 search results - page 110 / 816
» Neural Networks
GECCO 2005 (Springer)
Co-evolving recurrent neurons learn deep memory POMDPs
Recurrent neural networks are theoretically capable of learning complex temporal sequences, but training them by gradient descent is too slow and unstable for practical use i...
Faustino J. Gomez, Jürgen Schmidhuber
NCA 1998 (IEEE)
A Neural Network Model of a Communication Network with Information Servers
This paper models information flow in a communication network. The network consists of nodes that communicate with each other, and information servers that have a predominantly o...
Philippe De Wilde
NN 2000 (Springer)
Best approximation by Heaviside perceptron networks
In Lp-spaces with p ∈ [1, ∞) there exists a best approximation mapping to the set of functions computable by Heaviside perceptron networks with n hidden units; however, for p ∈ (1, ∞) ...
Paul C. Kainen, Věra Kůrková, Andrew Vogt
TNN 1998
Inductive inference from noisy examples using the hybrid finite state filter
Recurrent neural networks processing symbolic strings can be regarded as adaptive neural parsers. Given a set of positive and negative examples drawn from a given language,...
Marco Gori, Marco Maggini, Enrico Martinelli, Giov...
IJON 2008
Neural network construction and training using grammatical evolution
The term neural network evolution usually refers to network topology evolution, leaving the network's parameters to be trained using conventional algorithms. In this paper we ...
Ioannis G. Tsoulos, Dimitris Gavrilis, Euripidis G...