Sciweavers

115 search results - page 4 / 23
» Bayesian Learning of Loglinear Models for Neural Connectivit...
CONNECTION 2004
Self-refreshing memory in artificial neural networks: learning temporal sequences without catastrophic forgetting
While humans forget gradually, highly distributed connectionist networks forget catastrophically: newly learned information often completely erases previously learned information. ...
Bernard Ans, Stephane Rousset, Robert M. French, S...
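A minimal sketch (not the paper's self-refreshing mechanism) of the interference effect the abstract describes: a small linear auto-associator is trained on one pattern set and then on a second, and recall error on the first set is measured before and after. All sizes, learning rates, and pattern counts are illustrative.

```python
# Hypothetical demo of interference in a distributed network: a linear auto-associator
# is trained on pattern set A, then on set B with no rehearsal; recall error on A is
# measured before and after. This is NOT the paper's self-refreshing memory method.
import numpy as np

rng = np.random.default_rng(0)
n = 50                                # number of units
W = np.zeros((n, n))                  # connection weights

def train(W, patterns, lr=0.01, epochs=200):
    """Delta-rule training of the auto-associator so that W @ x approximates x."""
    for _ in range(epochs):
        for x in patterns:
            W += lr * np.outer(x - W @ x, x)
    return W

def recall_error(W, patterns):
    return np.mean([np.mean((W @ x - x) ** 2) for x in patterns])

set_A = rng.choice([-1.0, 1.0], size=(5, n))
set_B = rng.choice([-1.0, 1.0], size=(5, n))

W = train(W, set_A)
print("error on A after learning A:", recall_error(W, set_A))
W = train(W, set_B)                   # sequential learning of B, no rehearsal of A
# Recall of A typically degrades noticeably; in nonlinear multi-layer networks trained
# by backpropagation the erasure is far more severe ("catastrophic").
print("error on A after learning B:", recall_error(W, set_A))
```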
IJON 2006
Attractor neural networks with patchy connectivity
The neurons in the mammalian visual cortex are arranged in columnar structures, and the synaptic contacts of the pyramidal neurons in layer II/III are clustered into patches that ...
Christopher Johansson, Martin Rehn, Anders Lansner
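A rough sketch, assuming a Hopfield-style formulation rather than anything specific to the paper, of an attractor network whose units are grouped into patches, with dense connectivity inside each patch and only sparse connections between patches. Patch counts, connection probabilities, and pattern numbers are invented for illustration.

```python
# Hypothetical attractor (Hopfield-type) network with patchy connectivity: units are
# grouped into patches; connections are dense within a patch and sparse between patches.
import numpy as np

rng = np.random.default_rng(1)
n_patches, patch_size = 10, 20
n = n_patches * patch_size
p_within, p_between = 1.0, 0.05            # illustrative connection probabilities

patch = np.repeat(np.arange(n_patches), patch_size)       # patch index of each unit
same_patch = patch[:, None] == patch[None, :]
mask = np.where(same_patch, rng.random((n, n)) < p_within,
                            rng.random((n, n)) < p_between)
mask = np.triu(mask, 1)
mask = mask | mask.T                                       # symmetric, no self-connections

patterns = rng.choice([-1, 1], size=(3, n))                # patterns to store
W = (patterns.T @ patterns) / n * mask                     # Hebbian weights on allowed connections

def recall(x, steps=20):
    """Synchronous recall by repeated thresholding (a simplification)."""
    for _ in range(steps):
        x = np.sign(W @ x + 1e-9)                          # avoid sign(0)
    return x

cue = patterns[0].copy()
flip = rng.choice(n, size=n // 10, replace=False)
cue[flip] *= -1                                            # corrupt 10% of the bits
overlap = recall(cue) @ patterns[0] / n
print("overlap with stored pattern after recall:", overlap)
```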
ESANN 2000
An algorithm for the addition of time-delayed connections to recurrent neural networks
Recurrent neural networks possess interesting universal approximation capabilities, making them good candidates for time series modeling. Unfortunately, long-term dependencies ar...
Romuald Boné, Michel Crucianu, Jean Pierre ...
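The paper's constructive algorithm is not reproduced here, but as a loose illustration of the underlying idea, the sketch below adds a single time-delayed recurrent connection h(t-delay) -> h(t) alongside the usual h(t-1) -> h(t) connection of a plain recurrent layer; the sizes, the delay, and the weight scales are arbitrary.

```python
# Hypothetical forward pass of a simple recurrent layer augmented with one
# time-delayed recurrent connection (delay > 1), the kind of shortcut that can help
# information bridge long gaps in a sequence.
import numpy as np

rng = np.random.default_rng(2)
n_in, n_hidden, delay = 1, 8, 5

W_in  = rng.normal(scale=0.3, size=(n_hidden, n_in))
W_rec = rng.normal(scale=0.3, size=(n_hidden, n_hidden))   # standard h_{t-1} -> h_t
W_del = rng.normal(scale=0.3, size=(n_hidden, n_hidden))   # delayed  h_{t-delay} -> h_t

def forward(xs):
    """Run the network over a sequence xs of shape (T, n_in); return all hidden states."""
    T = len(xs)
    H = np.zeros((T + 1, n_hidden))          # H[t] is the state after step t (H[0] = 0)
    for t in range(1, T + 1):
        h_prev = H[t - 1]
        h_delayed = H[t - delay] if t - delay >= 0 else np.zeros(n_hidden)
        H[t] = np.tanh(W_in @ xs[t - 1] + W_rec @ h_prev + W_del @ h_delayed)
    return H[1:]

xs = rng.normal(size=(20, n_in))
print(forward(xs).shape)                     # (20, 8)
```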
CONNECTION 2006
High capacity, small world associative memory models
Models of associative memory usually have full connectivity or, if diluted, random symmetric connectivity. In contrast, biological neural systems have predominantly local, non-symm...
Neil Davey, Lee Calcraft, Rod Adams
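As a hypothetical illustration of "small world" connectivity in this setting, the sketch below builds a Watts-Strogatz-style mask (a local ring of connections with a small rewired fraction) and restricts a Hebbian weight matrix to it; the parameters are made up and this is not the authors' construction.

```python
# Hypothetical Watts-Strogatz-style construction of a small-world connectivity mask,
# then used to restrict the Hebbian weight matrix of an associative memory.
import numpy as np

rng = np.random.default_rng(3)
n, k, p_rewire = 200, 10, 0.1              # units, neighbours per side, rewiring probability

adj = np.zeros((n, n), dtype=bool)
for i in range(n):
    for offset in range(1, k + 1):         # local ring connectivity
        j = (i + offset) % n
        if rng.random() < p_rewire:        # rewire a small fraction to random distant units
            j = rng.integers(n)
            while j == i or adj[i, j]:
                j = rng.integers(n)
        adj[i, j] = adj[j, i] = True

patterns = rng.choice([-1, 1], size=(2, n))
W = (patterns.T @ patterns) / n * adj      # Hebbian weights only where a connection exists

print("mean degree:", adj.sum(axis=1).mean())
print("fraction of possible connections kept:", adj.mean())
```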
ECML 2003, Springer
Optimizing Local Probability Models for Statistical Parsing
This paper studies the properties and performance of models for estimating local probability distributions which are used as components of larger probabilistic systems ...
Kristina Toutanova, Mark Mitchell, Christopher D. ...
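For readers unfamiliar with the term, a "local probability model" here is a conditional distribution over parsing decisions given a limited context, used as one component inside a larger parser. The sketch below is an invented, minimal count-based model with linear-interpolation smoothing; it is not one of the models compared in the paper, and all feature names and weights are made up.

```python
# Hypothetical local probability model P(decision | context) of the kind used as a
# component of a statistical parser: counts over a full context, interpolated with
# counts over a reduced (backed-off) context, both add-one smoothed.
from collections import Counter, defaultdict

class LocalModel:
    def __init__(self, lam=0.7):
        self.lam = lam                          # interpolation weight (illustrative)
        self.full = defaultdict(Counter)        # counts conditioned on the full context
        self.backoff = defaultdict(Counter)     # counts conditioned on the reduced context
        self.decisions = set()

    def update(self, context, decision):
        head, tail = context                    # e.g. (parent label, head word)
        self.full[(head, tail)][decision] += 1
        self.backoff[head][decision] += 1
        self.decisions.add(decision)

    def prob(self, context, decision):
        head, tail = context
        full_c, back_c = self.full[(head, tail)], self.backoff[head]
        v = len(self.decisions)
        p_full = (full_c[decision] + 1) / (sum(full_c.values()) + v)   # add-one smoothed
        p_back = (back_c[decision] + 1) / (sum(back_c.values()) + v)
        return self.lam * p_full + (1 - self.lam) * p_back

m = LocalModel()
m.update(("NP", "dog"), "shift")
m.update(("NP", "dog"), "reduce")
m.update(("NP", "cat"), "shift")
print(m.prob(("NP", "dog"), "shift"))
```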