NPL
2006

A Neural Model for Context-dependent Sequence Learning

A novel neural network model is described that implements context-dependent learning of complex sequences. The model utilises leaky integrate-and-fire neurons to extract timing information from its input and modifies its weights using a learning rule with synaptic noise. Learning and recall phases are seamlessly integrated, so the network can gradually shift from learning to predicting its input. Experimental results on data from a real-world problem domain demonstrate that the use of context has three important benefits: (a) it prevents catastrophic interference during learning of multiple overlapping sequences, (b) it enables the completion of sequences from missing or noisy patterns, and (c) it provides a mechanism to selectively explore the space of learned sequences during free recall.

Key words: contextual cueing, incidental learning, leaky integrate-and-fire neurons, recurrent neural network, sequence learning
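The two mechanisms named in the abstract can be illustrated with a minimal sketch: a leaky integrate-and-fire neuron whose spike timing reflects the strength of its input drive, and a Hebbian-style weight update with additive synaptic noise. This is not the authors' implementation; all parameter values (`tau_m`, thresholds, learning rate, noise scale) and function names are illustrative assumptions based on standard LIF dynamics.

```python
import random

def simulate_lif(input_current, dt=1.0, tau_m=20.0,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return spike times (step indices) of a leaky integrate-and-fire
    neuron driven by a per-step input current. Parameters are
    illustrative defaults, not values from the paper."""
    v = v_rest
    spikes = []
    for t, i_t in enumerate(input_current):
        # Leaky integration: v decays toward v_rest while the input drives it up.
        v += (dt / tau_m) * (-(v - v_rest) + i_t)
        if v >= v_thresh:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset        # reset the membrane after firing
    return spikes

def noisy_hebbian_update(w, pre, post, lr=0.01, noise_std=0.005):
    """Hebbian-style weight change with additive Gaussian synaptic noise
    (a generic stand-in for the paper's learning rule)."""
    return w + lr * pre * post + random.gauss(0.0, noise_std)

# A stronger drive reaches threshold sooner, so the first-spike time
# carries information about the input's magnitude.
spikes_strong = simulate_lif([1.5] * 100)
spikes_weak = simulate_lif([1.2] * 100)
```

With these illustrative parameters, the strongly driven neuron fires earlier and more often than the weakly driven one, which is the sense in which spike timing extracts information from the input.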
Luc Berthouze, Adriaan G. Tijsseling
Added 14 Dec 2010
Updated 14 Dec 2010
Type Journal
Year 2006
Where NPL
Authors Luc Berthouze, Adriaan G. Tijsseling