Recurrent Neural Networks with Iterated Function Systems Dynamics

We propose a recurrent neural network (RNN) model whose recurrent part corresponds to an iterated function system (IFS), introduced by Barnsley [1] as a fractal image compression mechanism. The key ideas are that 1) we avoid learning the RNN state part by keeping the connections between the context and recurrent layers non-trainable, which makes the training process less problematic and faster, and 2) the RNN state part codes the information-processing states of the symbolic input stream in a well-organized and intuitively appealing way. We show that there is a direct correspondence between the Rényi entropy spectra characterizing the input stream and the spectra of Rényi generalized dimensions of the activations in the RNN state space. We test both the new RNN model with IFS dynamics and its conventional counterpart with a trainable recurrent part on two chaotic symbolic sequences. In our experiments, RNNs with IFS dynamics outperform the conventional RNNs with respect to information...
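The non-trainable state part described above can be sketched as a symbol-driven IFS: each input symbol is assigned a fixed affine contraction toward a distinct corner of the unit hypercube, so the state after reading a sequence encodes its recent suffix fractally (similar suffixes land at nearby points). This is a minimal illustrative sketch, not the paper's exact architecture; the function name `ifs_state_trajectory`, the corner assignment, and the contraction ratio `k = 0.5` are assumptions for illustration.

```python
import numpy as np

def ifs_state_trajectory(symbols, alphabet, k=0.5, dim=2):
    """Drive a fixed (non-trainable) IFS with a symbol sequence.

    Each symbol s gets the contractive map  x <- k*x + (1-k)*c_s,
    where c_s is a corner of the unit hypercube [0,1]^dim.
    Requires 2**dim >= len(alphabet) so corners are distinct.
    """
    # Assign each symbol a distinct binary corner of the hypercube.
    corners = {s: np.array([(i >> d) & 1 for d in range(dim)], dtype=float)
               for i, s in enumerate(alphabet)}
    x = np.full(dim, 0.5)  # start at the centre of the cube
    states = []
    for s in symbols:
        x = k * x + (1 - k) * corners[s]  # fixed contraction for symbol s
        states.append(x.copy())
    return np.array(states)

# Example: states visited while reading a short sequence over {a, b, c, d}.
traj = ifs_state_trajectory("abba" * 5, alphabet="abcd")
```

Because every map is a contraction with ratio `k`, two sequences sharing a suffix of length n end up within k**n of each other, which is what gives the state space its well-organized, history-coding structure.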
Peter Tiño, Georg Dorffner
Added: 01 Nov 2010
Updated: 01 Nov 2010
Type: Conference
Year: 1998
Where: NC
Authors: Peter Tiño, Georg Dorffner