ICANN 2005, Springer

Batch-Sequential Algorithm for Neural Networks Trained with Entropic Criteria

The use of entropy as a cost function in the neural network learning phase usually implies that, in the back-propagation algorithm, training is performed in batch mode. Besides the higher complexity of the batch-mode algorithm, this approach also has some limitations compared with the sequential mode. In this paper we present a way of combining both modes when using entropic criteria. We present experiments that validate the proposed method and compare it with the pure batch-mode algorithm.
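Because an entropic cost depends on pairwise interactions between the errors, it cannot be evaluated from a single training pattern, which makes a batch-sequential compromise natural: estimate the entropic gradient on a small batch of patterns, then update the weights sequentially, batch by batch. The following Python sketch illustrates this idea under stated assumptions (Renyi's quadratic entropy of the errors estimated with Gaussian Parzen windows, a one-hidden-layer tanh network, and the hypothetical names entropic_gradients and train_batch_sequential); it is not the authors' exact algorithm.

    # Illustrative sketch only; the entropy estimator, network sizes and
    # learning rate are assumptions, not the paper's exact method.
    import numpy as np

    rng = np.random.default_rng(0)

    def forward(X, W1, b1, W2, b2):
        H = np.tanh(X @ W1 + b1)   # hidden-layer activations
        Y = np.tanh(H @ W2 + b2)   # network outputs
        return H, Y

    def entropic_gradients(X, T, W1, b1, W2, b2, sigma=1.0):
        # Errors of the current mini-batch; the entropic cost needs their
        # pairwise interactions, so a (small) batch is required even when
        # the weight updates themselves are sequential.
        H, Y = forward(X, W1, b1, W2, b2)
        E = T - Y
        n = E.shape[0]
        diff = E[:, None, :] - E[None, :, :]                        # pairwise error differences
        K = np.exp(-np.sum(diff ** 2, axis=2) / (4 * sigma ** 2))   # Gaussian kernel values
        # Gradient (up to a positive constant) of minus the information
        # potential, i.e. the direction that decreases the error entropy.
        dE = np.einsum('ij,ijk->ik', K, diff) / (sigma ** 2 * n ** 2)
        dY = -dE * (1 - Y ** 2)            # back-propagate through output tanh (E = T - Y)
        dW2, db2 = H.T @ dY, dY.sum(axis=0)
        dH = (dY @ W2.T) * (1 - H ** 2)    # back-propagate through hidden tanh
        dW1, db1 = X.T @ dH, dH.sum(axis=0)
        return dW1, db1, dW2, db2

    def train_batch_sequential(X, T, hidden=8, batch_size=20, epochs=50, lr=0.5):
        n_in, n_out = X.shape[1], T.shape[1]
        W1 = rng.normal(0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
        W2 = rng.normal(0, 0.5, (hidden, n_out)); b2 = np.zeros(n_out)
        for _ in range(epochs):
            order = rng.permutation(len(X))
            for start in range(0, len(X), batch_size):
                idx = order[start:start + batch_size]
                dW1, db1, dW2, db2 = entropic_gradients(X[idx], T[idx], W1, b1, W2, b2)
                # Sequential step: weights are updated after every mini-batch,
                # using the entropic gradient estimated on that batch alone.
                W1 -= lr * dW1; b1 -= lr * db1
                W2 -= lr * dW2; b2 -= lr * db2
        return W1, b1, W2, b2

For instance, calling train_batch_sequential(X, T) on inputs X and targets T coded in (-1, 1) performs one entropic update per mini-batch rather than one update per full epoch, which is the compromise between batch and sequential modes the abstract refers to.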
Type: Conference
Year: 2005
Where: ICANN
Authors: Jorge M. Santos, Joaquim Marques de Sá, Luís A. Alexandre