Sciweavers

NCA 2006, IEEE

Evolutionary training of hardware realizable multilayer perceptrons

The use of multilayer perceptrons (MLPs) with threshold activation functions (binary step functions) greatly reduces the complexity of hardware implementations of neural networks, provides tolerance to noise, and improves the interpretability of the internal representations. In certain cases, such as learning stationary tasks, it may be sufficient to find appropriate weights for an MLP with threshold activation functions by software simulation and then transfer the weight values to the hardware implementation. Efficient training of these networks is a subject of considerable ongoing research. Methods available in the literature mainly focus on two-state (threshold) nodes and try to train the networks either by approximating the gradient of the error function and appropriately modifying gradient descent, or by progressively altering the shape of the activation functions. In this paper, we propose an evolution-motivated approach, which is eminently suitable for networks with threshold functions...
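The motivation for an evolutionary approach is that the binary step function has a zero derivative almost everywhere, so standard backpropagation cannot be applied directly; a derivative-free search over the weight space sidesteps this. The following minimal sketch illustrates the idea with a tiny 2-2-1 threshold MLP learning XOR via a simple (1+1) evolutionary strategy. All names, the network size, and the hyperparameters are illustrative assumptions, not taken from the paper, whose specific algorithm is not described in the abstract above.

```python
import random

def step(x):
    # Binary step (threshold) activation: hardware-friendly, non-differentiable.
    return 1 if x >= 0 else 0

def forward(w, x):
    # 2-2-1 MLP; w holds 9 values: two hidden units and one output unit,
    # each with two weights and a bias (layout chosen for this sketch).
    h1 = step(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = step(w[3] * x[0] + w[4] * x[1] + w[5])
    return step(w[6] * h1 + w[7] * h2 + w[8])

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def errors(w):
    # Fitness = number of misclassified patterns (no gradient needed).
    return sum(forward(w, x) != y for x, y in XOR)

def evolve(seed=0, sigma=0.5, max_gens=20000):
    # (1+1) evolutionary strategy: mutate all weights with Gaussian noise,
    # keep the child if it is no worse (neutral drift helps cross plateaus
    # in the piecewise-constant error landscape of threshold networks).
    rng = random.Random(seed)
    parent = [rng.uniform(-1, 1) for _ in range(9)]
    best = errors(parent)
    for _ in range(max_gens):
        if best == 0:
            break
        child = [wi + rng.gauss(0, sigma) for wi in parent]
        e = errors(child)
        if e <= best:
            parent, best = child, e
    return parent, best
```

Once such a weight vector reaches zero error in simulation, it can be transferred to the hardware realization directly, since the forward pass uses only weighted sums and comparisons.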
Added 14 Dec 2010
Updated 14 Dec 2010
Type Journal
Year 2006
Where NCA
Authors Vassilis P. Plagianakos, George D. Magoulas, Michael N. Vrahatis