ANNPR
2006
Springer

A Convolutional Neural Network Tolerant of Synaptic Faults for Low-Power Analog Hardware

Abstract. Recently, the authors described a training method for a convolutional neural network of threshold neurons. Hidden layers are trained by clustering, in a feed-forward manner, while the output layer is trained using the supervised Perceptron rule. The system is designed for implementation on an existing low-power analog hardware architecture, which exhibits inherent error sources that affect computation accuracy in unspecified ways. One key technique is to train the network on-chip, taking possible errors into account without any need to quantify them. For the hidden layers, an on-chip approach has been applied previously. In the present work, a chip-in-the-loop version of the iterative Perceptron rule is introduced for training the output layer. The influence of various types of errors (noisy, deleted, and clamped weights) is thoroughly investigated for all network layers, using the MNIST database of hand-written digits as a benchmark.
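The chip-in-the-loop idea in the abstract can be illustrated with a minimal sketch: run the forward pass through a (here, simulated) faulty device, but apply the Perceptron-rule updates to the cleanly stored weights, so training compensates for errors without quantifying them. The toy 2-D data, fault probabilities, and function names below are illustrative assumptions, not the paper's actual setup or MNIST benchmark.

```python
import random

random.seed(0)

# Toy linearly separable task (stand-in for features from the clustered
# hidden layers): label is +1 if x0 + x1 > 1, else -1.
xs = [(random.random(), random.random()) for _ in range(200)]
data = [(x, 1 if x[0] + x[1] > 1.0 else -1) for x in xs]

def faulty(w, noise=0.05, p_clamp=0.02, p_delete=0.02):
    """Simulate the three fault types from the abstract each time a weight
    is used: additive noise, clamping to an extreme value, or deletion.
    The specific probabilities and noise level are assumed for illustration."""
    out = []
    for wi in w:
        r = random.random()
        if r < p_delete:
            out.append(0.0)                          # deleted synapse
        elif r < p_delete + p_clamp:
            out.append(1.0 if wi >= 0 else -1.0)     # clamped synapse
        else:
            out.append(wi + random.gauss(0.0, noise))  # noisy synapse
    return out

def train(data, epochs=50, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in data:
            wf = faulty(w)  # chip-in-the-loop: forward pass sees the faults
            y = 1 if wf[0] * x[0] + wf[1] * x[1] + b > 0 else -1
            if y != t:      # Perceptron rule update on the stored weights
                w = [wi + lr * t * xi for wi, xi in zip(w, x)]
                b += lr * t
    return w, b

w, b = train(data)
acc = sum(1 for x, t in data
          if (1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1) == t) / len(data)
print(round(acc, 2))
```

Because updates are triggered by the faulty device's mistakes, the learned weights absorb the error statistics of the hardware without those errors ever being measured explicitly.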
Johannes Fieres, Karlheinz Meier, Johannes Schemmel
Added 13 Oct 2010
Updated 13 Oct 2010
Type Conference
Year 2006
Where ANNPR
Authors Johannes Fieres, Karlheinz Meier, Johannes Schemmel