
ECML 2007, Springer

Nondeterministic Discretization of Weights Improves Accuracy of Neural Networks

Abstract. The paper investigates a modification of the backpropagation algorithm in which neural network weights are discretized after each training cycle. This modification, aimed at reducing overfitting, restricts the set of possible weight values to a discrete subset of the real numbers, leading to much better generalization and, in turn, to higher accuracy, with the error rate decreasing by over 50% in extreme cases (when overfitting is high). Discretization is performed nondeterministically, so that the expected value of a discretized weight equals its original value; in this way the global behavior of the original algorithm is preserved. The presented discretization method is general and may be applied to other machine-learning algorithms. It is also an example of how an algorithm for continuous optimization can be successfully applied to optimization over discrete spaces. The method was evaluated experimentally in the WEKA environment using two real-world ...
Marcin Wojnarski
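
A minimal sketch of the core idea described in the abstract: after each training cycle, each weight is stochastically rounded to a point on a discrete grid so that its expected value equals the original weight. The uniform grid step `delta` and the function name are assumptions for illustration; the paper's exact discretization scheme may differ.

```python
import numpy as np

def nondeterministic_discretize(weights, delta=0.05, rng=None):
    """Stochastically round each weight to a multiple of `delta`.

    A weight w between grid points lo = floor(w / delta) * delta and
    hi = lo + delta is rounded up with probability (w - lo) / delta,
    and down otherwise, so E[discretized w] == w.
    """
    rng = rng or np.random.default_rng()
    lo = np.floor(weights / delta) * delta   # nearest grid point below
    p_up = (weights - lo) / delta            # probability of rounding up
    up = rng.random(weights.shape) < p_up    # Bernoulli draw per weight
    return lo + up * delta

# Hypothetical usage: apply after each backpropagation training cycle.
w = np.array([0.123, -0.047, 0.618])
print(nondeterministic_discretize(w, delta=0.05))
```

Because the rounding is unbiased, the discretization does not systematically shift the weights, which is why the global behavior of the original training algorithm is preserved while the weight space is effectively restricted to a discrete subset.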
Type: Conference
Year: 2007
Where: ECML
Authors: Marcin Wojnarski