
NIPS 1990

Convergence of a Neural Network Classifier

In this paper, we show that the LVQ learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. We show that the learning algorithm performs stochastic approximation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. We also present a modification to the learning algorithm which, we argue, results in convergence of LVQ for a larger set of initial conditions. Finally, we show that LVQ is a general histogram classifier and that its risk converges to the Bayes optimal risk as the appropriate parameters go to infinity with the number of past observations.
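The LVQ update the abstract analyzes can be sketched as follows. This is a minimal illustration of the standard LVQ1 rule, not the paper's exact formulation: the function name, the two-class 1-D data layout, and the 1/(n+2) schedule are assumptions for the example; any learning-rate sequence satisfying the usual stochastic-approximation conditions (the rates sum to infinity, their squares sum to a finite value) would fit the convergence setting described above.

```python
import numpy as np

def lvq1_step(codebook, labels, x, y, lr):
    """One LVQ1 update: find the nearest codebook (Voronoi) vector to x,
    move it toward x if its class label matches y, away from x otherwise."""
    dists = np.linalg.norm(codebook - x, axis=1)  # distance to each Voronoi vector
    c = np.argmin(dists)                          # index of the winning vector
    sign = 1.0 if labels[c] == y else -1.0        # attract on match, repel on mismatch
    codebook[c] += sign * lr * (x - codebook[c])
    return codebook

# Illustrative usage: two well-separated 1-D classes centered at 0 and 1,
# with a decaying learning rate lr_n = 1/(n+2) so that sum(lr_n) diverges
# while sum(lr_n**2) converges (the stochastic-approximation conditions).
rng = np.random.default_rng(0)
codebook = np.array([[0.2], [0.8]])  # initial Voronoi vectors
labels = np.array([0, 1])            # class label attached to each vector
for n in range(2000):
    y = int(rng.integers(0, 2))
    x = rng.normal(loc=float(y), scale=0.1, size=1)
    lvq1_step(codebook, labels, x, y, 1.0 / (n + 2))
```

Under these assumptions each Voronoi vector drifts toward the mean of its class, which is the ODE-equilibrium behavior the paper establishes for the general case.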
John S. Baras, Anthony LaVigna