
Asymptotic distributions associated to Oja's learning equation for neural networks

In this paper, we perform a complete asymptotic performance analysis of the stochastic approximation algorithm (denoted the subspace network learning (SNL) algorithm) derived from Oja's learning equation, in the case where the learning rate is constant and a large number of patterns is available. This algorithm drives the connection weight matrix W to an orthonormal basis of a dominant invariant subspace of a covariance matrix. Our approach associates with this algorithm a second stochastic approximation algorithm that governs the evolution of WW^T toward the projection matrix onto this dominant invariant subspace. Then, using a general result of Gaussian approximation theory, we derive the asymptotic distribution of the estimated projection matrix. Closed-form expressions for the asymptotic covariance of the projection matrix estimated by the SNL algorithm, and by the smoothed SNL algorithm that we introduce, are given for the case of independent or correlated learning patterns and ...
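As a rough illustration of the algorithm the abstract analyzes, the sketch below runs Oja's subspace learning rule with a constant learning rate and checks that WW^T approaches the projector onto the dominant invariant subspace of the pattern covariance. The dimensions, learning rate, covariance construction, and sample count are illustrative choices, not taken from the paper, and this shows only the plain SNL iteration, not the paper's smoothed variant or its asymptotic covariance expressions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: patterns x with covariance C, whose top-r
# eigenvectors span the dominant invariant subspace.
d, r, n_steps = 6, 2, 20000
A = rng.standard_normal((d, d))
C = A @ A.T                      # synthetic covariance matrix
L = np.linalg.cholesky(C)        # so that L @ z has covariance C

eta = 1e-3                       # constant learning rate, as in the paper
W = 0.1 * rng.standard_normal((d, r))

for _ in range(n_steps):
    x = L @ rng.standard_normal(d)   # one learning pattern
    y = W.T @ x
    # Oja's subspace (SNL) update: W <- W + eta * (x y^T - W y y^T)
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# Target: orthogonal projector P onto the span of the top-r eigenvectors.
eigvals, eigvecs = np.linalg.eigh(C)       # eigenvalues in ascending order
P = eigvecs[:, -r:] @ eigvecs[:, -r:].T

err = np.linalg.norm(W @ W.T - P)
print(f"||W W^T - P||_F = {err:.3f}")
```

Note that individual columns of W need not converge to eigenvectors; only the product WW^T converges, which is why the paper studies the projection-matrix estimate rather than W itself.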
Jean-Pierre Delmas, Jean-François Cardoso
Added 23 Dec 2010
Updated 23 Dec 2010
Type Journal
Year 1998
Where TNN
Authors Jean-Pierre Delmas, Jean-François Cardoso