CEC 2011 (IEEE)

Stochastic Natural Gradient Descent by estimation of empirical covariances

Stochastic relaxation aims at finding the minimum of a fitness function by identifying a proper sequence of distributions, in a given model, that minimize the expected value of the fitness function. Different algorithms fit this framework, differing in the policy they implement to identify the next distribution in the model. In this paper we present two algorithms, in the stochastic relaxation framework, for the optimization of real-valued functions defined over binary variables: Stochastic Gradient Descent (SGD) and Stochastic Natural Gradient Descent (SNGD). These algorithms sample from a statistical model, as in Estimation of Distribution Algorithms (EDAs), but the estimation of the model from the population is replaced by a direct update of the model parameters through stochastic gradient descent. The two algorithms, SGD and SNGD, both use statistical models in the exponential family, but they differ in the use of the natural gradient, first...
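The idea sketched in the abstract can be illustrated with a minimal example. The sketch below is not the paper's exact algorithm; it assumes an independent-Bernoulli model (a simple exponential family over binary strings, parameterized by logits), estimates the gradient of the expected fitness from a sampled population via the log-likelihood trick, and updates the parameters by gradient descent. The function names and hyperparameters (`sgd_relaxation`, `iters`, `pop`, `lr`) are illustrative choices, not from the paper.

```python
# Illustrative sketch of stochastic relaxation by gradient descent on an
# independent-Bernoulli model; NOT the paper's exact SGD/SNGD implementation.
import math
import random


def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))


def sgd_relaxation(fitness, n, iters=400, pop=50, lr=0.1,
                   natural=False, seed=0):
    """Minimize E_p[fitness(x)] over independent Bernoulli models p(x; theta).

    theta holds natural parameters (logits). The gradient of the expected
    fitness is estimated with the log-likelihood trick:
        grad_i E[f] ~= E[(f(x) - baseline) * (x_i - sigmoid(theta_i))].
    With natural=True, the estimate is preconditioned by the inverse Fisher
    information, which for this model is diagonal with entries p_i(1 - p_i).
    """
    rng = random.Random(seed)
    theta = [0.0] * n  # uniform initial distribution over {0,1}^n
    for _ in range(iters):
        # sample a population from the current model
        xs = [[1 if rng.random() < sigmoid(t) else 0 for t in theta]
              for _ in range(pop)]
        fs = [fitness(x) for x in xs]
        baseline = sum(fs) / pop  # variance-reduction baseline
        grad = [0.0] * n
        for x, f in zip(xs, fs):
            adv = f - baseline
            for i in range(n):
                grad[i] += adv * (x[i] - sigmoid(theta[i]))
        new_theta = []
        for t, g in zip(theta, grad):
            g /= pop
            if natural:
                p = sigmoid(t)
                g /= max(p * (1.0 - p), 1e-3)  # inverse diagonal Fisher
            new_theta.append(t - lr * g)  # descend to minimize E[f]
        theta = new_theta
    return theta


if __name__ == "__main__":
    # OneMax as a minimization problem: the optimum is the all-zeros string,
    # so the marginal probabilities should all drift toward 0.
    theta = sgd_relaxation(lambda x: sum(x), n=10)
    print(max(sigmoid(t) for t in theta))
```

The `natural=True` branch is where the SNGD variant described in the abstract would differ from plain SGD: the same empirical gradient estimate is rescaled by the inverse Fisher information of the model, which for independent Bernoulli variables reduces to dividing each coordinate by p_i(1 - p_i).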
Added 13 Dec 2011
Type Conference
Year 2011
Where CEC
Authors Luigi Malagò, Matteo Matteucci, Giovanni Pistone