Memetic Gradient Search

This paper reviews gradient-based search schemes and sources of gradient information, considering their availability, precision, and computational complexity, and explores the benefits of using gradient information within a memetic framework for continuous parameter optimization, labeled here as Memetic Gradient Search. In particular, we consider a quasi-Newton method with analytical gradients and with finite differencing, as well as simultaneous perturbation stochastic approximation (SPSA), as the local searches. An empirical study on the impact of gradient information showed that Memetic Gradient Search outperformed the traditional GA, and that precise analytical gradients bring considerable benefit to gradient-based local search (LS) schemes. Although gradient-based searches can become trapped in local optima, the memetic gradient searches still converged faster than the conventional GA.
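The general scheme the abstract describes, a GA whose best individuals are refined by a gradient-based local search, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the test function, population size, step sizes, and operator choices are all assumptions, and SPSA is used as the local search since it needs only two function evaluations per gradient estimate.

```python
import random

def sphere(x):
    # Simple convex test function; global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def spsa_step(f, x, a=0.1, c=0.01):
    # One SPSA step: estimate the gradient from just two evaluations of f
    # using a random +/-1 perturbation vector, then take a descent step.
    delta = [random.choice((-1.0, 1.0)) for _ in x]
    xp = [xi + c * di for xi, di in zip(x, delta)]
    xm = [xi - c * di for xi, di in zip(x, delta)]
    ghat = (f(xp) - f(xm)) / (2.0 * c)
    # Component i of the gradient estimate is ghat / delta_i; since
    # delta_i is +/-1, dividing by it equals multiplying by it.
    return [xi - a * ghat * di for xi, di in zip(x, delta)]

def memetic_gradient_search(f, dim=5, pop_size=20, gens=50, ls_steps=10):
    # GA with tournament selection, blend crossover, Gaussian mutation,
    # plus SPSA local search on the best individual each generation
    # (the "memetic" part of the hybrid).
    pop = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        # Local refinement of the current best, accepting only improvements.
        best = pop[0]
        for _ in range(ls_steps):
            cand = spsa_step(f, best)
            if f(cand) < f(best):
                best = cand
        pop[0] = best
        # Offspring via tournament selection, blend crossover, mutation.
        offspring = []
        while len(offspring) < pop_size:
            p1 = min(random.sample(pop, 3), key=f)
            p2 = min(random.sample(pop, 3), key=f)
            w = random.random()
            child = [w * u + (1 - w) * v + random.gauss(0, 0.1)
                     for u, v in zip(p1, p2)]
            offspring.append(child)
        # Elitist replacement: keep the best of parents and offspring.
        pop = sorted(pop + offspring, key=f)[:pop_size]
    return min(pop, key=f)
```

Swapping `spsa_step` for a finite-difference or analytical-gradient quasi-Newton update would give the other local-search variants the abstract compares; the surrounding memetic loop stays the same.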
Boyang Li, Yew-Soon Ong, Minh Nghia Le, Chi Keong Goh
Added 29 May 2010
Updated 29 May 2010
Type Conference
Year 2008
Where CEC
Authors Boyang Li, Yew-Soon Ong, Minh Nghia Le, Chi Keong Goh