Sciweavers

93 search results - page 1 / 19
Search: Evolving neural networks in compressed weight space
GECCO 2010 (Springer)
Evolving neural networks in compressed weight space
We propose a new indirect encoding scheme for neural networks in which the weight matrices are represented in the frequency domain by sets of Fourier coefficients. This scheme exp...
Jan Koutník, Faustino J. Gomez, Jürgen Schmidhuber
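
The indirect encoding sketched in this abstract can be illustrated in a few lines: a short genome of low-frequency coefficients is zero-padded to the size of a weight matrix and mapped back to weight space with an inverse 2-D DCT. The sketch below is only an assumption-laden illustration (NumPy/SciPy, a simple top-left coefficient placement); the paper's actual genome layout, transform, and coefficient ordering may differ.

```python
import numpy as np
from scipy.fft import idctn  # inverse 2-D discrete cosine transform

def decode_weights(coeffs, shape):
    """Expand a short genome of frequency-domain coefficients into a full
    weight matrix: place them in the low-frequency corner of a zero grid,
    then apply an inverse 2-D DCT (illustrative sketch only)."""
    grid = np.zeros(shape)
    side = int(np.ceil(np.sqrt(coeffs.size)))       # small square block
    block = np.zeros(side * side)
    block[:coeffs.size] = coeffs
    grid[:side, :side] = block.reshape(side, side)  # low-frequency corner
    return idctn(grid, norm="ortho")                # frequency -> weight space

# toy usage: 64 network weights decoded from a genome of only 6 genes
genome = np.array([1.2, -0.5, 0.3, 0.8, -0.1, 0.05])
W = decode_weights(genome, (8, 8))
print(W.shape)   # (8, 8)
```
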
GECCO 2005 (Springer)
Nonlinear feature extraction using a neuro genetic hybrid
Feature extraction is a process that extracts salient features from observed variables. It is considered a promising alternative to overcome the problems of weight and structure o...
Yung-Keun Kwon, Byung Ro Moon
ICPR 2008 (IEEE)
Unsupervised design of Artificial Neural Networks via multi-dimensional Particle Swarm Optimization
In this paper, we present a novel and efficient approach for the automatic design of Artificial Neural Networks (ANNs) by evolving toward the optimal network configuration(s) within an ar...
E. Alper Yildirim, Ince Turker, Moncef Gabbouj, Se...
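
This abstract describes searching over network configurations with a multi-dimensional PSO. As a much simpler, hedged illustration of the particle-swarm component only, the sketch below runs a plain global-best PSO over a fixed-length weight vector; the function names and parameters are assumptions, and the paper's multi-dimensional, architecture-space formulation is not reproduced here.

```python
import numpy as np

def pso_optimize(loss, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bound=1.0):
    """Plain global-best PSO over a fixed-length weight vector
    (simplified fixed-architecture illustration only)."""
    rng = np.random.default_rng(0)
    x = rng.uniform(-bound, bound, (n_particles, dim))  # positions = weight vectors
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([loss(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()              # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g

# toy usage: minimise a quadratic "loss" over a 10-dimensional weight vector
best = pso_optimize(lambda w_: float(np.sum(w_ ** 2)), dim=10)
```
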
GECCO 2004 (Springer)
A Descriptive Encoding Language for Evolving Modular Neural Networks
Evolutionary algorithms are a promising approach for the automated design of artificial neural networks, but they require a compact and efficient genetic encoding scheme to repres...
Jae-Yoon Jung, James A. Reggia
CEC 2008 (IEEE)
Learning what to ignore: Memetic climbing in topology and weight space
We present the memetic climber, a simple search algorithm that learns the topology and weights of neural networks on different time scales. When applied to the problem of learning ...
Julian Togelius, Faustino J. Gomez, Jürgen Schmidhuber
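
The two time scales mentioned in this abstract (slow topology changes, fast weight changes) can be sketched as a nested hill climber. Everything below is an assumption-laden illustration: the network object and its `mutate_topology`/`mutate_weights` methods are hypothetical, and the paper's exact acceptance rule and mutation operators may differ.

```python
import copy

def memetic_climber(net, fitness, topo_steps=100, weight_steps=20):
    """Nested hill climber: topology mutations on a slow time scale, each
    followed by a burst of weight-only hill climbing before the new
    topology is accepted or rejected (illustrative sketch only)."""
    best, best_fit = net, fitness(net)
    for _ in range(topo_steps):
        cand = copy.deepcopy(best)
        cand.mutate_topology()                 # hypothetical method: slow scale
        cand_fit = fitness(cand)
        for _ in range(weight_steps):          # fast scale: weights only
            trial = copy.deepcopy(cand)
            trial.mutate_weights()             # hypothetical method
            trial_fit = fitness(trial)
            if trial_fit >= cand_fit:
                cand, cand_fit = trial, trial_fit
        if cand_fit >= best_fit:               # keep the topology change only
            best, best_fit = cand, cand_fit    # if the local search paid off
    return best
```
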