

Diversity creation in local search for the evolution of neural network ensembles

Abstract. The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL and finds that NCL is not an essential component of the algorithm, whereas implicit fitness sharing is. Furthermore, we find that a local search based on independent training is equally effective in terms of both accuracy and diversity. We propose that NCL is unnecessary in EENCL for the tested datasets, and that complementary diversity in local search and global evolution may lead to better ensembles.
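The abstract does not spell out the NCL mechanism it refers to, but the standard formulation (due to Liu and Yao) penalises each ensemble member's correlation with the ensemble mean during gradient-descent training. Below is a minimal, hypothetical Python sketch of that penalty term and its gradient for a single member; the function name, argument names, and the `lam` parameter are illustrative assumptions, not code from the paper.

```python
import numpy as np

def ncl_loss_and_grad(outputs, target, i, lam):
    """Standard Negative Correlation Learning terms for ensemble member i
    on a single example (illustrative sketch, not the paper's code).

    Loss:  e_i = 0.5 * (F_i - d)**2 + lam * p_i
    where  p_i = (F_i - Fbar) * sum_{j != i} (F_j - Fbar) = -(F_i - Fbar)**2
    and Fbar is the simple average of the member outputs.
    """
    F = np.asarray(outputs, dtype=float)   # member outputs F_1 .. F_M
    fbar = F.mean()                        # ensemble output
    penalty = -(F[i] - fbar) ** 2          # correlation penalty p_i
    loss = 0.5 * (F[i] - target) ** 2 + lam * penalty
    # Gradient w.r.t. F_i, treating Fbar as constant when differentiating,
    # as in the standard NCL derivation:
    grad = (F[i] - target) - lam * (F[i] - fbar)
    return loss, grad
```

In EENCL this gradient would drive each member's backpropagation step during local search; setting `lam = 0` recovers independent training of the members, which is the comparison the abstract reports.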
Pete Duell, Iris Fermin, Xin Yao
Type: Conference
Year: 2006
Where: ESANN