
ESANN
2000
A new information criterion for the selection of subspace models
The problem of model selection is of considerable importance for achieving a high level of generalization capability in supervised learning. In this paper, we propose a new criterion ...
Masashi Sugiyama, Hidemitsu Ogawa
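As background for the entry above, here is a minimal Python sketch of selecting among subspace (linear) models with a classical information criterion, the AIC. It is not the new criterion proposed by Sugiyama and Ogawa; the polynomial bases and toy data are assumptions made purely for illustration.

```python
import numpy as np

def fit_subspace(X, y):
    """Least-squares fit of y on the columns of X; returns coefficients and residual sum of squares."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    return coef, rss

def aic(rss, n, k):
    """Akaike's criterion for a Gaussian model with k parameters (up to additive constants)."""
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.standard_normal(x.size)

# Candidate subspaces: polynomial models of increasing degree.
scores = {}
for degree in range(1, 7):
    X = np.vander(x, degree + 1, increasing=True)   # basis 1, x, ..., x^degree
    _, rss = fit_subspace(X, y)
    scores[degree] = aic(rss, x.size, degree + 1)

best = min(scores, key=scores.get)
print("AIC scores:", {d: round(s, 2) for d, s in scores.items()})
print("selected degree:", best)
```

The criterion trades residual error against the dimension of the subspace, which is the same kind of trade-off the paper's abstract alludes to.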
GECCO
2003
Springer
SEPA: Structure Evolution and Parameter Adaptation in Feed-Forward Neural Networks
In developing algorithms that dynamically change the structure and weights of ANNs (Artificial Neural Networks), there must be a proper balance between network complexity ...
Paulito P. Palmes, Taichi Hayasaka, Shiro Usui
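The SEPA abstract points to the trade-off between network complexity and accuracy. The sketch below illustrates that trade-off generically, not the SEPA algorithm itself: candidate hidden-layer sizes are scored by validation error plus a complexity penalty, using a network with fixed random hidden weights so the output layer can be solved in closed form. The penalty weight `lam`, the candidate sizes, and the toy data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_random_feature_net(x_tr, y_tr, hidden, reg=1e-3):
    """One-hidden-layer net with fixed random hidden weights; output weights solved by ridge regression."""
    W = rng.standard_normal((1, hidden))
    b = rng.standard_normal(hidden)
    H = np.tanh(x_tr[:, None] * W + b)              # hidden activations
    w_out = np.linalg.solve(H.T @ H + reg * np.eye(hidden), H.T @ y_tr)
    return W, b, w_out

def predict(model, x):
    W, b, w_out = model
    return np.tanh(x[:, None] * W + b) @ w_out

# Toy regression data split into training and validation halves.
x = np.linspace(-3, 3, 120)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
x_tr, y_tr, x_va, y_va = x[::2], y[::2], x[1::2], y[1::2]

# Score each candidate structure by validation error plus a complexity penalty.
lam = 1e-3
for hidden in (2, 4, 8, 16, 32):
    model = train_random_feature_net(x_tr, y_tr, hidden)
    val_mse = float(np.mean((predict(model, x_va) - y_va) ** 2))
    n_params = 3 * hidden                           # hidden weights and biases plus output weights
    print(hidden, round(val_mse + lam * n_params, 4))
```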
IJCNN
2006
IEEE
Generalization Improvement in Multi-Objective Learning
Several heuristic methods have been suggested for improving the generalization capability in neural network learning, most of which are concerned with a single-objective (SO) learning ...
Lars Gräning, Yaochu Jin, Bernhard Sendhoff
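The entry above contrasts single-objective heuristics with multi-objective learning, where accuracy and complexity are kept as separate objectives instead of being folded into one weighted sum. The sketch below is a generic illustration of that idea, not the authors' method: it filters a set of hypothetical (training error, weight count) pairs down to the Pareto-nondominated candidates.

```python
import numpy as np

def pareto_front(points):
    """Indices of non-dominated points when both objectives are minimized."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is no worse in both objectives and strictly better in one.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical (training error, number of weights) pairs for candidate networks.
candidates = [(0.30, 10), (0.12, 25), (0.11, 60), (0.05, 80), (0.06, 120)]
print([candidates[i] for i in pareto_front(candidates)])
```

A multi-objective learner would present the whole nondominated set to the user rather than committing to a single accuracy/complexity weighting in advance.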
IJCNN
2006
IEEE
Adaptation of Artificial Neural Networks Avoiding Catastrophic Forgetting
In connectionist learning, one relevant problem is “catastrophic forgetting”, which may occur when a network trained on a large set of patterns has to learn new input patterns ...
Dario Albesano, Roberto Gemello, Pietro Laface, Fr...
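The entry above describes catastrophic forgetting when a trained network is adapted to new patterns. One common, generic mitigation is rehearsal: mixing a small subset of the old patterns into the adaptation set. The sketch below illustrates rehearsal on a toy linear-in-features model; it is not the adaptation scheme of Albesano et al., and the data, features, and hyperparameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def gradient_descent(w, X, y, lr=0.1, epochs=2000):
    """Full-batch gradient descent on mean squared error for a linear-in-features model."""
    for _ in range(epochs):
        w -= lr * (X.T @ (X @ w - y)) / len(y)
    return w

def features(x):
    """Polynomial features 1, x, ..., x^5."""
    return np.vstack([x**k for k in range(6)]).T

target = lambda x: np.sin(3 * x)

# Old task and new task drawn from different input regions of the same target function.
x_old = rng.uniform(-1, 0, 100)
x_new = rng.uniform(0, 1, 100)
X_old, y_old = features(x_old), target(x_old)
X_new, y_new = features(x_new), target(x_new)

w = gradient_descent(np.zeros(6), X_old, y_old)     # train on the old patterns first

# Naive adaptation: continue training on the new patterns only.
w_naive = gradient_descent(w.copy(), X_new, y_new)

# Rehearsal: mix a small subset of old patterns into the adaptation set.
idx = rng.choice(len(x_old), 20, replace=False)
X_mix = np.vstack([X_new, X_old[idx]])
y_mix = np.concatenate([y_new, y_old[idx]])
w_rehearse = gradient_descent(w.copy(), X_mix, y_mix)

old_err = lambda w: float(np.mean((X_old @ w - y_old) ** 2))
print("old-task error, naive adaptation:", round(old_err(w_naive), 4))
print("old-task error, with rehearsal:  ", round(old_err(w_rehearse), 4))
```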