IJCNN 2006 (IEEE)

Adaptation of Artificial Neural Networks Avoiding Catastrophic Forgetting

— In connectionist learning, one relevant problem is “catastrophic forgetting”, which may occur when a network, trained with a large set of patterns, has to learn new input patterns or has to be adapted to a different environment. The risk of catastrophic forgetting is particularly high when a network is adapted with new data that do not adequately represent the knowledge included in the original training data. Two original solutions are proposed to reduce the risk that the network focuses on the new data only, losing its generalization capability. The first one, Conservative Training, is a variant of the target assignment policy, while the second approach, Support Vector Rehearsal, selects from the training set the patterns that lie near the borders of the classes not included in the adaptation set. These patterns are used as sentinels that help keep the original boundaries of these classes unchanged. Moreover, we investigated the extension of the classical approach consisting in ...
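The Conservative Training idea described above can be illustrated with a small sketch. Instead of the standard one-hot target, which forces a zero target on every class absent from the adaptation set (and so drives the network to forget them), the target for an absent class is kept at the original network's output for that class. This is a minimal, hypothetical sketch of that target-assignment policy, not the authors' implementation; the function name and signature are illustrative.

```python
import numpy as np

def conservative_targets(original_outputs, label, adapt_classes):
    """Build an adaptation target vector (illustrative sketch).

    original_outputs : posteriors produced by the *original* (pre-adaptation)
                       network for the current input pattern
    label            : index of the correct class for this pattern
    adapt_classes    : set of class indices actually present in the
                       adaptation set
    """
    targets = np.zeros_like(original_outputs)
    targets[label] = 1.0  # correct class keeps the usual one-hot target
    for c in range(len(original_outputs)):
        if c != label and c not in adapt_classes:
            # Class not represented in the adaptation data: preserve the
            # original network's output instead of forcing it to zero,
            # so the decision boundary for this class is not destroyed.
            targets[c] = original_outputs[c]
    return targets
```

For example, with four classes, original outputs `[0.1, 0.6, 0.2, 0.1]`, true label 1, and only classes {0, 1} present in the adaptation set, the target becomes `[0.0, 1.0, 0.2, 0.1]`: the two absent classes retain the original network's belief.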
Added 11 Jun 2010
Updated 11 Jun 2010
Type Conference
Year 2006
Where IJCNN
Authors Dario Albesano, Roberto Gemello, Pietro Laface, Franco Mana, Stefano Scanzio