
The Cascade-Correlation Learning Architecture

Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network. This research was sponsored in part by the National Science Foundation under Contract Number EET-871...
Scott E. Fahlman, Christian Lebiere
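
The abstract above outlines the training loop: train the output weights, then, when error stops improving, train a candidate hidden unit to maximize the correlation between its activation and the residual error, freeze its input-side weights, and install it as a new feature detector available to the outputs and to all later units. Below is a minimal NumPy sketch of that loop under simplifying assumptions that are not from the paper itself: a single sigmoid output, plain gradient updates in place of the paper's faster Quickprop-style updates, and a toy XOR problem with illustrative learning rates and epoch counts.

```python
# Minimal sketch of a Cascade-Correlation style training loop (not the authors'
# implementation). Assumptions: one sigmoid output, plain gradient updates
# rather than Quickprop, toy XOR data, illustrative hyperparameters.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_outputs(X, y, w_out, lr=1.0, epochs=2000):
    # Train only the output weights; frozen hidden-unit activations are columns of X.
    for _ in range(epochs):
        p = sigmoid(X @ w_out)
        w_out -= lr * X.T @ ((p - y) * p * (1 - p)) / len(X)
    return w_out

def train_candidate(X, residual, lr=1.0, epochs=2000):
    # Train a candidate unit to maximize |covariance| between its activation
    # and the residual output error; its input weights are then frozen.
    w = rng.normal(scale=0.5, size=X.shape[1])
    for _ in range(epochs):
        v = sigmoid(X @ w)
        sign = np.sign((v - v.mean()) @ (residual - residual.mean())) or 1.0
        grad = X.T @ ((residual - residual.mean()) * v * (1 - v)) / len(X)
        w += lr * sign * grad           # gradient ascent on the correlation score
    return w

# Toy XOR problem (hypothetical example data), with a bias column.
X = np.hstack([np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]]), np.ones((4, 1))])
y = np.array([0., 1., 1., 0.])

w_out = rng.normal(scale=0.5, size=X.shape[1])
for _ in range(3):                      # add at most 3 cascaded hidden units
    w_out = train_outputs(X, y, w_out)
    residual = y - sigmoid(X @ w_out)
    if np.mean(residual ** 2) < 1e-3:
        break
    w_frozen = train_candidate(X, residual)          # new feature detector
    # Cascade: the new unit's (fixed) activations become an extra input column,
    # visible to the output layer and to every later candidate unit.
    X = np.hstack([X, sigmoid(X @ w_frozen)[:, None]])
    w_out = np.append(w_out, 0.0)       # fresh output weight for the new unit

print(np.round(sigmoid(X @ w_out), 2))  # outputs, ideally near the XOR targets
```

The cascade structure is the line that appends each frozen unit's activations as a new input column: because its incoming weights never change again, the unit can be evaluated once and treated as just another input by the output layer and by every later candidate, which is why no error signals need to be back-propagated through the network.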
Added: 07 Nov 2010
Updated: 07 Nov 2010
Type: Conference
Year: 1989
Where: NIPS
Authors: Scott E. Fahlman, Christian Lebiere