

C2FS: An Algorithm for Feature Selection in Cascade Neural Networks

Wrapper-based feature selection is attractive because wrapper methods can tailor the features they select to the specific learning algorithm. Unfortunately, wrapper methods are prohibitively expensive to use with neural nets. We present an internal wrapper feature selection method for Cascade Correlation (C2) nets called C2FS that is 2-3 orders of magnitude faster than external wrapper feature selection. This new internal wrapper feature selection method selects features at the same time hidden units are being added to the growing C2 net architecture. Experiments with five test problems show that C2FS feature selection usually improves accuracy and squared error while dramatically reducing the number of features needed for good performance. Comparison to feature selection via an information theoretic ordering on features (gain ratio) shows that C2FS usually yields better performance and always uses substantially fewer features.
Type Conference
Year 2006
Where IJCNN
Publisher IEEE
Authors Lars Backstrom, Rich Caruana
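
The sketch below illustrates the internal-wrapper idea described in the abstract: instead of wrapping feature selection around complete training runs, features are chosen greedily while network units are being added. This is not the published C2FS algorithm; the unit-averaging output rule, the names `c2fs_sketch` and `train_unit`, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of interleaving greedy
# feature selection with the growth of a cascade-style network.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_unit(X, target, epochs=200, lr=0.1):
    """Fit one sigmoid 'hidden unit' to the target by gradient descent on MSE."""
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        h = sigmoid(X @ w)
        grad = X.T @ ((h - target) * h * (1 - h)) / len(X)
        w -= lr * grad
    return w

def c2fs_sketch(X_tr, y_tr, X_va, y_va, n_units=5):
    """Greedy internal wrapper: before adding each unit, try every unselected
    feature and keep the one whose inclusion most reduces validation error."""
    selected = []   # indices of features chosen so far
    units = []      # (feature indices, weights) for each added unit

    def predict(X, feats_units):
        # Toy combination rule: average the hidden-unit activations.
        if not feats_units:
            return np.full(len(X), y_tr.mean())
        hs = [sigmoid(X[:, f] @ w) for f, w in feats_units]
        return np.mean(hs, axis=0)

    for _ in range(n_units):
        best = None
        for j in range(X_tr.shape[1]):
            if j in selected:
                continue
            # Each new unit sees all previously selected features plus one candidate.
            feats = selected + [j]
            w = train_unit(X_tr[:, feats], y_tr)
            trial = units + [(feats, w)]
            err = np.mean((predict(X_va, trial) - y_va) ** 2)
            if best is None or err < best[0]:
                best = (err, j, feats, w)
        err, j, feats, w = best
        selected.append(j)
        units.append((feats, w))
    return selected, units

# Toy usage: only features 0 and 1 are informative.
X = rng.normal(size=(300, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
sel, _ = c2fs_sketch(X[:200], y[:200], X[200:], y[200:], n_units=3)
print("selected features:", sel)
```

The key property the sketch shares with the abstract's description is that feature evaluation happens inside a single growing model rather than requiring a full retraining of the network for every candidate feature subset, which is where the large speedup over external wrapper selection comes from.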