BMCBI
2006

Noise-injected neural networks show promise for use on small-sample expression data

Background: Overfitting the data is a salient issue for classifier design in small-sample settings. This is why it is typically preferable to select a classifier from a constrained family of classifiers, one whose members cannot too finely partition the feature space. But overfitting is not merely a consequence of the classifier family; it depends heavily on the classification rule used to design a classifier from the sample data. It is therefore possible to consider families that are rather complex but for which there exist classification rules that perform well for small samples. Such classification rules are advantageous because they facilitate satisfactory classification when the class-conditional distributions are not easily separated and the sample is not large. Here we consider neural networks, both from the perspective of classical design based solely on the sample data and from that of noise-injection-based design.

Results: This paper provides an extensive simulation-base...
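The noise-injection idea the abstract refers to can be illustrated with a minimal sketch: augment a small training sample with Gaussian-perturbed replicates of each point, then fit a small neural network to the augmented sample. Everything below is hypothetical scaffolding (the synthetic two-class data, the noise standard deviation, the one-hidden-layer network and its training loop), not the authors' simulation protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for small-sample expression data:
# two overlapping Gaussian classes, 20 samples x 5 features each.
n, d = 20, 5
X = np.vstack([rng.normal(0.0, 1.0, (n, d)), rng.normal(1.0, 1.0, (n, d))])
y = np.array([0.0] * n + [1.0] * n)

def noise_inject(X, y, copies=10, sigma=0.5):
    """Return the sample augmented with `copies` Gaussian-perturbed
    replicates of each point (noise std `sigma` per feature)."""
    Xs = [X] + [X + rng.normal(0.0, sigma, X.shape) for _ in range(copies)]
    return np.vstack(Xs), np.tile(y, copies + 1)

def train_mlp(X, y, hidden=8, lr=0.1, epochs=500):
    """One-hidden-layer sigmoid network trained by batch gradient
    descent on squared error; returns a predict function."""
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden)
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1 + b1)             # hidden activations
        p = sig(H @ W2 + b2)             # output probabilities
        gout = (p - y) * p * (1 - p)     # output-layer delta
        gH = np.outer(gout, W2) * H * (1 - H)
        W2 -= lr * H.T @ gout / len(y)
        b2 -= lr * gout.mean()
        W1 -= lr * X.T @ gH / len(y)
        b1 -= lr * gH.mean(axis=0)
    return lambda Xq: (sig(sig(Xq @ W1 + b1) @ W2 + b2) > 0.5).astype(int)

# Design the classifier on the noise-augmented sample, not the raw one.
Xa, ya = noise_inject(X, y)
print(Xa.shape)  # (440, 5): 40 original points plus 10 noisy copies each
predict = train_mlp(Xa, ya)
acc = (predict(X) == y.astype(int)).mean()  # accuracy on the original sample
```

The intuition is that the injected noise acts as a smoothing regularizer: the network sees many perturbed versions of each training point and so cannot carve decision boundaries that hug individual samples, which is exactly the overfitting risk the abstract describes for small samples.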
Added 10 Dec 2010
Updated 10 Dec 2010
Type Journal
Year 2006
Where BMCBI
Authors Jianping Hua, James Lowey, Zixiang Xiong, Edward R. Dougherty