Sciweavers

374 search results - page 8 / 75
» Training the random neural network using quasi-Newton method...
ESANN
2003
Accelerating the convergence speed of neural networks learning methods using least squares
In this work, a hybrid training scheme for the supervised learning of feedforward neural networks is presented. In the proposed method, the weights of the last layer are obtained em...
Oscar Fontenla-Romero, Deniz Erdogmus, José...
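The snippet above is cut off just as it describes how the last-layer weights are obtained, but the title makes the general idea clear: solve for the output layer with linear least squares instead of gradient descent. A minimal Python/NumPy sketch of that general idea follows; the function name, the ridge term `reg`, and the bias handling are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def fit_output_layer(H, T, reg=1e-6):
    """Solve for linear output-layer weights by regularized least squares.

    H   : (n_samples, n_hidden) hidden-layer activations
    T   : (n_samples, n_outputs) desired targets
    reg : small ridge term for numerical stability (an assumption, not from the paper)
    """
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])      # append a bias input
    A = Hb.T @ Hb + reg * np.eye(Hb.shape[1])          # regularized normal equations
    W = np.linalg.solve(A, Hb.T @ T)                   # (n_hidden + 1, n_outputs)
    return W
```

In a hybrid scheme of this kind, gradient-based updates of the hidden layers would alternate with this closed-form update of the output layer, which is where the convergence speed-up comes from.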
TNN
2010
Novel maximum-margin training algorithms for supervised neural networks
This paper proposes three novel training methods, two of them based on the back-propagation approach and a third one based on information theory for Multilayer Perceptron (MLP) bin...
Oswaldo Ludwig, Urbano Nunes
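The abstract only announces three margin-based training methods without detailing them, so the sketch below is purely generic: a hinge-style maximum-margin loss on a real-valued MLP output for labels in {-1, +1}. The function and the margin parameter are illustrative assumptions and do not reproduce any of the paper's three algorithms.

```python
import numpy as np

def hinge_loss_and_grad(scores, y, margin=1.0):
    """Mean hinge loss max(0, margin - y*f(x)) and its gradient w.r.t. the scores.

    scores : (n,) real-valued MLP outputs f(x)
    y      : (n,) labels in {-1, +1}
    """
    slack = margin - y * scores
    loss = np.mean(np.maximum(slack, 0.0))
    grad = np.where(slack > 0, -y, 0.0) / len(y)   # back-propagate this through the MLP
    return loss, grad
```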
ISNN
2010
Springer
Pruning Training Samples Using a Supervised Clustering Algorithm
As practical pattern classification tasks, such as patent classification, are often very large in scale and seriously imbalanced, using traditional pattern classification techniques in ...
Minzhang Huang, Hai Zhao, Bao-Liang Lu
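The snippet is cut off before the pruning method itself, so the sketch below is illustrative only: one common way to prune training samples with clustering is to cluster each class separately (supervised in the sense that the labels define the partition) and keep only the samples nearest each cluster centre. The use of scikit-learn's KMeans and the `keep_per_cluster` parameter are assumptions, not the authors' supervised clustering algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

def prune_per_class(X, y, n_clusters=50, keep_per_cluster=10):
    """Keep the samples closest to each per-class cluster centre (illustrative only)."""
    keep_idx = []
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        k = min(n_clusters, len(idx))
        km = KMeans(n_clusters=k, n_init=10).fit(X[idx])
        for c in range(k):
            members = idx[km.labels_ == c]
            d = np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1)
            keep_idx.extend(members[np.argsort(d)[:keep_per_cluster]])
    return np.sort(np.array(keep_idx))
```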
ICANN
2009
Springer
Profiling of Mass Spectrometry Data for Ovarian Cancer Detection Using Negative Correlation Learning
This paper proposes a novel Mass Spectrometry data profiling method for ovarian cancer detection based on negative correlation learning (NCL). A modified Smoothed Nonlinear Energy ...
Shan He, Huanhuan Chen, Xiaoli Li, Xin Yao
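Negative correlation learning trains an ensemble of networks with a penalty that pushes each member's deviation from the ensemble mean to be negatively correlated with the others'. The sketch below shows the standard NCL loss for a single member; the preprocessing step mentioned in the truncated snippet is not covered, and the function signature is an assumption.

```python
import numpy as np

def ncl_member_loss(member_out, ensemble_outs, targets, lam=0.5):
    """Negative correlation learning loss for one ensemble member.

    member_out    : (n,) outputs F_i(x) of the member being trained
    ensemble_outs : (m, n) outputs of all m members, including member i
    targets       : (n,) desired outputs
    lam           : penalty strength lambda
    """
    f_ens = ensemble_outs.mean(axis=0)                 # ensemble average F(x)
    mse = np.mean((member_out - targets) ** 2)
    # Standard NCL penalty: p_i = (F_i - F) * sum_{j != i} (F_j - F) = -(F_i - F)^2
    penalty = np.mean(-(member_out - f_ens) ** 2)
    return mse + lam * penalty
```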
ICANN
2005
Springer
Batch-Sequential Algorithm for Neural Networks Trained with Entropic Criteria
The use of entropy as a cost function in the neural network learning phase usually implies that, in the back-propagation algorithm, the training is done in batch mode. Apart from t...
Jorge M. Santos, Joaquim Marques de Sá, Lu...
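Entropic cost functions are typically estimated over all pairs of training errors, which is why, as the snippet notes, they push training toward batch mode. Below is a sketch of one such criterion, Renyi's quadratic error entropy estimated through the information potential with a Gaussian kernel; the kernel width `sigma` and the function name are assumptions, and this is an example of an entropic criterion rather than necessarily the exact one used in the paper.

```python
import numpy as np

def quadratic_error_entropy(errors, sigma=1.0):
    """Renyi's quadratic entropy estimate of the training errors.

    The information potential sums a Gaussian kernel over ALL error pairs,
    which is why the criterion is naturally evaluated in batch mode.
    """
    e = np.asarray(errors).reshape(-1, 1)
    diff = e - e.T                                     # all pairwise error differences
    s2 = 2.0 * sigma ** 2                              # variance of G_{sigma*sqrt(2)}
    kernel = np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    info_potential = kernel.mean()                     # V(e) = (1/N^2) sum_ij G(e_i - e_j)
    return -np.log(info_potential)                     # H_R2(e) = -log V(e)
```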