Sciweavers

163 search results - page 18 / 33
Search: Neural networks organizations to learn complex robotic funct...
ICANN 2005 (Springer)
Batch-Sequential Algorithm for Neural Networks Trained with Entropic Criteria
The use of entropy as a cost function in the neural network learning phase usually implies that, in the back-propagation algorithm, the training is done in batch mode. Apart from t...
Jorge M. Santos, Joaquim Marques de Sá, Lu...
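To illustrate what batch-mode training against an entropy-style cost looks like, here is a minimal sketch using ordinary cross-entropy on a single logistic unit with hypothetical toy data. This is not the paper's entropic criterion (which is a different error-entropy cost), only a generic example of a cost evaluated and differentiated over the whole batch per step.

```python
import numpy as np

# Hypothetical toy data: 200 points, label = sign of the coordinate sum.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # forward pass over the full batch
    grad_w = X.T @ (p - y) / len(y)         # batch gradient of the cross-entropy cost
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

p = np.clip(p, 1e-12, 1 - 1e-12)
loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

The point of the batch formulation is that the gradient is an average over all training patterns, so the cost (here an entropy of predicted vs. true labels) is well defined at every step.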
JMLR 2002
Covering Number Bounds of Certain Regularized Linear Function Classes
Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In many of these theoretical stud...
Tong Zhang
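As a representative example of the kind of dimension-independent bound this line of work derives (stated here from memory for illustration, not quoted from the paper): for linear functions $f(x) = w^\top x$ with $\|w\|_2 \le a$ on data with $\|x\|_2 \le b$, the empirical $\ell_2$ covering number of the class $F$ on $n$ points satisfies

$$\log_2 \mathcal{N}_2(\epsilon, F, n) \;\le\; \left\lceil \frac{a^2 b^2}{\epsilon^2} \right\rceil \log_2(2n + 1).$$

The bound depends on the norm constraints and sample size but not on the input dimension, which is what makes such results applicable to kernel machines and large networks.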
TNN 2010
Novel maximum-margin training algorithms for supervised neural networks
This paper proposes three novel training methods, two of them based on the back-propagation approach and a third one based on information theory for Multilayer Perceptron (MLP) bin...
Oswaldo Ludwig, Urbano Nunes
IJON 2007
Convex incremental extreme learning machine
Unlike the conventional neural network theories and implementations, Huang et al. [Universal approximation using incremental constructive feedforward networks with random hidden n...
Guang-Bin Huang, Lei Chen
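The incremental construction Huang et al. analyze can be sketched in a few lines: hidden nodes with random input weights are added one at a time, and each new node's output weight is set in closed form to best reduce the current residual error, so the training error never increases. The 1-D regression data below is a hypothetical toy example, not the paper's experiments.

```python
import numpy as np

# Toy regression target (assumed for illustration).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 1))
y = np.sin(3 * X[:, 0])

residual = y.copy()
model = []                            # list of (input weight, bias, output weight)
for _ in range(50):
    a = rng.normal(size=1)            # random input weight for the new node
    b = rng.normal()                  # random bias
    h = np.tanh(X @ a + b)            # new node's activations on the data
    beta = (residual @ h) / (h @ h)   # closed-form output weight for this node
    residual -= beta * h              # residual shrinks (or stays equal) each step
    model.append((a, b, beta))

rmse = np.sqrt(np.mean(residual ** 2))
```

Because each `beta` is the least-squares-optimal weight for that single node, the squared residual decreases by $(r^\top h)^2 / (h^\top h) \ge 0$ at every addition, which is the core of the convergence argument.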
ML 2006 (ACM)
Classification-based objective functions
Backpropagation, similar to most learning algorithms that can form complex decision surfaces, is prone to overfitting. This work presents classification-based objective functions, ...
Michael Rimer, Tony Martinez
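A classification-based objective in the spirit described above can be sketched as follows: rather than pushing outputs toward hard 0/1 targets, a weight update is applied only when the correct class fails to beat the competing class by a margin `mu`, so already-correct patterns stop contributing error. The linear two-class model and toy data are hypothetical simplifications, not the paper's MLP formulation.

```python
import numpy as np

# Hypothetical separable data: label = sign of x0 - x1.
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
y = (X[:, 0] - X[:, 1] > 0).astype(int)

W = np.zeros((2, 2))      # one weight row per class
mu, lr = 0.5, 0.1
for _ in range(20):
    for xi, yi in zip(X, y):
        scores = W @ xi
        wrong = 1 - yi
        # Update only when the margin over the wrong class is insufficient;
        # correctly classified patterns with a comfortable margin are left alone.
        if scores[yi] - scores[wrong] < mu:
            W[yi] += lr * xi
            W[wrong] -= lr * xi

pred = (X @ W.T).argmax(axis=1)
acc = np.mean(pred == y)
```

Limiting updates to patterns near or across the decision boundary is one simple way to reduce the overfitting pressure that hard target values create.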