Sciweavers

374 search results (page 47 of 75) for "Training the random neural network using quasi-Newton method..."
ICANN
2009
Springer
Evolving Memory Cell Structures for Sequence Learning
The best recent supervised sequence learning methods use gradient descent to train networks of miniature nets called memory cells. The most popular cell structure seems somewhat ar...
Justin Bayer, Daan Wierstra, Julian Togelius, Jü...
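The memory cells this abstract refers to are the gated units of LSTM-style recurrent networks. As a point of reference, here is a minimal NumPy sketch of one forward step of a standard LSTM memory cell; it illustrates the generic cell structure only, not the evolved cell variants the paper proposes, and the gate ordering in `W` and `b` is a convention chosen for this sketch.

```python
import numpy as np

def lstm_cell_step(x, h_prev, c_prev, W, b):
    """One forward step of a standard LSTM memory cell.

    x: input vector (d,); h_prev, c_prev: previous hidden/cell states (n,)
    W: weight matrix (4n, d+n); b: bias (4n,)
    Gate order in W and b (a convention for this sketch):
    input, forget, output, candidate.
    """
    n = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = 1.0 / (1.0 + np.exp(-z[:n]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2*n]))     # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*n:3*n]))   # output gate
    g = np.tanh(z[3*n:])                    # candidate cell update
    c = f * c_prev + i * g                  # new cell state
    h = o * np.tanh(c)                      # new hidden state
    return h, c
```

In training, such a step is unrolled over the sequence and the gate weights are learned by gradient descent through time.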
COLT
1995
Springer
Regression NSS: An Alternative to Cross Validation
The Noise Sensitivity Signature (NSS), originally introduced by Grossman and Lapedes (1993), was proposed as an alternative to cross validation for selecting network complexity. I...
Michael P. Perrone, Brian S. Blais
JMLR
2012
Krylov Subspace Descent for Deep Learning
In this paper, we propose a second-order optimization method to learn models where both the dimensionality of the parameter space and the number of training samples are high. In ou...
Oriol Vinyals, Daniel Povey
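The core idea of Krylov subspace methods of this kind is to avoid forming the Hessian explicitly: a low-dimensional search subspace span{g, Hg, H²g, ...} is built from Hessian-vector products alone, and the quadratic model is minimized inside it. Below is a minimal sketch under that assumption; `grad_fn` and `hvp_fn` are hypothetical black-box callables, and the exact subspace construction and step selection in the paper may differ.

```python
import numpy as np

def krylov_subspace_step(w, grad_fn, hvp_fn, k=4):
    """One simplified Krylov-subspace descent step.

    Builds an orthonormal basis V of span{g, Hg, ..., H^(k-1) g}
    via Hessian-vector products, then minimizes the local quadratic
    model f(w) + g^T V a + 0.5 a^T (V^T H V) a over coefficients a.
    """
    g = grad_fn(w)
    V = []
    v = g / np.linalg.norm(g)
    for _ in range(k):
        # Gram-Schmidt orthonormalization against the basis so far.
        for u in V:
            v = v - (u @ v) * u
        nv = np.linalg.norm(v)
        if nv < 1e-12:
            break                      # subspace has closed early
        v = v / nv
        V.append(v)
        v = hvp_fn(w, v)               # next Krylov direction
    V = np.stack(V, axis=1)            # shape (dim, k)
    # Project the Hessian into the subspace and solve the small system.
    HV = np.stack([hvp_fn(w, V[:, j]) for j in range(V.shape[1])], axis=1)
    a = np.linalg.solve(V.T @ HV, -(V.T @ g))
    return w + V @ a
```

On a strictly convex quadratic, a full-dimensional Krylov basis makes this single step exact; in deep learning the appeal is that only a handful of Hessian-vector products per step are needed.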
ASUNAM
2010
IEEE
Semi-Supervised Classification of Network Data Using Very Few Labels
The goal of semi-supervised learning (SSL) methods is to reduce the amount of labeled training data required by learning from both labeled and unlabeled instances. Macskassy and Pr...
Frank Lin, William W. Cohen
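A standard way to classify network data from very few labels is graph label propagation: each unlabeled node repeatedly averages the class scores of its neighbours while labeled nodes stay clamped. The sketch below illustrates that general family of methods; it is not the specific algorithm of this paper, and the affinity matrix `W` and iteration count are assumptions of the sketch.

```python
import numpy as np

def propagate_labels(W, labels, n_iters=50):
    """Harmonic-style label propagation on a graph.

    W: symmetric non-negative affinity matrix (n, n)
    labels: int array, -1 for unlabeled nodes, class id otherwise.
    Unlabeled nodes iteratively take the neighbour-weighted average
    of class scores; labeled nodes are clamped to their class.
    """
    n = len(labels)
    classes = sorted(set(labels[labels >= 0]))
    F = np.zeros((n, len(classes)))            # class-score matrix
    P = W / W.sum(axis=1, keepdims=True)       # row-normalized transitions
    clamp = labels >= 0
    for idx, c in enumerate(classes):
        F[labels == c, idx] = 1.0
    for _ in range(n_iters):
        F = P @ F                              # diffuse scores to neighbours
        F[clamp] = 0.0                         # re-clamp labeled nodes
        for idx, c in enumerate(classes):
            F[labels == c, idx] = 1.0
    return F.argmax(axis=1)
```

With one labeled node per cluster, the scores diffuse through each cluster's dense internal edges, so even a single seed label per class can classify the rest of the graph.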
ICANN
2010
Springer
Solving Independent Component Analysis Contrast Functions with Particle Swarm Optimization
Independent Component Analysis (ICA) is a statistical method that transforms a random vector into another one whose components are independent. Because the marginal distr...
Jorge Igual, Jehad I. Ababneh, Raul Llinares, Juli...
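ICA contrast functions score how non-Gaussian (and hence how independent) a candidate projection is, and particle swarm optimization (PSO) is a derivative-free way to maximize such a score. As a hedged illustration only, here is a kurtosis-based contrast for whitened 2-D data together with a minimal 1-D PSO; the paper's actual contrast functions and swarm parameters may well differ, and the inertia/acceleration constants below are conventional defaults chosen for the sketch.

```python
import numpy as np

def kurtosis_contrast(theta, X):
    """|excess kurtosis| of the projection of whitened 2-D data X
    onto the unit vector at angle theta; larger means less Gaussian."""
    w = np.array([np.cos(theta), np.sin(theta)])
    s = w @ X
    return abs(np.mean(s**4) - 3.0 * np.mean(s**2) ** 2)

def pso_maximize(f, lo, hi, n_particles=20, n_iters=60, seed=0):
    """Minimal particle swarm optimizer maximizing f over [lo, hi]."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n_particles)       # particle positions
    v = np.zeros(n_particles)                  # particle velocities
    pbest, pbest_f = x.copy(), np.array([f(xi) for xi in x])
    g = pbest[pbest_f.argmax()]                # global best position
    for _ in range(n_iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(xi) for xi in x])
        improved = fx > pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[pbest_f.argmax()]
    return g
```

In practice the data must be centered and whitened before the contrast is optimized; for whitened 2-D mixtures of non-Gaussian sources, maximizing the contrast over the rotation angle recovers a source direction.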