Sciweavers

35 search results (page 3 of 7)
Query: Regularization Learning and Early Stopping in Linear Network...
ICML 2004 (IEEE)
Links between perceptrons, MLPs and SVMs
We propose to study links between three important classification algorithms: Perceptrons, Multi-Layer Perceptrons (MLPs) and Support Vector Machines (SVMs). We first study ways to...
Ronan Collobert, Samy Bengio
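
One classical link of this kind, given here as background rather than as the paper's own derivation: stochastic gradient descent on the regularized hinge loss of a linear SVM reduces to a perceptron-style update that fires only on margin violations. A minimal Python sketch, with all names and hyperparameters illustrative:

```python
import numpy as np

def hinge_sgd(X, y, lam=0.01, lr=0.1, epochs=10):
    """Linear SVM via SGD on the regularized hinge loss; y in {-1, +1}.
    Illustrative sketch: the update below only corrects the weights on
    margin violations, exactly like a (margin) perceptron."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) < 1:           # margin violation
                w += lr * (yi * xi - lam * w)    # perceptron-style correction
            else:
                w -= lr * lam * w                # only weight decay
    return w
```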
JMLR 2010
Kernel Partial Least Squares is Universally Consistent
We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Lea...
Gilles Blanchard, Nicole Krämer
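
For context, kernel PLS extracts latent score directions by running a NIPALS-style iteration in the feature space induced by the kernel, deflating the Gram matrix after each component. The sketch below is an illustrative implementation under that assumption; the function name and deflation scheme are not taken from the paper:

```python
import numpy as np

def kernel_pls(K, Y, n_components=5, tol=1e-8, max_iter=500):
    """NIPALS-style kernel PLS sketch. K: centered n x n Gram matrix,
    Y: n x q response matrix. Returns the n x k matrix of score vectors."""
    Kd, Yd = K.copy(), Y.astype(float).copy()
    n = K.shape[0]
    scores = []
    for _ in range(n_components):
        u = Yd[:, 0].copy()
        for _ in range(max_iter):
            t = Kd @ u
            t /= np.linalg.norm(t)               # feature-space score
            c = Yd.T @ t                          # response loadings
            u_new = Yd @ c
            u_new /= np.linalg.norm(u_new)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        # deflate kernel and responses by the extracted score t
        P = np.eye(n) - np.outer(t, t)
        Kd = P @ Kd @ P
        Yd = Yd - np.outer(t, t @ Yd)
        scores.append(t)
    return np.array(scores).T
```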
ALT 2008 (Springer)
Entropy Regularized LPBoost
In this paper we discuss boosting algorithms that maximize the soft margin of the produced linear combination of base hypotheses. LPBoost is the most straightforward boosting algor...
Manfred K. Warmuth, Karen A. Glocer, S. V. N. Vish...
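
For reference, the soft-margin linear program that LPBoost solves can be written as follows; this is the standard formulation, not the paper's entropy-regularized variant (the paper's contribution is to add a relative-entropy regularizer to stabilize it). Here w weights the base hypotheses h_t, rho is the margin, xi_n are slacks, and D trades margin against slack:

```latex
% Standard soft-margin LP solved by LPBoost (background formulation)
\begin{aligned}
\max_{\rho,\, w,\, \xi}\quad & \rho - D \sum_{n=1}^{N} \xi_n \\
\text{s.t.}\quad & y_n \sum_{t} w_t\, h_t(x_n) \;\ge\; \rho - \xi_n,
  \qquad n = 1,\dots,N, \\
& \textstyle\sum_{t} w_t = 1, \qquad w \ge 0, \qquad \xi \ge 0.
\end{aligned}
```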
JMLR 2002
Covering Number Bounds of Certain Regularized Linear Function Classes
Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In many of these theoretical stud...
Tong Zhang
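
A representative bound of this type, stated from memory and therefore best treated as an illustration rather than the paper's exact theorem: for the 2-norm regularized linear class F = { x -> w.x : ||w||_2 <= a } with inputs bounded by ||x||_2 <= b, the empirical l-infinity covering number satisfies

```latex
% Illustrative covering number bound for a 2-norm regularized
% linear class; note it grows only logarithmically in the sample size n.
\log_2 N_\infty(F, \epsilon, n)
  \;\le\; \left\lceil \frac{a^2 b^2}{\epsilon^2} \right\rceil
  \log_2 (2n + 1).
```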
JMLR 2012
Deep Learning Made Easier by Linear Transformations in Perceptrons
We transform the outputs of each hidden neuron in a multi-layer perceptron network to have zero output and zero slope on average, and use separate shortcut connections to model th...
Tapani Raiko, Harri Valpola, Yann LeCun
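
A minimal numpy sketch of the kind of transformation described, assuming a tanh nonlinearity: each unit's output is shifted by a linear term alpha*z + beta chosen so that, over the batch, the mean output and the mean slope are both zero, while a linear shortcut carries the linear part of the mapping straight from input to output. The closed-form per-batch coefficients and the shortcut weights here are illustrative, not the paper's exact training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def transformed_hidden(X, W1):
    """Hidden layer whose per-unit outputs are shifted to have zero mean
    and zero average slope over the batch (illustrative sketch)."""
    Z = X @ W1
    H = np.tanh(Z)
    slope = 1.0 - H ** 2                   # tanh'(z)
    alpha = slope.mean(axis=0)             # removes the average slope
    beta = (H - alpha * Z).mean(axis=0)    # removes the remaining mean
    return H - alpha * Z - beta

# Forward pass with a hypothetical linear shortcut from inputs to outputs.
X = rng.standard_normal((64, 10))
W1 = rng.standard_normal((10, 32)) * 0.1   # input -> hidden
W2 = rng.standard_normal((32, 1)) * 0.1    # hidden -> output
S = rng.standard_normal((10, 1)) * 0.1     # shortcut: input -> output
y = transformed_hidden(X, W1) @ W2 + X @ S
```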