Sciweavers

Wall of Fame | Most Viewed JMLR-2006 Paper
389 views · JMLR 2006 · 13 years 4 months ago
A Very Fast Learning Method for Neural Networks Based on Sensitivity Analysis
This paper introduces a learning method for two-layer feedforward neural networks based on sensitivity analysis, which uses a linear training algorithm for each of the two layers....
Enrique Castillo, Bertha Guijarro-Berdiñas,...
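The abstract above describes training a two-layer feedforward network by applying a linear training algorithm to each layer. As a rough illustration of that idea only, the sketch below trains each layer of a small network by solving a linear least-squares problem after inverting the logistic nonlinearity. It is not the exact sensitivity-analysis algorithm of Castillo et al.; in particular, the randomly generated hidden-layer targets and the function names (train_two_layer, predict) are assumptions made for brevity.

import numpy as np

# Illustrative sketch: per-layer linear training of a two-layer sigmoid network.
# NOT the exact method of Castillo et al. (JMLR 2006); the random hidden targets
# below are a simplifying assumption.

rng = np.random.default_rng(0)

def train_two_layer(X, Y, n_hidden=20, eps=1e-6):
    """X: (n_samples, n_in); Y: (n_samples, n_out) with targets in (0, 1)."""
    n = X.shape[0]
    Xb = np.hstack([X, np.ones((n, 1))])  # add bias column

    # Layer 1: choose provisional hidden targets (random projection squashed by a
    # sigmoid), invert the nonlinearity, and solve a linear least-squares problem.
    H_target = 1.0 / (1.0 + np.exp(-X @ rng.normal(size=(X.shape[1], n_hidden))))
    Z_target = np.log(np.clip(H_target, eps, 1 - eps) /
                      np.clip(1 - H_target, eps, 1 - eps))  # logit = inverse sigmoid
    W1, *_ = np.linalg.lstsq(Xb, Z_target, rcond=None)

    # Forward pass through the now-fixed first layer.
    H = 1.0 / (1.0 + np.exp(-(Xb @ W1)))
    Hb = np.hstack([H, np.ones((n, 1))])

    # Layer 2: invert the output nonlinearity on the true targets, solve linearly.
    Zy = np.log(np.clip(Y, eps, 1 - eps) / np.clip(1 - Y, eps, 1 - eps))
    W2, *_ = np.linalg.lstsq(Hb, Zy, rcond=None)
    return W1, W2

def predict(X, W1, W2):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    H = 1.0 / (1.0 + np.exp(-(Xb @ W1)))
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-(Hb @ W2)))

# Tiny usage example on XOR, with targets scaled into (0, 1).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.1], [0.9], [0.9], [0.1]])
W1, W2 = train_two_layer(X, Y, n_hidden=8)
print(np.round(predict(X, W1, W2), 2))

Because each layer reduces to a linear system, the whole procedure avoids iterative gradient descent, which is the intuition behind the "very fast" training the paper's title refers to.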
Disclaimer and Copyright Notice
Sciweavers respects the rights of all copyright holders; accordingly, authors are only allowed to share a link to the preprint of their paper hosted on their own website. Every contribution is associated with a descriptive image, and it is the sole responsibility of the authors to ensure that the posted image does not infringe copyright. This service is compliant with IEEE copyright policy.
[Table: the 86 most-viewed JMLR 2006 papers, each listed with a "Download preprint from source" link; view counts range from 389 down to 61.]