Sciweavers

641 search results - page 11 / 129
» Training Methods for Adaptive Boosting of Neural Networks
IJIT
2004
A Comparison of First and Second Order Training Algorithms for Artificial Neural Networks
Minimization methods for training feed-forward networks with backpropagation are compared. Feed-forward network training is a special case of functional minimization, where no expli...
Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C...
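The distinction the paper draws between first- and second-order training is easiest to see on a small quadratic problem. The sketch below is a hypothetical illustration (not code from the paper): it minimizes a least-squares loss once with plain gradient descent and once with a Newton step that uses the exact Hessian.

```python
import numpy as np

# Toy least-squares loss L(w) = 0.5 * ||X w - y||^2, whose gradient is
# X^T (X w - y) and whose Hessian is the constant matrix X^T X.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=50)

def grad(w):
    return X.T @ (X @ w - y)

hessian = X.T @ X

# First-order method: gradient descent with a fixed learning rate.
w_first = np.zeros(3)
for _ in range(200):
    w_first -= 0.01 * grad(w_first)

# Second-order method: a Newton step uses curvature (the Hessian),
# so this quadratic problem is solved in a single update.
w_second = np.zeros(3)
w_second -= np.linalg.solve(hessian, grad(w_second))

print(w_first, w_second)  # both approach the least-squares solution
```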
ISBI
2009
IEEE
Automatic Markup of Neural Cell Membranes Using Boosted Decision Stumps
To better understand the central nervous system, neurobiologists need to reconstruct the underlying neural circuitry from electron microscopy images. One of the necessary tasks is...
Kannan Umadevi Venkataraju, António R. C. P...
IJCAI
2007
Simple Training of Dependency Parsers via Structured Boosting
Recently, significant progress has been made on learning structured predictors via coordinated training algorithms such as conditional random fields and maximum margin Markov ne...
Qin Iris Wang, Dekang Lin, Dale Schuurmans
NECO
2006
Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
The application of boosting techniques to regression problems has received relatively little attention compared with the research aimed at classification problems. This paper ...
Durga L. Shrestha, Dimitri P. Solomatine
Tutorial
Nguyen-Widrow and other Neural Network Weight/Threshold Initialization Methods
Neural networks learn by adjusting numeric values called weights and thresholds. A weight specifies the strength of the connection between two neurons. A threshold is a value,...
Jeff Heaton
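For reference, a minimal sketch of the Nguyen-Widrow scheme for a single hidden layer (an illustrative NumPy implementation, not code from the tutorial): weights are drawn uniformly, each hidden neuron's weight vector is rescaled to magnitude beta = 0.7 * H^(1/n), where H is the number of hidden neurons and n the number of inputs, and thresholds (biases) are drawn uniformly from [-beta, beta].

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
    """Initialize hidden-layer weights and thresholds (biases) with the
    Nguyen-Widrow scheme (sketch; assumes inputs scaled to [-1, 1])."""
    rng = np.random.default_rng() if rng is None else rng

    # Scale factor beta = 0.7 * H^(1/n).
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)

    # Start from small uniform random weights, one row per hidden neuron.
    w = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))

    # Rescale each neuron's weight vector so its Euclidean norm equals beta.
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    w = beta * w / norms

    # Thresholds are drawn uniformly between -beta and beta.
    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b

weights, thresholds = nguyen_widrow_init(n_inputs=4, n_hidden=10)
```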