Sciweavers

28 search results - page 1 / 6
IPPS 1998 (IEEE)
Using the BSP Cost Model to Optimise Parallel Neural Network Training
We derive cost formulae for three different parallelisation techniques for training supervised networks. These formulae are parameterised by properties of the target computer archit...
R. O. Rogers, David B. Skillicorn
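The paper's formulae build on the standard BSP cost model, in which one superstep costs w + h·g + l (local work, h-relation times per-word cost g, plus barrier cost l). The sketch below shows that generic model with one illustrative exemplar-parallel epoch; the parameters c, W and the assumed all-to-all exchange are assumptions for illustration, not the paper's derived formulae.

```python
# Generic BSP superstep cost: w = max local work, h = max words sent/received
# per processor, g = per-word communication cost, l = barrier synchronisation cost.
def bsp_superstep_cost(w, h, g, l):
    return w + h * g + l

# Illustrative exemplar-parallel epoch on p processors: each processor trains on
# n/p patterns (cost c per pattern), then exchanges W words of weight updates
# with every other processor (an h-relation of (p - 1) * W words).
def exemplar_parallel_epoch(n, p, c, W, g, l):
    return bsp_superstep_cost(w=(n // p) * c, h=(p - 1) * W, g=g, l=l)

print(exemplar_parallel_epoch(n=10_000, p=8, c=50, W=1_000, g=4, l=200))
```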
NIPS 1998
Global Optimisation of Neural Network Models via Sequential Sampling
We propose a novel strategy for training neural networks using sequential Monte Carlo algorithms. This global optimisation strategy allows us to learn the probability distribution...
João F. G. de Freitas, Mahesan Niranjan, Ar...
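Sequential Monte Carlo treats the network weights as a state to be tracked: a population of weight samples is propagated, reweighted by the likelihood of each new observation, and resampled. Below is a minimal toy sketch of that idea on a one-parameter "network"; the particle count, noise levels and linear model are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles, sigma_obs, sigma_drift = 500, 0.1, 0.02
particles = rng.normal(0.0, 1.0, size=n_particles)   # one weight per particle

true_w = 0.7
for t in range(100):
    x = rng.normal()
    y = true_w * x + rng.normal(0.0, sigma_obs)       # incoming observation

    particles += rng.normal(0.0, sigma_drift, n_particles)    # propagate (random walk)
    log_like = -0.5 * ((y - particles * x) / sigma_obs) ** 2  # weight by likelihood
    w = np.exp(log_like - log_like.max())
    w /= w.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample

print("posterior mean weight:", particles.mean())
```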
GLOBECOM 2007 (IEEE)
Minimizing Distribution Cost of Distributed Neural Networks in Wireless Sensor Networks
Abstract—This paper presents a novel study on how to distribute neural networks in wireless sensor networks (WSNs) such that the energy consumption is minimized while improving...
Peng Guan, Xiaolin Li
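As a rough illustration of the trade-off being optimised, the sketch below places the neurons of a tiny network onto two sensor nodes and scores each placement by compute energy plus radio energy for activations that cross nodes. The network, energy constants and brute-force search are all illustrative assumptions, not the paper's formulation.

```python
from itertools import product

# Tiny network: two sensed inputs, two hidden neurons, one output.
neurons = ["i1", "i2", "h1", "h2", "o1"]
edges = [("i1", "h1"), ("i1", "h2"), ("i2", "h1"), ("i2", "h2"),
         ("h1", "o1"), ("h2", "o1")]
nodes = [0, 1]                      # two sensor nodes
E_COMPUTE, E_RADIO = 1.0, 5.0       # hypothetical energy units

def energy(placement):              # placement: neuron -> hosting node
    compute = E_COMPUTE * len(neurons)
    radio = E_RADIO * sum(placement[a] != placement[b] for a, b in edges)
    return compute + radio

def feasible(placement):            # each input stays on the node that senses it
    return placement["i1"] == 0 and placement["i2"] == 1

candidates = (dict(zip(neurons, assign))
              for assign in product(nodes, repeat=len(neurons)))
best = min((p for p in candidates if feasible(p)), key=energy)
print("lowest-energy placement:", best, "->", energy(best), "units")
```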
FLAIRS 2008
Small Models of Large Machines
In this paper, we model large support vector machines (SVMs) by smaller networks in order to decrease the computational cost. The key idea is to generate additional training patte...
Pramod Lakshmi Narasimha, Sanjeev S. Malalur, Mich...
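The key idea as stated (generating additional training patterns from the large machine) resembles the sketch below: a trained SVM labels extra perturbed samples, and a much smaller network is fitted to the enlarged set. The dataset, perturbation scheme and model sizes are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
svm = SVC(kernel="rbf").fit(X, y)                      # the "large machine"

rng = np.random.default_rng(0)
X_extra = X[rng.integers(0, len(X), 2000)] + rng.normal(0, 0.1, (2000, 10))
y_extra = svm.predict(X_extra)                         # SVM supplies the labels

small_net = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
small_net.fit(np.vstack([X, X_extra]), np.concatenate([y, y_extra]))
print("small net agrees with SVM on",
      (small_net.predict(X) == svm.predict(X)).mean())
```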
INFORMATICALT 2007
Design and Implementation of Parallel Counterpropagation Networks Using MPI
The objective of this research is to construct parallel models that simulate the behavior of artificial neural networks. The type of network that is simulated in this project is t...
Athanasios Margaris, Stavros Souravlas, Efthimios ...
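A common MPI pattern for this kind of parallelisation is data-parallel training: each rank holds a shard of the patterns, computes local updates, and the updates are combined with Allreduce. The mpi4py sketch below (with a toy Kohonen-style update) shows only that generic pattern; the paper's counterpropagation scheme and MPI layout may differ.

```python
from mpi4py import MPI
import numpy as np

# Run with e.g. `mpiexec -n 4 python train.py`.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Identical random start on every rank (fixed seed), own data shard per rank.
W = np.random.default_rng(42).normal(size=(8, 4))      # 8 inputs -> 4 code vectors
X_local = np.random.default_rng(rank).normal(size=(250, 8))

for epoch in range(10):
    dW_local = np.zeros_like(W)
    for x in X_local:                                  # toy Kohonen-style update
        winner = int(np.argmin(np.linalg.norm(W.T - x, axis=1)))
        dW_local[:, winner] += 0.01 * (x - W[:, winner])
    dW = np.empty_like(dW_local)
    comm.Allreduce(dW_local, dW, op=MPI.SUM)           # combine all ranks' updates
    W += dW / size

if rank == 0:
    print("finished; codebook norm:", np.linalg.norm(W))
```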