Bounds for the computational power and learning complexity of analog neural nets

Abstract. It is shown that high-order feedforward neural nets of constant depth with piecewise-polynomial activation functions and arbitrary real weights can be simulated for Boolean inputs and outputs by neural nets of a somewhat larger size and depth with Heaviside gates and weights from {−1, 0, 1}. This provides the first known upper bound for the computational power of the former type of neural nets. It is also shown that in the case of first-order nets with piecewise-linear activation functions one can replace arbitrary real weights by rational numbers with polynomially many bits without changing the Boolean function that is computed by the neural net. In order to prove these results, we introduce two new methods for reducing nonlinear problems about weights in multilayer neural nets to linear problems for a transformed set of parameters. These transformed parameters can be interpreted as weights in a somewhat larger neural net. As another application of our new proof technique...
Wolfgang Maass
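
To make the two gate types compared in the abstract concrete, here is a minimal Python sketch. Everything in it is illustrative rather than taken from the paper: the saturated-linear function min(1, max(0, t)) stands in for a generic piecewise-linear activation, and the particular weights realizing a Boolean AND are arbitrary choices.

    # Illustrative only: the gate types discussed in the abstract. The
    # activation choice and the example weights are assumptions, not
    # taken from the paper.

    def heaviside_gate(inputs, weights, threshold):
        # Heaviside (threshold) gate; weights restricted to {-1, 0, 1}.
        s = sum(w * x for w, x in zip(weights, inputs))
        return 1 if s >= threshold else 0

    def piecewise_linear_gate(inputs, weights, bias):
        # First-order gate with the saturated-linear activation
        # pi(t) = min(1, max(0, t)), a standard piecewise-linear choice.
        s = bias + sum(w * x for w, x in zip(weights, inputs))
        return min(1.0, max(0.0, s))

    # Both gates compute Boolean AND on {0,1}^2: the first with weights
    # from {-1, 0, 1}, the second with arbitrary real weights.
    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        assert heaviside_gate(x, [1, 1], 2) == (x[0] and x[1])
        assert piecewise_linear_gate(x, [1.5, 1.5], -2.0) == (x[0] and x[1])

The paper's first result says that, up to a modest increase in size and depth, nets built from gates of the second kind (even high-order, with piecewise-polynomial activations and real weights) compute no more Boolean functions than nets built from gates of the first kind.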
Type: Conference
Year: 1993
Where: STOC
Publisher: ACM
Authors: Wolfgang Maass