EUROCOLT 1997, Springer

Vapnik-Chervonenkis Dimension of Recurrent Neural Networks

Most of the work on the Vapnik-Chervonenkis dimension of neural networks has focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, polynomial, piecewise-polynomial and sigmoidal functions. The bounds depend on two independent parameters: the number w of weights in the network, and the length k of the input sequence. In contrast, for feedforward networks, VC dimension bounds can be expressed as a function of w only. An important difference between recurrent and feedforward nets is that a fixed recurrent net can receive inputs of arbitrary length. Therefore we are particularly interested in the case k ≫ w. Ignoring multiplicative constants, the main results say roughly the following: • For architectures with activation σ = an...
Pascal Koiran, Eduardo D. Sontag
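The listing cuts the abstract off mid-sentence. As a hedged aid only, the main bounds are paraphrased below from the published version of the paper (not from this page); here w is the number of weights, k the input-sequence length, multiplicative constants are ignored, and the paper itself should be consulted for the exact statements:

% Hedged paraphrase of the main VC-dimension bounds of Koiran and Sontag,
% up to multiplicative constants; w = number of weights, k = input length.
\begin{align*}
&\sigma = \text{fixed nonlinear polynomial:} && \mathrm{VCdim} \approx wk\\
&\sigma = \text{fixed piecewise polynomial:} && wk \lesssim \mathrm{VCdim} \lesssim w^2 k\\
&\sigma = H \text{ (threshold):} && w \log(k/w) \lesssim \mathrm{VCdim} \lesssim \min\{\, wk \log(wk),\; w^2 + w \log(wk) \,\}\\
&\sigma = \text{standard sigmoid:} && wk \lesssim \mathrm{VCdim} \lesssim w^4 k^2
\end{align*}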
Type: Conference
Year: 1997
Where: EUROCOLT