
AI
1998
Springer

Worst-Case Analysis of the Perceptron and Exponentiated Update Algorithms

The absolute loss is the absolute difference between the desired and predicted outcomes. This paper demonstrates worst-case upper bounds on the absolute loss for the Perceptron learning algorithm and for the Exponentiated Update learning algorithm, which is related to the Weighted Majority algorithm. The bounds characterize the behavior of the algorithms over any sequence of trials, where each trial consists of an example and a desired outcome interval (any value in the interval is an acceptable outcome). The worst-case absolute loss of both algorithms is bounded by the absolute loss of the best linear function in a comparison class, plus a constant dependent on the initial weight vector, plus a per-trial loss. The per-trial loss can be eliminated if the learning algorithm is allowed a tolerance from the desired outcome. For concept learning, the worst-case bounds lead to mistake bounds that are comparable to past results. ∗ This paper is a revised and extended version of Bylander [4].
Tom Bylander
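The abstract's trial protocol (an example plus an acceptable outcome interval, with loss measured as distance to the interval) can be sketched in code. The following is a minimal illustration, not the paper's exact algorithms: the update directions, the learning rate `eta`, and all function names are assumptions chosen for clarity. The additive update stands in for the Perceptron, and the multiplicative, normalized update stands in for Exponentiated Update.

```python
import numpy as np

def absolute_loss(prediction, lo, hi):
    # Zero loss if the prediction falls inside the acceptable
    # interval [lo, hi]; otherwise, distance to the nearer endpoint.
    if prediction < lo:
        return lo - prediction
    if prediction > hi:
        return prediction - hi
    return 0.0

def perceptron_trials(trials, eta=0.1, dim=2):
    # Perceptron-style (additive) updates over a sequence of trials.
    # `trials` is a list of (x, lo, hi); `eta` is an illustrative
    # learning rate, not a value taken from the paper.
    w = np.zeros(dim)                 # initial weight vector
    total_loss = 0.0
    for x, lo, hi in trials:
        y_hat = float(w @ x)          # linear prediction
        total_loss += absolute_loss(y_hat, lo, hi)
        if y_hat < lo:                # predicted too low: move up
            w = w + eta * x
        elif y_hat > hi:              # predicted too high: move down
            w = w - eta * x
    return w, total_loss

def eu_trials(trials, eta=0.1, dim=2):
    # Exponentiated-Update-style (multiplicative) updates: weights
    # stay positive and are renormalized each trial, as in
    # Weighted-Majority-type algorithms.
    w = np.full(dim, 1.0 / dim)       # uniform initial weights
    total_loss = 0.0
    for x, lo, hi in trials:
        y_hat = float(w @ x)
        total_loss += absolute_loss(y_hat, lo, hi)
        if y_hat < lo:
            w = w * np.exp(eta * x)
        elif y_hat > hi:
            w = w * np.exp(-eta * x)
        w = w / w.sum()               # renormalize to the simplex
    return w, total_loss
```

The worst-case bounds in the paper compare `total_loss` accumulated this way against the absolute loss of the best fixed linear function in hindsight, plus terms depending on the initial weights and on a per-trial quantity.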