CoRR 2007, Springer

Algorithmic Complexity Bounds on Future Prediction Errors

We bound the future loss incurred when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution µ by the algorithmic complexity of µ. Here we assume that we are at a time t > 1 and have already observed x = x1···xt. We bound the future prediction performance on xt+1 xt+2 ··· by a new variant of the algorithmic complexity of µ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition, in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems. © 2006 Elsevier Inc. All rights reserved.
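For reference, the total bound of Solomonoff's mentioned in the abstract is commonly stated as follows for binary alphabets (a standard formulation; the exact multiplicative constant varies slightly across presentations):

```latex
% Solomonoff's total bound: for any computable measure \mu on binary
% sequences and Solomonoff's universal predictor M, the cumulative
% expected squared prediction error is finite and bounded by the
% algorithmic (prefix Kolmogorov) complexity K(\mu) of \mu:
\[
  \sum_{t=1}^{\infty}
  \mathbf{E}_{\mu}\!\left[
    \bigl( M(0 \mid x_{1:t-1}) - \mu(0 \mid x_{1:t-1}) \bigr)^{2}
  \right]
  \;\le\; \frac{\ln 2}{2}\, K(\mu).
\]
% The paper's contribution is a bound on the *tail* of this sum, i.e.
% on the loss from time t+1 onward, in terms of a monotone variant of
% the complexity of \mu given the observed prefix x = x_1 \cdots x_t,
% plus the complexity of the randomness deficiency of x.
```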
Added 13 Dec 2010
Updated 13 Dec 2010
Type Journal
Year 2007
Where CoRR
Authors Alexey V. Chernov, Marcus Hutter, Jürgen Schmidhuber