
COLT 2001, Springer

Tracking a Small Set of Experts by Mixing Past Posteriors

In this paper, we examine on-line learning problems in which the target concept is allowed to change over time. In each trial a master algorithm receives predictions from a large set of n experts. Its goal is to predict almost as well as the best sequence of such experts chosen off-line by partitioning the training sequence into k + 1 sections and then choosing the best expert for each section. We build on methods developed by Herbster and Warmuth and consider an open problem posed by Freund where the experts in the best partition are from a small pool of size m. Since k ≫ m, the best expert shifts back and forth between the experts of the small pool. We propose algorithms that solve this open problem by mixing the past posteriors maintained by the master algorithm. We relate the number of bits needed for encoding the best partition to the loss bounds of the algorithms. Instead of paying log n for choosing the best expert in each section, we first pay log (n choose m) bits in the bounds ...
Olivier Bousquet, Manfred K. Warmuth
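
As a rough illustration of the idea sketched in the abstract, the snippet below shows a mixing-past-posteriors style weight update in Python: after the usual exponential loss update, the new posterior is mixed with an average of all previously computed posteriors, so experts that were good earlier can regain weight quickly when the best expert shifts back to a member of the small pool. The names mpp_update, eta, and alpha and the uniform mixing over past posteriors are assumptions for illustration only; the paper analyzes more general mixing schemes and proves the corresponding loss bounds.

```python
import numpy as np

def mpp_update(w, past_posteriors, losses, eta=0.5, alpha=0.05):
    """One trial of a mixing-past-posteriors style update (sketch).

    w: current weight vector over the n experts.
    past_posteriors: list [v_0, ..., v_{t-1}] of earlier posteriors,
        with v_0 the uniform prior.
    losses: per-expert losses observed in this trial.
    eta, alpha: illustrative learning-rate / mixing-rate names.
    """
    # Exponential loss update gives the new posterior v_t.
    v = w * np.exp(-eta * np.asarray(losses, dtype=float))
    v /= v.sum()
    past_posteriors.append(v)
    # Mix v_t with a uniform average of the stored posteriors; this is
    # one simple choice of mixing scheme, not the paper's only one.
    past_avg = np.mean(past_posteriors, axis=0)
    return (1.0 - alpha) * v + alpha * past_avg

# Example: 4 experts, starting from the uniform prior.
posteriors = [np.full(4, 0.25)]
w = posteriors[0]
for losses in ([0.1, 0.9, 0.9, 0.9], [0.9, 0.2, 0.9, 0.9]):
    w = mpp_update(w, posteriors, losses)
    print(w)
```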
Type Conference
Year 2001
Where COLT
Authors Olivier Bousquet, Manfred K. Warmuth