Extended Stochastic Complexity and Minimax Relative Loss Analysis

We are concerned with the problem of sequential prediction using a given hypothesis class of continuously many prediction strategies. An effective performance measure is the minimax relative cumulative loss (RCL), which is the minimum of the worst-case difference between the cumulative loss for any prediction algorithm and that for the best assignment in a given hypothesis class. The purpose of this paper is to evaluate the minimax RCL for general continuous hypothesis classes under general losses. We first derive asymptotic upper and lower bounds on the minimax RCL to show that they match (k/(2c)) ln m within error of o(ln m), where k is the dimension of parameters for the hypothesis class, m is the sample size, and c is a constant depending on the loss function. We thereby show that the cumulative loss attaining the minimax RCL asymptotically coincides with the extended stochastic complexity (ESC), which is an extension of Rissanen's stochastic complexity (SC) into the decision-theoretic...
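In standard notation, the two quantities the abstract describes can be sketched as follows (a reconstruction from the abstract's wording; the symbols for the algorithm A, data sequence x^m, and hypothesis class F are illustrative, not taken from the paper itself):

```latex
% Minimax relative cumulative loss (RCL) over m rounds:
% the minimum over prediction algorithms A of the worst-case gap between
% A's cumulative loss and that of the best fixed strategy f in the class F.
\[
  R_m(\mathcal{F}) \;=\; \min_{A}\,\max_{x^m}
    \Bigl( L_A(x^m) \;-\; \inf_{f \in \mathcal{F}} L_f(x^m) \Bigr)
\]
% Asymptotic evaluation from the matching upper and lower bounds:
% k = parameter dimension, c = constant depending on the loss function.
\[
  R_m(\mathcal{F}) \;=\; \frac{k}{2c}\,\ln m \;+\; o(\ln m)
\]
```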
Kenji Yamanishi
Added 03 Aug 2010
Updated 03 Aug 2010
Type Conference
Year 1999
Where ALT
Authors Kenji Yamanishi