
ECML 2006, Springer
Bandit Based Monte-Carlo Planning
Abstract. For large state-space Markovian Decision Problems, Monte-Carlo planning is one of the few viable approaches to find near-optimal solutions. In this paper we introduce a new...
Levente Kocsis, Csaba Szepesvári
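The paper's title refers to applying bandit ideas to guide Monte-Carlo planning. As a rough illustration of that idea (not the authors' exact algorithm or notation), the sketch below shows a node keeping per-action statistics and selecting actions with a UCB1-style rule; the names Node, EXPLORATION_C, and the constant's value are assumptions made for the example.

```python
import math
import random

EXPLORATION_C = 1.4  # exploration constant (assumed value for illustration)

class Node:
    """One decision point in a Monte-Carlo search tree (illustrative sketch)."""

    def __init__(self, actions):
        self.visits = 0
        # Per-action statistics: visit count and running mean reward.
        self.counts = {a: 0 for a in actions}
        self.values = {a: 0.0 for a in actions}

    def select_action(self):
        # Try each action once before applying the bandit rule.
        untried = [a for a, n in self.counts.items() if n == 0]
        if untried:
            return random.choice(untried)
        # UCB1-style selection: mean reward plus an exploration bonus that
        # grows for rarely tried actions.
        log_n = math.log(self.visits)
        return max(
            self.counts,
            key=lambda a: self.values[a]
            + EXPLORATION_C * math.sqrt(log_n / self.counts[a]),
        )

    def update(self, action, reward):
        # Incrementally update the running mean reward after a simulation.
        self.visits += 1
        self.counts[action] += 1
        self.values[action] += (reward - self.values[action]) / self.counts[action]
```

In a full planner this selection rule would be applied at each tree node during a simulated episode, with rewards from the rollout backed up through `update`; the sketch only covers the bandit-based action choice at a single node.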