ECML 2006, Springer
Bandit Based Monte-Carlo Planning
Abstract. For large state-space Markovian Decision Problems, Monte-Carlo planning is one of the few viable approaches to find near-optimal solutions. In this paper we introduce a new...
Levente Kocsis, Csaba Szepesvári
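The title refers to using bandit-style action selection inside Monte-Carlo planning. As an illustration only, below is a minimal Python sketch of a UCB1-type selection rule, the kind of bandit mechanism such planners build on; the function name, argument layout, and exploration constant are assumptions for this example and are not taken from the paper's own algorithm description.

```python
import math

def ucb1_select(counts, values, c=math.sqrt(2)):
    """Pick the arm (action) maximizing mean reward plus an exploration bonus.

    counts[i] -- number of times arm i has been tried so far
    values[i] -- running mean reward of arm i
    c         -- exploration constant (sqrt(2) is a common default)
    """
    # Try every arm once before applying the bonus formula.
    for i, n in enumerate(counts):
        if n == 0:
            return i
    total = sum(counts)
    return max(
        range(len(counts)),
        key=lambda i: values[i] + c * math.sqrt(math.log(total) / counts[i]),
    )
```

In a planning loop, such a rule would be applied at each decision point to balance exploiting actions with high estimated value against exploring rarely tried ones.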