
Algorithm-Directed Exploration for Model-Based Reinforcement Learning in Factored MDPs

One of the central challenges in reinforcement learning is balancing the exploration/exploitation tradeoff while scaling up to large problems. Although model-based reinforcement learning has been less prominent than value-based methods in addressing these challenges, recent progress has generated renewed interest in model-based approaches: theoretical work on the exploration/exploitation tradeoff has yielded provably sound model-based algorithms such as E3 and Rmax, while work on factored MDP representations has yielded model-based algorithms that can scale up to large problems. Recently the benefits of both achievements have been combined in the Factored E3 algorithm of Kearns and Koller. In this paper, we address a significant shortcoming of Factored E3: namely, that it requires an oracle planner that cannot be feasibly implemented. We propose an alternative approach that uses a practical approximate planner, approximate linear programming, that maintains desirable properti...
Carlos Guestrin, Relu Patrascu, Dale Schuurmans
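The approximate linear programming planner mentioned in the abstract builds on the classical LP formulation of MDP planning: minimize the sum of state values subject to the Bellman inequality V(s) >= R(s,a) + gamma * sum_s' P(s'|s,a) V(s') for every state-action pair. A minimal sketch of that exact LP on a tiny hypothetical flat MDP (the paper's approximate/factored version instead restricts V to a weighted sum of basis functions over state variables, but the constraint structure is the same):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 2-state, 2-action MDP for illustration only.
n_states, gamma = 2, 0.9
P = np.array([            # P[a, s, s'] : transition probabilities
    [[0.8, 0.2], [0.1, 0.9]],   # action 0
    [[0.5, 0.5], [0.3, 0.7]],   # action 1
])
R = np.array([            # R[s, a] : immediate rewards
    [1.0, 0.0],
    [0.0, 2.0],
])

# LP: minimize sum_s V(s)
#     s.t. V(s) >= R(s,a) + gamma * sum_s' P(s'|s,a) V(s')  for all s, a
# Rearranged for linprog's A_ub @ x <= b_ub form:
#     (gamma * P[a,s,:] - e_s) @ V <= -R[s,a]
c = np.ones(n_states)
A_ub, b_ub = [], []
for s in range(n_states):
    for a in range(P.shape[0]):
        A_ub.append(gamma * P[a, s] - np.eye(n_states)[s])
        b_ub.append(-R[s, a])
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(None, None)] * n_states)
V = res.x  # optimal value function V*
```

The exact LP has one constraint per state-action pair, which is intractable for factored MDPs whose state space is exponential in the number of variables; the approximate version keeps the solution practical by working in the low-dimensional space of basis-function weights.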
Type Conference
Year 2002
Where ICML