Partially Observable Markov Decision Processes (POMDPs) are a well-established and rigorous framework for sequential decision-making under uncertainty. POMDPs are well-known to be...
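For reference, the standard POMDP formalism invoked here (a textbook definition, not anything specific to the abstract above) is the tuple $\langle S, A, T, \Omega, O, R, \gamma \rangle$ of states, actions, transition model, observation set, observation model, reward, and discount factor. The agent maintains a belief $b$ over states, updated after each action $a$ and observation $o$ by Bayes' rule:

\[
  b'(s') \;=\; \frac{O(o \mid s', a) \sum_{s \in S} T(s' \mid s, a)\, b(s)}
                    {\sum_{\tilde{s} \in S} O(o \mid \tilde{s}, a) \sum_{s \in S} T(\tilde{s} \mid s, a)\, b(s)}.
\]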
In many settings, a group of agents must come to a joint decision on multiple issues. In practice, this is often done by voting on the issues in sequence. In this paper, we model ...
In previous work [8] we presented a case-based approach to eliciting and reasoning with preferences. A key issue in this approach is the definition of similarity between user prefe...
Consider the problem of protecting endangered species by selecting patches of land to be used for conservation purposes. Typically, the availability of patches changes over time, ...
Daniel Golovin, Andreas Krause, Beth Gardner, Sara...
Markov Decision Processes are a powerful framework for planning under uncertainty, but current algorithms have difficulty scaling to large problems. We present a novel probabil...
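For context, the scaling difficulty mentioned above stems from the standard MDP planning problem (again a textbook statement, not this paper's method): computing the optimal value function, which satisfies the Bellman optimality equation

\[
  V^{*}(s) \;=\; \max_{a \in A} \Big[ R(s, a) + \gamma \sum_{s' \in S} T(s' \mid s, a)\, V^{*}(s') \Big],
\]

and exact methods such as value iteration sweep over the entire state space on every iteration, which becomes infeasible when $|S|$ is large.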