Markov Decision Processes (MDPs), currently a popular method for modeling and solving decision theoretic planning problems, are limited by the Markovian assumption: rewards and dy...
This paper presents a real-time vision-based system to assist a person with dementia in washing their hands. The system uses only video inputs, and assistance is given as either verbal ...
Jesse Hoey, Pascal Poupart, Axel von Bertoldi, Tam...
Based on the theory of Markov Random Fields, a Bayesian regularization model for diffusion tensor images (DTI) is proposed in this paper. The low-degree parameterization of diffus...
Siming Wei, Jing Hua, Jiajun Bu, Chun Chen, Yizhou...
We present a technique for computing approximately optimal solutions to stochastic resource allocation problems modeled as Markov decision processes (MDPs). We exploit two key pro...
Nicolas Meuleau, Milos Hauskrecht, Kee-Eung Kim, L...
Markov Decision Processes (MDPs) have been widely used as a framework for planning under uncertainty. They make it possible to compute optimal sequences of actions in order to achieve a given...
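The entries above all rely on the same basic MDP planning machinery: computing a policy (a sequence of actions in expectation) that maximizes discounted reward. As a point of reference, here is a minimal sketch of value iteration on a toy MDP; the states, actions, transition probabilities, and rewards are invented for illustration and are not taken from any of the papers listed above.

```python
# Minimal value-iteration sketch for a toy MDP (illustrative only; the
# model below is invented, not drawn from the papers listed above).

# Transition model: P[state][action] = list of (probability, next_state, reward)
P = {
    "s0": {
        "left":  [(1.0, "s0", 0.0)],
        "right": [(0.8, "s1", 0.0), (0.2, "s0", 0.0)],
    },
    "s1": {
        "left":  [(1.0, "s0", 0.0)],
        "right": [(0.9, "goal", 1.0), (0.1, "s1", 0.0)],
    },
    "goal": {
        "stay": [(1.0, "goal", 0.0)],
    },
}

GAMMA = 0.95   # discount factor
THETA = 1e-6   # convergence threshold


def value_iteration(P, gamma=GAMMA, theta=THETA):
    """Compute state values and a greedy policy by standard value iteration."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s, actions in P.items():
            # Bellman backup: best expected return over available actions.
            q_values = {
                a: sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for a, outcomes in actions.items()
            }
            best = max(q_values.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < theta:
            break
    # Extract a greedy policy from the converged value function.
    policy = {
        s: max(actions, key=lambda a: sum(p * (r + gamma * V[s2])
                                          for p, s2, r in actions[a]))
        for s, actions in P.items()
    }
    return V, policy


if __name__ == "__main__":
    V, policy = value_iteration(P)
    print("Values:", V)
    print("Policy:", policy)
```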