QEST 2010, IEEE

We consider Markov Decision Processes (MDPs) as transformers on probability distributions: once a scheduler resolves the nondeterminism, the MDP exhibits a behavior that is a sequence of probability distributions. Defining propositions using linear inequalities, one can reason about executions over the space of probability distributions. In this framework, one can analyze properties that cannot be expressed in logics like PCTL, such as expressing bounds on transient rewards and expected values of random variables, and comparing the probability of being in one set of states at a given time with that of being in another set of states. We show that model checking MDPs with this semantics against ω-regular properties is in general undecidable. We then identify special classes of propositions and schedulers with respect to which the model checking problem becomes decidable. We demonstrate the potential usefulness of our results through an example.
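To make the distribution-transformer view concrete, here is a minimal sketch (not from the paper) of a hypothetical 3-state MDP with two actions. A scheduler picks an action at each step, the chosen row-stochastic matrix transforms the current distribution, and a proposition is a linear inequality over that distribution (here, "P(state 0) >= P(state 2)"). The matrices, scheduler, and proposition are all illustrative assumptions.

```python
# Hypothetical 3-state MDP: each action is a row-stochastic transition matrix.
A = [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [1.0, 0.0, 0.0]]
B = [[1.0, 0.0, 0.0],
     [0.2, 0.8, 0.0],
     [0.0, 0.3, 0.7]]

def step(dist, matrix):
    """One transformer step: next distribution = dist * matrix."""
    n = len(dist)
    return [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]

def scheduler(t):
    """An illustrative deterministic scheduler: alternate the two actions."""
    return A if t % 2 == 0 else B

def holds(dist, c=(1.0, 0.0, -1.0), bound=0.0):
    """Linear-inequality proposition c . dist >= bound,
    here encoding "P(state 0) >= P(state 2)"."""
    return sum(ci * di for ci, di in zip(c, dist)) >= bound

dist = [1.0, 0.0, 0.0]   # start in state 0 with probability 1
trace = []               # truth value of the proposition at each time step
for t in range(5):
    trace.append(holds(dist))
    dist = step(dist, scheduler(t))
```

The sequence of distributions, together with the truth values in `trace`, is the "behavior" that the paper's semantics evaluates against temporal properties; the undecidability result concerns checking such behaviors against ω-regular specifications in general.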

Added: 14 Feb 2011
Updated: 14 Feb 2011
Type: Journal
Year: 2010
Where: QEST
Authors: Vijay Anand Korthikanti, Mahesh Viswanathan, Gul Agha, YoungMin Kwon
