We consider Markov Decision Processes (MDPs) as transformers of probability distributions: with respect to a scheduler that resolves the nondeterminism, the MDP can be seen as ex...
Vijay Anand Korthikanti, Mahesh Viswanathan, Gul A...
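A minimal sketch of the distribution-transformer view (illustrative, not taken from the paper; the 3-state MDP and scheduler below are made up): once a memoryless scheduler fixes one action per state, the MDP collapses to a Markov chain, and each step maps the current distribution over states to the next one.

```python
import numpy as np

# Hypothetical 3-state MDP with two actions; P[a][s, s'] is the
# probability of moving from state s to s' when action a is taken.
P = {
    "a": np.array([[0.9, 0.1, 0.0],
                   [0.0, 0.8, 0.2],
                   [0.0, 0.0, 1.0]]),
    "b": np.array([[0.5, 0.5, 0.0],
                   [0.3, 0.5, 0.2],
                   [0.0, 0.1, 0.9]]),
}

# A memoryless scheduler resolves the nondeterminism by fixing one action per state.
scheduler = {0: "a", 1: "b", 2: "a"}

# Under the scheduler, the MDP becomes a Markov chain whose transition
# matrix is assembled row by row from the chosen actions.
M = np.vstack([P[scheduler[s]][s] for s in range(3)])

# The transformer view: one step maps a distribution mu over states to mu @ M.
mu = np.array([1.0, 0.0, 0.0])   # start in state 0 with probability 1
for _ in range(4):
    mu = mu @ M
print(mu)  # distribution over states after 4 steps
```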
Generic representatives have been proposed for the effective combination of symmetry reduction and symbolic representation with BDDs in non-probabilistic model checking. This appro...
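One common reading of generic representatives is counter abstraction: a global state of n symmetric processes is replaced by a vector counting how many processes occupy each local state, so all permutations of a state map to a single representative. A minimal sketch under that reading (local-state names are illustrative):

```python
from collections import Counter

LOCAL_STATES = ("idle", "trying", "critical")

def generic_representative(global_state):
    """Map a tuple of local states of symmetric processes to its generic
    representative: a vector of occupation counts, invariant under any
    permutation of the processes."""
    counts = Counter(global_state)
    return tuple(counts[ls] for ls in LOCAL_STATES)

# Two permutations of the same symmetric state share one representative.
assert generic_representative(("idle", "trying", "idle")) == \
       generic_representative(("trying", "idle", "idle"))   # -> (2, 1, 0)
```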
The supply chain [2], [6] is an important component of business operations. Understanding its stochastic behavior is key to risk analysis and performance evaluation in supply c...
Currently, the statistical framework based on Hidden Markov Models (HMMs) plays a prominent role in speech synthesis, while voice conversion systems based on Gaussian Mixture Model...
We develop a hierarchical approach to planning for partially observable Markov decision processes (POMDPs) in which a policy is represented as a hierarchical finite-state control...
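A minimal sketch of a hierarchical finite-state controller (illustrative, not the paper's construction; node layout and observation names are made up): each node either emits a primitive action or delegates to a sub-controller, and observations drive the transitions between nodes.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Node:
    # A node either emits a primitive action or calls a sub-controller.
    action: str | None = None
    sub: Controller | None = None
    # Observation-labelled edges to successor nodes (by index).
    next: dict[str, int] = field(default_factory=dict)

@dataclass
class Controller:
    nodes: list[Node]
    current: int = 0

    def step(self, observation: str) -> str:
        """Return the next primitive action after seeing `observation`."""
        self.current = self.nodes[self.current].next.get(observation, self.current)
        node = self.nodes[self.current]
        if node.sub is not None:          # descend into the sub-controller
            return node.sub.step(observation)
        return node.action

# A hypothetical two-level policy: the top level delegates navigation to a
# sub-controller and switches to a primitive "grasp" once the goal is observed.
navigate = Controller([Node(action="move-left", next={"wall": 1, "clear": 0}),
                       Node(action="move-right", next={"wall": 0, "clear": 1})])
policy = Controller([Node(sub=navigate, next={"at-goal": 1, "clear": 0, "wall": 0}),
                     Node(action="grasp", next={})])

print(policy.step("wall"))     # delegated to navigate: move-right
print(policy.step("at-goal"))  # top level switches node: grasp
```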