Scalable Verification of Markov Decision Processes

@inproceedings{Legay2014ScalableVO,
  title={Scalable Verification of Markov Decision Processes},
  author={Axel Legay and Sean Sedwards and Louis-Marie Traonouez},
  booktitle={SEFM Workshops},
  year={2014}
}
Markov decision processes (MDPs) are useful for modelling concurrent process optimisation problems, but verifying them with numerical methods is often intractable. Existing approximate approaches do not scale well and are limited to memoryless schedulers. Here we present the basis of scalable verification for MDPs, using an O(1) memory representation of history-dependent schedulers. We thus facilitate scalable learning techniques and the use of massively parallel verification.
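The abstract's "O(1) memory representation" can be illustrated with a small sketch: identify each deterministic, history-dependent scheduler by a single integer seed, and derive its action choice by hashing the seed together with the execution trace so far. This is only an illustrative sketch of the idea, not the paper's implementation; the function name `schedule` and its parameters are assumptions introduced here.

```python
import hashlib

def schedule(sigma, trace, actions):
    """Pick an action deterministically from (scheduler seed, history).

    sigma   : integer identifying one scheduler (its O(1) representation)
    trace   : tuple of states/actions observed so far (the history)
    actions : list of actions enabled in the current state

    Hypothetical helper for illustration, not the paper's code.
    """
    # Hash the seed together with the entire history, so the choice can
    # depend on the past without storing an explicit policy table.
    h = hashlib.sha256(repr((sigma, trace)).encode()).digest()
    # Reduce the hash to an index over the enabled actions.
    return actions[int.from_bytes(h[:8], "big") % len(actions)]
```

Because the same (seed, history) pair always hashes to the same action, each integer seed denotes one deterministic scheduler; sampling seeds then amounts to sampling schedulers, which lends itself to massively parallel simulation.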