Shepherding Hordes of Markov Chains

@inproceedings{Ceska2019ShepherdingHO,
  title={Shepherding Hordes of Markov Chains},
  author={Milan Ceska and Nils Jansen and Sebastian Junges and Joost-Pieter Katoen},
  booktitle={TACAS},
  year={2019}
}
This paper considers large families of Markov chains (MCs) that are defined over a set of parameters with finite discrete domains. Such families occur in software product lines, planning under partial observability, and sketching of probabilistic programs. Simple questions, like ‘does at least one family member satisfy a property?’, are NP-hard. We tackle two problems: distinguish family members that satisfy a given quantitative property from those that do not, and determine a family member… 
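To make the setting concrete, here is a minimal, hypothetical sketch (the model, parameter domains, and threshold below are illustrative, not taken from the paper) of the naive baseline the paper improves upon: enumerate every family member induced by an assignment of the discrete parameters and model-check each one separately. The number of members grows exponentially in the number of parameters, which is exactly why this one-by-one approach does not scale.

from itertools import product

# Discrete parameter domains of a toy family (illustrative values, not from the paper).
DOMAINS = {"p": [0.3, 0.5, 0.7], "q": [0.2, 0.8]}

def family_member(p, q):
    # Transition matrix of one family member: state 3 is the target, state 2 a sink.
    return [
        [0.0,   p,   1 - p, 0.0],  # state 0
        [1 - q, 0.0, 0.0,   q],    # state 1
        [0.0,   0.0, 1.0,   0.0],  # state 2 (fail, absorbing)
        [0.0,   0.0, 0.0,   1.0],  # state 3 (target, absorbing)
    ]

def reach_prob(P, target=3, iters=10_000):
    # Reachability probability from state 0 via value iteration
    # (sufficient for this tiny chain with absorbing target and sink).
    x = [1.0 if s == target else 0.0 for s in range(len(P))]
    for _ in range(iters):
        x = [sum(P[s][t] * x[t] for t in range(len(P))) for s in range(len(P))]
    return x[0]

THRESHOLD = 0.4
for values in product(*DOMAINS.values()):
    assignment = dict(zip(DOMAINS, values))
    prob = reach_prob(family_member(**assignment))
    verdict = "satisfies" if prob >= THRESHOLD else "violates"
    print(f"{assignment}: Pr(reach target) = {prob:.3f} -> {verdict} P >= {THRESHOLD}")

Even in this toy example the loop visits all 3 x 2 family members; the paper's abstraction-refinement and counterexample-guided techniques are designed to avoid such exhaustive enumeration.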
On the Complexity of Reachability in Parametric Markov Decision Processes
TLDR
The complexity of finding parameter values such that the induced MDP satisfies given reachability constraints is studied; all known lower bounds are improved, and ETR-completeness results are provided for distinct variants of this problem.
Parameter Synthesis in Markov Models: A Gentle Survey
TLDR
The main ideas underlying the state-of-the-art algorithms are described; over the last decade these algorithms have established an impressive leap, enabling the fully automated analysis of models with millions of states and thousands of parameters.
Counterexample-guided inductive synthesis for probabilistic systems
TLDR
The crux is to aggressively prune the search space by using counterexamples provided by a probabilistic model checker to automatically synthesise probabilistic models.
Inductive Synthesis for Probabilistic Programs Reaches New Horizons
TLDR
A novel inductive oracle greedily generates counterexamples for violating programs and uses them to prune the family, providing a significantly faster and more effective pruning strategy and leading to an accelerated synthesis process on a wide range of benchmarks.
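The counterexample-guided loop behind this kind of oracle can be sketched roughly as follows (a generic CEGIS skeleton under assumed interfaces; the verify and prune callbacks are placeholders, not the paper's implementation):

def cegis(family, verify, prune_with_counterexample):
    # Generic CEGIS loop over a finite family of candidates:
    # a verifier either accepts a candidate or returns a counterexample,
    # which is then used to prune all members that fail for the same reason.
    candidates = set(family)
    while candidates:
        candidate = next(iter(candidates))      # pick any remaining member
        ok, counterexample = verify(candidate)  # model-check the candidate
        if ok:
            return candidate                    # a satisfying member was found
        pruned = prune_with_counterexample(candidates, counterexample)
        candidates -= pruned | {candidate}      # always discard at least the candidate
    return None                                 # no family member satisfies the property

The performance of such a loop hinges on how many members a single counterexample removes, which is what a greedy counterexample-generating oracle aims to maximize.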
Farkas Certificates and Minimal Witnesses for Probabilistic Reachability Constraints
TLDR
Farkas certificates for lower and upper bounds on minimal and maximal reachability probabilities in Markov decision processes (MDPs) are introduced, using an MDP variant of Farkas’ Lemma.
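To give a flavour of such a certificate (a standard sketch of the upper-bound direction, with notation chosen here rather than taken from the paper): for an MDP with transition probabilities P and an absorbing target set T, any vector z with z(t) = 1 for t in T and z(s) ≥ Σ_{s'} P(s,a,s') · z(s') for every non-target state s and every action a dominates the maximal reachability probabilities, so checking z(s_0) ≤ λ certifies Pr^max_{s_0}(reach T) ≤ λ. Lower bounds are certified dually by Farkas-style vectors over state-action pairs.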
Structured Synthesis for Probabilistic Systems
TLDR
A transformation of models specified in the PRISM probabilistic programming language creates models that account for all possible system configurations by nondeterministic choices, which enables the use of optimized tools for model checking in a black-box fashion.
Model Repair Revamped: On the Automated Synthesis of Markov Chains
TLDR
Two approaches to the automated synthesis of finite-state probabilistic models and programs, based on counterexample-guided abstraction refinement (CEGAR) and counterexample-guided inductive synthesis (CEGIS) respectively, are outlined, and the applicability of these synthesis techniques to the sketching of probabilistic programs, controller synthesis for POMDPs, and software product lines is shown.
Finding Provably Optimal Markov Chains
TLDR
This paper proposes to tackle ETR-hard problems by a tight combination of two significantly different techniques: monotonicity checking and parameter lifting, which is an abstraction technique based on the iterative evaluation of pMCs without parameter dependencies.
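To give a rough idea of parameter lifting (a standard description with illustrative numbers, not the paper's notation): for a parameter region such as x ∈ [0.2, 0.4], each occurrence of x is relaxed independently and replaced by a nondeterministic choice among the region's bounds, so a transition pair (x, 1 − x) becomes a choice between the distributions (0.2, 0.8) and (0.4, 0.6). The resulting dependency-free MDP can be model-checked directly, and its minimal and maximal reachability values bound the pMC's values over the entire region, at the price of the over-approximation introduced by dropping the parameter dependencies.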
...

References

SHOWING 1-10 OF 45 REFERENCES
Synthesis in pMDPs: A Tale of 1001 Parameters
TLDR
It is shown that the synthesis problem for parametric Markov decision processes whose transitions are equipped with affine functions over a finite set of parameters can be formulated as a quadratically-constrained quadratic program (QCQP) and is non-convex in general.
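As a rough sketch of where the quadratic constraints come from (illustrative notation, not the paper's exact encoding): already for a parametric Markov chain whose transition probabilities P(s,s')(x) are affine in the parameters x, the reachability values satisfy p_t = 1 for target states t and p_s = Σ_{s'} P(s,s')(x) · p_{s'} otherwise. Since P(s,s')(x) is affine in the variables x and p_{s'} is itself a variable, each product is quadratic; adding a threshold constraint such as p_{s_0} ≥ λ (and, for pMDPs, inequalities encoding the scheduler's choices) yields a quadratically-constrained quadratic program, which is non-convex in general.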
PROPhESY: A PRObabilistic ParamEter SYnthesis Tool
TLDR
PROPhESY, a tool for analyzing parametric Markov chains (MCs), can compute a rational function (i.e., a fraction of two polynomials in the model parameters) for reachability and expected reward objectives, and supports the novel feature of conditional probabilities.
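A small worked example of such a rational function (an illustrative pMC, not one from the tool paper): take states s_0 and s_1, a target t and a sink f, with s_0 → t with probability p, s_0 → s_1 with probability 1 − p, s_1 → s_0 with probability q, and s_1 → f with probability 1 − q. Writing r for Pr(s_0 ⊨ ◇t) gives r = p + (1 − p) · q · r, hence r = p / (1 − q + pq): a fraction of two polynomials in the parameters p and q, which can then be evaluated or constrained for any parameter instantiation.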
Reachability in Augmented Interval Markov Chains
TLDR
A generalisation of the familiar interval Markov chains (IMCs) in which uncertain transition probabilities are additionally allowed to depend on one another; this preserves the flexibility afforded by IMCs for describing stochastic systems whose parameters are unclear, but also allows specifying transitions with probabilities known to be identical, thereby lending further expressivity.
Model Repair for Probabilistic Systems
TLDR
Using a new version of parametric probabilistic model checking, it is shown how the Model Repair problem can be reduced to a nonlinear optimization problem with a minimal-cost objective function, thereby yielding a solution technique.
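A rough sketch of this reduction (an illustrative formulation under assumptions, not the paper's exact one): introduce repair variables v that perturb selected transition probabilities, require each perturbed row to remain a probability distribution, and solve
  minimize Σ_i v_i²  subject to  Pr_{M(v)}(reach err) ≤ λ,
where the reachability constraint is expressed through the rational function delivered by parametric model checking. The quadratic objective penalizes large perturbations, so an optimal solution corresponds to a minimal-cost repair.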
Symbolic counterexample generation for large discrete-time Markov chains
Model Repair for Markov Decision Processes
TLDR
This paper first formulates a region-based approach, which yields an interval containing the minimal repair cost, and also considers sampling-based approaches, which are faster but unable to provide lower bounds on the repair cost.
Parameter-Independent Strategies for pMDPs via POMDPs
TLDR
This work studies, for the first time, the computation of parameter-independent strategies in parametric MDPs that are expectation optimal, i.e., that optimize the expected reachability probability under the probability distribution over the parameters.
A Symbolic SAT-Based Algorithm for Almost-Sure Reachability with Small Strategies in POMDPs
TLDR
This work first studies the existence of observation-stationary strategies, which is NP-complete, then small-memory strategies, and presents a symbolic algorithm for almost-sure reachability based on an efficient encoding to SAT and the use of a SAT solver.
...