Splitting-based importance-sampling algorithm for fast simulation of Markov reliability models with general repair-policies

Abstract

Markov chains with small transition probabilities arise when modeling the reliability of systems whose individual components are highly reliable and quickly repaired. Complex inter-component dependencies can exist, and the state space involved can be huge, making these models analytically and numerically intractable. Naive simulation is also difficult because the event of interest (system failure) is rare, so a prohibitively large amount of computation is needed to obtain samples of this event. An earlier paper [15] proposed an importance-sampling scheme that provides large efficiency increases over naive simulation for a very general class of models, including reliability models with general repair policies such as deferred and group repairs. However, that scheme incurs a statistical penalty when the corresponding Markov chain has high-probability cycles, as may be the case in reliability models with general repair policies. This paper develops a splitting-based importance-sampling technique that avoids this statistical penalty by splitting paths at high-probability cycles, and thus achieves "bounded relative error" in a stronger sense than in [15].
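To make the rare-event difficulty and the basic importance-sampling idea concrete, here is a minimal Python sketch on a toy three-state embedded chain. It uses simple failure biasing with a likelihood-ratio correction; it is an illustration of generic importance sampling only, not the splitting-based algorithm of this paper, and all names and parameter values (EPS, BIAS, the toy chain itself) are illustrative assumptions.

```python
import random

# Toy embedded Markov chain: state = number of failed components (0, 1, 2).
# From state 1, a second failure occurs with small probability EPS (the rare event);
# otherwise the failed component is repaired and the chain returns to state 0.
# We estimate gamma = P(reach state 2 before returning to state 0 | start in state 1).
# Here gamma = EPS analytically, so the sketch is easy to check.
EPS = 1e-5   # small failure probability (assumed value for illustration)
BIAS = 0.5   # inflated failure probability used under importance sampling

def naive_estimate(n_paths, rng):
    """Plain Monte Carlo: almost no paths ever see the rare second failure."""
    hits = sum(1 for _ in range(n_paths) if rng.random() < EPS)
    return hits / n_paths

def is_estimate(n_paths, rng):
    """Failure-biasing importance sampling (generic, not the paper's scheme):
    sample the failure transition with probability BIAS and reweight each
    failure path by the likelihood ratio EPS / BIAS to keep the estimator unbiased."""
    total = 0.0
    for _ in range(n_paths):
        if rng.random() < BIAS:     # biased failure transition sampled
            total += EPS / BIAS     # likelihood-ratio weight for a failure path
        # repair paths contribute 0 to the indicator, so they add no weight
    return total / n_paths

if __name__ == "__main__":
    rng = random.Random(0)
    print("true value     :", EPS)
    print("naive estimate :", naive_estimate(100_000, rng))
    print("IS estimate    :", is_estimate(100_000, rng))
```

With 100,000 paths the naive estimator typically observes zero or one failure, while the biased estimator concentrates its samples on failure paths and recovers EPS with low variance; the paper's contribution is how to retain this kind of efficiency (bounded relative error) when high-probability cycles would otherwise inflate the likelihood-ratio variance.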

DOI: 10.1109/24.974121


Cite this paper

@article{Juneja2001SplittingbasedIA,
  title   = {Splitting-based importance-sampling algorithm for fast simulation of Markov reliability models with general repair-policies},
  author  = {Sandeep Juneja and Perwez Shahabuddin},
  journal = {IEEE Trans. Reliability},
  year    = {2001},
  volume  = {50},
  pages   = {235-245}
}