Stability of Markovian processes I: criteria for discrete-time chains

@article{Meyn1992StabilityOM,
  title={Stability of Markovian processes I: criteria for discrete-time chains},
  author={Sean P. Meyn and Richard L. Tweedie},
  journal={Advances in Applied Probability},
  year={1992},
  volume={24},
  pages={542--574}
}
  • S. Meyn, R. Tweedie
  • Published 1 September 1992
  • Mathematics
  • Advances in Applied Probability
In this paper we connect various topological and probabilistic forms of stability for discrete-time Markov chains. These include tightness on the one hand and Harris recurrence and ergodicity on the other. We show that these concepts of stability are largely equivalent for a major class of chains (chains with continuous components), or if the state space has a sufficiently rich class of appropriate sets ('petite sets'). We use a discrete formulation of Dynkin's formula to establish unified…
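
For orientation, the drift operator and the discrete Dynkin's formula referred to above are conventionally written as follows (standard notation, not quoted from the paper): for a chain Φ with transition kernel P, a measurable function V ≥ 0, and a suitable stopping time τ,

\[
\Delta V(x) := \int P(x,\mathrm{d}y)\,V(y) - V(x),
\qquad
\mathsf{E}_x\big[V(\Phi_\tau)\big] = V(x) + \mathsf{E}_x\Big[\textstyle\sum_{k=0}^{\tau-1} \Delta V(\Phi_k)\Big].
\]

Stability criteria then follow by bounding ΔV from above outside a petite set.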

Stability of Markovian processes II: continuous-time processes and sampled chains

In this paper we extend the results of Meyn and Tweedie (1992b) from discrete-time parameter to continuous-parameter Markovian processes Φ evolving on a topological space. We consider a number of

Stability of Markovian processes III: Foster–Lyapunov criteria for continuous-time processes

In Part I we developed stability concepts for discrete chains, together with Foster–Lyapunov criteria for them to hold. Part II was devoted to developing related stability concepts for

Lyapunov Analysis for Rates of Convergence in Markov Chains and Random-Time State-Dependent Drift

In this paper we survey approaches to studying the ergodicity of aperiodic and irreducible Markov chains [3], [18], [5], [12], [19]. Various results exist for subgeometric and geometric ergodicity
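
For context, subgeometric rates are typically obtained from a Foster–Lyapunov drift inequality of the following standard form (notation is conventional here, not taken from the survey): for a petite set C, a function V ≥ 1, a constant b < ∞, and a concave, increasing rate function φ with φ(v)/v → 0,

\[
\Delta V(x) \le -\varphi\big(V(x)\big) + b\,\mathbf{1}_C(x);
\]

taking φ linear recovers the familiar geometric-rate condition.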

Stationary Distributions of Continuous-Time Markov Chains: A Review of Theory and Truncation-Based Approximations

This review surveys truncation-based approximation schemes for CTMCs with infinite state spaces, paying particular attention to the schemes' convergence and the errors they introduce, and illustrates their performance with an example of a stochastic reaction network of relevance in biology and chemistry.

Convergence of Markov Processes

The aim of this minicourse is to provide a number of tools that allow one to determine at which speed (if at all) the law of a diffusion process, or indeed a rather general Markov process, approaches

Stability and ergodicity of piecewise deterministic Markov processes

This paper establishes equivalence results regarding recurrence and positive recurrence between a piecewise deterministic Markov process {X(t)} and an embedded discrete-time Markov chain generated by a Markov kernel G which, unlike the resolvent kernel, can be explicitly characterized in terms of the three local characteristics of the PDMP.
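
For context (standard PDMP terminology going back to Davis, not quoted from this paper), the three local characteristics are the deterministic flow between jumps, the jump intensity λ(x), and the post-jump transition measure Q(x, ·).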

Ergodicity and exponential β-mixing bounds for a strong solution of Lévy-driven stochastic differential equations

We provide the following stability results for strong solution processes of Lévy-driven stochastic differential equations: (i) β-mixing property (absolute regularity) for every initial

Ergodic properties and ergodic decompositions of continuous-time Markov processes

In this paper we obtain some ergodic properties and ergodic decompositions of a continuous-time, Borel right Markov process taking values in a locally compact and separable metric space. Initially,
...

References

Showing 1–10 of 40 references

Stability of Markovian processes II: continuous-time processes and sampled chains

In this paper we extend the results of Meyn and Tweedie (1992b) from discrete-time parameter to continuous-parameter Markovian processes Φ evolving on a topological space. We consider a number of

Stability of Markovian processes III: Foster–Lyapunov criteria for continuous-time processes

In Part I we developed stability concepts for discrete chains, together with Foster–Lyapunov criteria for them to hold. Part II was devoted to developing related stability concepts for

Invariant measures for Markov chains with no irreducibility assumptions

  • R. Tweedie
  • Mathematics
    Journal of Applied Probability
  • 1988
Foster's criterion for positive recurrence of irreducible countable space Markov chains is one of the oldest tools in applied probability theory. In various papers in JAP and AAP it has been shown

Markov Chains with Continuous Components

The structure and solidarity properties of general Markov chains satisfying the measure-theoretic condition of φ-irreducibility, for some φ, are now well known (see, for example,

Criteria for classifying general Markov chains

  • R. Tweedie
  • Mathematics
    Advances in Applied Probability
  • 1977
The aim of this paper is to present a comprehensive set of criteria for classifying as recurrent, transient, null or positive the sets visited by a general state space Markov chain. When the chain is

A note on the geometric ergodicity of a Markov chain

  • K. Chan
  • Mathematics
    Advances in Applied Probability
  • 1989
It is known that if an irreducible and aperiodic Markov chain satisfies a 'drift' condition in terms of a non-negative measurable function g(x), it is geometrically ergodic. See, e.g., Nummelin
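
In its now-standard form (stated conventionally here rather than quoted from Chan's note), the drift condition asks for constants λ ∈ (0,1), b < ∞, a small set C, and a function g ≥ 1 such that

\[
\mathsf{E}\big[g(\Phi_1) \mid \Phi_0 = x\big] \le \lambda\, g(x) + b\,\mathbf{1}_C(x) \quad \text{for all } x,
\]

which, for an irreducible aperiodic chain, yields geometric ergodicity.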

On the Stochastic Matrices Associated with Certain Queuing Processes

We shall be concerned with an irreducible Markov chain, which we shall call "the system." For simplicity we shall assume that the system is aperiodic, but this is not essential. The reader is

Ergodic theorems for discrete time stochastic systems using a stochastic lyapunov function

Sufficient conditions are established under which the law of large numbers and related ergodic theorems hold for nonlinear stochastic systems operating under feedback. It is shown that these