Part I: Markov Chains and Stochastic Sampling
1 Markov Chains and Random Walks on Graphs
1.1 Structure of Finite Markov Chains

Published 2007

Abstract

We shall only consider Markov chains with a finite, but usually very large, state space S = {1, . . . , n}. An S-valued (discrete-time) stochastic process is a sequence X0, X1, X2, . . . of S-valued random variables over some probability space Ω, i.e. a sequence of (measurable) maps Xt : Ω → S, t = 0, 1, 2, . . . Such a process is a Markov chain if for all t ≥ 0…
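To make the definition concrete, the following is a minimal sketch of simulating such a finite-state chain, assuming the chain is given by a row-stochastic transition matrix P where P[i][j] is the probability of moving from state i to state j (states are indexed 0, . . . , n−1 here rather than 1, . . . , n as in the text). The function name and interface are illustrative, not from the notes.

```python
import random

def simulate_chain(P, x0, steps, seed=0):
    """Simulate a finite Markov chain with transition matrix P.

    P[i][j] is the probability of moving from state i to state j
    (states 0..n-1). Returns the trajectory X_0, X_1, ..., X_steps.
    """
    rng = random.Random(seed)
    traj = [x0]
    x = x0
    for _ in range(steps):
        # Sample the next state from the distribution in row P[x]
        # by inverting the cumulative distribution.
        u = rng.random()
        acc = 0.0
        for j, p in enumerate(P[x]):
            acc += p
            if u < acc:
                x = j
                break
        traj.append(x)
    return traj

# Hypothetical two-state chain: stay with probability 0.9,
# switch with probability 0.1.
P = [[0.9, 0.1],
     [0.1, 0.9]]
path = simulate_chain(P, 0, 10)
```

By construction, the next state depends only on the current state x, which is exactly the Markov property the abstract is about to state.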
