We shall only consider Markov chains with a finite, but usually very large, state space S = {1, . . . , n}. An S-valued (discrete-time) stochastic process is a sequence X0, X1, X2, . . . of S-valued random variables over some probability space Ω, i.e. a sequence of (measurable) maps Xt : Ω → S, t = 0, 1, 2, . . . Such a process is a Markov chain if for all t ≥ 0 and all states i0, . . . , it−1, i, j ∈ S,

Pr[Xt+1 = j | Xt = i, Xt−1 = it−1, . . . , X0 = i0] = Pr[Xt+1 = j | Xt = i],

i.e. the distribution of the next state depends only on the current state, not on the earlier history of the process.
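As a minimal sketch of this definition, the following hypothetical example simulates a Markov chain on a small state space S = {0, 1, 2} (the transition matrix P and all names here are illustrative assumptions, not taken from the text): each entry P[i][j] plays the role of Pr[Xt+1 = j | Xt = i], and the next state is sampled from the row of the current state only, which is exactly the Markov property.

```python
import random

# Assumed example transition matrix on S = {0, 1, 2}:
# P[i][j] = Pr(X_{t+1} = j | X_t = i); each row sums to 1.
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
]

def simulate(P, x0, steps, seed=0):
    """Return a trajectory X_0, X_1, ..., X_steps of the chain."""
    rng = random.Random(seed)
    path = [x0]
    x = x0
    for _ in range(steps):
        # Markov property: the distribution of the next state depends
        # only on the current state x, via row P[x].
        x = rng.choices(range(len(P)), weights=P[x])[0]
        path.append(x)
    return path

trajectory = simulate(P, x0=0, steps=10)
print(trajectory)
```

Running the sketch prints one random trajectory of length 11 (the start state plus 10 steps); changing `seed` gives a different sample path from the same chain.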