6.895 Randomness and Computation

1 Random Walks
1.1 Markov Chains
1.2 Random Walk on a Graph

1.1 Markov Chains

Let Ω be a set of states (for the purposes of this class, Ω is always finite, so we can think of the states as nodes in a graph). A Markov chain is a sequence of random variables $X_0, X_1, \dots, X_t, \dots \in \Omega$ that obeys the "Markovian property": for all states $y, x_0, \dots, x_t$,
$$\Pr[X_{t+1} = y \mid X_0 = x_0, X_1 = x_1, \dots, X_t = x_t] = \Pr[X_{t+1} = y \mid X_t = x_t].$$
In other words, the distribution of the next state depends only on the current state, not on the rest of the history. One can think of the $X_i$'s as states…
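To make the definition concrete, here is a minimal Python sketch (not part of the original notes) that simulates such a chain when Ω = {0, …, n−1} and the one-step probabilities Pr[X_{t+1} = y | X_t = x] are given as a row-stochastic matrix P; the function name simulate_markov_chain and the example matrix are illustrative assumptions, not anything fixed by the notes.

import random

def simulate_markov_chain(P, x0, steps, seed=0):
    """Simulate `steps` transitions of a Markov chain on states {0, ..., n-1}.

    P[x][y] is the probability of moving from state x to state y, so each
    row of P must sum to 1 (P is a row-stochastic transition matrix).
    """
    rng = random.Random(seed)
    x = x0
    trajectory = [x]
    for _ in range(steps):
        # Markovian property: X_{t+1} is drawn from the row P[x] indexed by
        # the current state x alone; the earlier history is never consulted.
        x = rng.choices(range(len(P)), weights=P[x], k=1)[0]
        trajectory.append(x)
    return trajectory

# Example: a 3-state chain; think of the states as nodes of a small graph.
P = [
    [0.5, 0.25, 0.25],
    [0.25, 0.5, 0.25],
    [0.0, 0.5, 0.5],
]
print(simulate_markov_chain(P, x0=0, steps=10))

Note that the loop body reads only the current state x when sampling the next state; this is exactly where the Markovian property enters the simulation.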