Chapter 3 Markov Chains

    Abstract

    Markov chains are among the simplest examples of stochastic processes, i.e. random variables that evolve in time. Markov chains are relatively simple because both the state space and the time parameter are discrete. More importantly, Markov chains (and, for that matter, Markov processes in general) have the basic property that their future evolution is determined by their present state and does not depend on their past.
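
    As an illustration of the Markov property described above, here is a minimal Python sketch simulating a discrete-time chain on three states; in symbols, the property reads P(X_{n+1} = j | X_n = i, X_{n-1}, ..., X_0) = P(X_{n+1} = j | X_n = i). The transition matrix P below is a hypothetical example chosen for illustration, not one taken from the chapter.

        import random

        # Hypothetical transition matrix over states {0, 1, 2};
        # row i holds the probabilities of moving out of state i.
        P = [
            [0.5, 0.3, 0.2],
            [0.1, 0.6, 0.3],
            [0.2, 0.2, 0.6],
        ]

        def step(state):
            # The next state is sampled from the current state's row alone:
            # the past trajectory is never consulted (the Markov property).
            return random.choices(range(3), weights=P[state])[0]

        state = 0
        path = [state]
        for _ in range(10):
            state = step(state)
            path.append(state)
        print(path)  # one possible run: [0, 0, 1, 1, 2, 2, 2, 0, 2, 2]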
