Markov Chains

Markov chains are named for Andrei Andreyevich Markov (see main article).

= The Markov Property =

The Markov property is the idea that the probability of transitioning to the next state depends only on the current state, and not on the history of the process.

Examples
The probability of the roulette wheel showing red, after a run of 11 reds, is independent of that history.

The future direction of a gas molecule depends only on its current position and velocity, not on the path by which it arrived there.

= Description =

Consider a sequence of random variables $X_1, X_2, X_3, \dots$, where the index $t$ is used to denote time (rather than, say, $n$).

A Markov chain is a sequence in which the value of $X_{t+1}$ depends only on $X_t$, and not on the past values $X_{t-1}, X_{t-2}$, etc.

"Given the present, the future is independent of the past."

The values taken by the variables are known as states, and the possible values they can take lie in the state space $S$.
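In this notation, the Markov property can be stated formally (a standard formulation, added here for completeness):

$$ P(X_{t+1} = j \mid X_t = i, X_{t-1} = i_{t-1}, \dots, X_0 = i_0) = P(X_{t+1} = j \mid X_t = i), \qquad i, j, i_0, \dots, i_{t-1} \in S. $$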

Discrete Markov Chains
Many of the concepts required for Markov chains are easier to describe with a discrete Markov chain.

State changes are a good example of a discrete Markov chain, for example when an action today depends only on the action taken yesterday.
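As a minimal sketch (the two-state chain and its transition probabilities below are invented for illustration), a discrete Markov chain can be simulated by sampling each step from the row of a transition matrix indexed by the current state:

```python
import random

# Hypothetical two-state chain: P[i][j] is the probability
# of moving from state i to state j. Rows sum to 1.
P = [[0.9, 0.1],   # from state 0
     [0.5, 0.5]]   # from state 1

def simulate(P, start, steps, rng=random.Random(0)):
    """Simulate a discrete Markov chain: each step depends only on the current state."""
    state = start
    path = [state]
    for _ in range(steps):
        u = rng.random()
        # Sample the next state from the current row's distribution.
        cumulative = 0.0
        for j, p in enumerate(P[state]):
            cumulative += p
            if u < cumulative:
                state = j
                break
        path.append(state)
    return path

path = simulate(P, start=0, steps=10)
```

Note that only `state` is consulted inside the loop; the rest of `path` is recorded but never used to choose the next step, which is exactly the Markov property.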

Continuous Markov chains
Examples include the random walk and the autoregressive (AR) model.

The Markov chains used in MCMC for Bayesian inference are almost always continuous Markov chains.
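Both examples can be sketched in a few lines; the parameters below are illustrative, not from the text. Each new value depends only on the current value plus fresh noise, so the Markov property holds even though the state space is continuous:

```python
import random

def ar1(phi, sigma, x0, steps, rng=random.Random(0)):
    """AR(1) process: X_{t+1} = phi * X_t + Gaussian noise.

    phi = 1 gives the random walk as a special case.
    """
    x = x0
    path = [x]
    for _ in range(steps):
        x = phi * x + rng.gauss(0.0, sigma)  # next value depends only on current x
        path.append(x)
    return path

walk = ar1(phi=1.0, sigma=1.0, x0=0.0, steps=100)  # random walk
ar = ar1(phi=0.5, sigma=1.0, x0=0.0, steps=100)    # mean-reverting AR(1)
```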

Transition distributions
Transition distributions are conditional probabilities such as $P(X_t = s_1 \mid X_0 = s_0)$ and $P(X_{t_2} = s_2 \mid X_{t_1} = s_1, X_0 = s_0)$. Working with Markov chains involves being able to:

classify states (closed or not closed, recurrent or not recurrent, transient)

identify the unique stationary distribution

identify the assumptions made when using a Markov chain to model some process

find the equilibrium vector from the transition matrix

calculate $aP$ and $aP^n$ from an initial vector $a$

find expected return times to states in Markov chains
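The equilibrium vector, $aP^n$, and expected return times can be sketched together on an invented 2-state transition matrix: the equilibrium vector $a$ satisfies $aP = a$ (found here by iterating $aP^n$ until it converges), and for an irreducible, positive recurrent chain the expected return time to state $i$ is $1/a_i$:

```python
# Hypothetical transition matrix; rows sum to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def vec_mat(a, P):
    """Compute the row-vector/matrix product aP."""
    n = len(P)
    return [sum(a[i] * P[i][j] for i in range(n)) for j in range(n)]

def a_P_n(a, P, n):
    """Compute aP^n by repeated multiplication."""
    for _ in range(n):
        a = vec_mat(a, P)
    return a

a = [1.0, 0.0]          # any initial distribution
pi = a_P_n(a, P, 1000)  # converges to the equilibrium (stationary) vector

# Expected return time to state i is 1 / pi[i].
return_times = [1.0 / p for p in pi]
```

For this matrix the equilibrium vector works out to $(5/6, 1/6)$, so the expected return times are $1.2$ and $6$ steps.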

n-step transition probability; for $n = 2$,
$p_{i,j}^{(2)} = \sum_{k \in S} p_{i,k} \, p_{k,j}$

Chapman-Kolmogorov equations: $p_{i,j}^{(m+n)} = \sum_{k \in S} p_{i,k}^{(m)} \, p_{k,j}^{(n)}$
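The Chapman-Kolmogorov equations state that $p_{i,j}^{(m+n)} = \sum_{k \in S} p_{i,k}^{(m)} p_{k,j}^{(n)}$, i.e. n-step transition matrices multiply: $P^{m+n} = P^m P^n$. A quick numerical check, on a made-up transition matrix:

```python
def mat_mul(A, B):
    """Multiply two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """Compute P^n by repeated multiplication (n >= 1)."""
    out = P
    for _ in range(n - 1):
        out = mat_mul(out, P)
    return out

# Hypothetical transition matrix for the check.
P = [[0.9, 0.1],
     [0.5, 0.5]]

lhs = mat_pow(P, 5)                           # P^(2+3)
rhs = mat_mul(mat_pow(P, 2), mat_pow(P, 3))   # P^2 * P^3
```

The two results agree up to floating-point rounding, which is the Chapman-Kolmogorov identity for m = 2, n = 3.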