Random Walks

Discrete Random walks
all random walks are Markov chains, but not all Markov chains are random walks.

the distribution of the simple random walk $X_n$ needs to be derived

ruin: ruin probabilities, limiting cases, expected duration, probability of reverse ruin

example of simple random walk with 2 absorbing barriers

where $X_n = Z_1 + Z_2 + \cdots + Z_n$: show the distribution of $X_n$, hence find its mean and variance; find $P(X_n = x)$; find $U(s)$; find $P(s)$

show $E(X_n)$ and $V(X_n)$

for particular $p$, find $P(X_n > x)$
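The mean and variance of the simple walk can be checked numerically. A minimal sketch (the function name `simple_walk_endpoints` is mine, not from the notes), simulating many walks with step $+1$ w.p. $p$ and $-1$ w.p. $q = 1 - p$, against the standard results $E(X_n) = n(p - q)$ and $V(X_n) = 4npq$:

```python
import random

def simple_walk_endpoints(n_steps, p, n_trials, seed=0):
    """Endpoints X_n of simple random walks with P(step = +1) = p."""
    rng = random.Random(seed)
    endpoints = []
    for _ in range(n_trials):
        x = 0
        for _ in range(n_steps):
            x += 1 if rng.random() < p else -1
        endpoints.append(x)
    return endpoints

p, n = 0.6, 50
xs = simple_walk_endpoints(n, p, n_trials=20000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# theory: E(X_n) = n(p - q) = 10, V(X_n) = 4npq = 48
```

Note the parity constraint: after $n$ steps of $\pm 1$ from $0$, $X_n$ has the same parity as $n$.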

Absorbing
win/loss criteria

end states

gambler's ruin

Reflecting
reset conditions
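One common reset convention for a reflecting barrier at 0 is to push the walker back to 1 whenever it sits at the barrier (other conventions let it stay at 0 with probability $q$). A sketch under that assumption (`reflecting_walk` is an illustrative name):

```python
import random

def reflecting_walk(n_steps, p, seed=1):
    """Walk on {0, 1, 2, ...} with a reflecting barrier at 0:
    from state 0 the walker is reset to 1 on the next step
    (one common convention; not the only one)."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        if x == 0:
            x = 1                      # reset condition at the barrier
        else:
            x += 1 if rng.random() < p else -1
        path.append(x)
    return path

path = reflecting_walk(100, p=0.5)
```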

random walks vs Markov chains
Markov chains are distinguished from random walks in that a general Markov chain places no requirement that the transition probabilities be the same for transitions between corresponding pairs of levels; the step distribution may depend on the current state.

The step distribution of a random walk is independent of the current position, so a random walk is a Markov chain whose transition probabilities are both stationary (time-homogeneous) and spatially homogeneous (independent of the current position).

give a Markov chain that is not a random walk @todo
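One standard example of this kind (illustrative, not taken from the notes): a chain whose step probabilities depend on the current state is Markov but not a random walk, because its increments are not i.i.d. The Ehrenfest-style chain below on $\{0, 1, 2, 3\}$ moves up with probability $(3 - x)/3$:

```python
import random

def ehrenfest_like_chain(n_steps, seed=2):
    """Markov chain on {0, 1, 2, 3} whose up-probability depends on the
    current state: p_up(x) = (3 - x) / 3. The increments are not i.i.d.,
    so this is a Markov chain but not a random walk."""
    rng = random.Random(seed)
    x, path = 1, [1]
    for _ in range(n_steps):
        x += 1 if rng.random() < (3 - x) / 3 else -1
        path.append(x)
    return path

path = ehrenfest_like_chain(200)
```

From state 0 the chain always steps up, and from state 3 always down, so the step distribution clearly varies with position.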

Gambler's ruin
the simplest case of gambler's ruin is for two players, A and B, with starting pots $a$ and $b$ respectively, to play repeated games of chance, A winning each game with probability $p$ and B with probability $q = 1 - p$. Play is repeated until one or the other player is bust, or "ruined".
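This is a simple random walk started at $a$ with absorbing barriers at $0$ and $a + b$. The standard result for A's ruin probability is $\frac{(q/p)^a - (q/p)^{a+b}}{1 - (q/p)^{a+b}}$ for $p \neq q$, and $\frac{b}{a+b}$ for $p = q = \tfrac12$. A sketch comparing the formula with simulation (function names are mine):

```python
import random

def ruin_probability(a, b, p):
    """Exact probability that A (starting pot a) is ruined against B (pot b),
    when A wins each game with probability p."""
    q = 1 - p
    if abs(p - q) < 1e-12:
        return b / (a + b)          # symmetric case
    r = q / p
    n = a + b
    return (r ** a - r ** n) / (1 - r ** n)

def simulate_ruin(a, b, p, n_trials=20000, seed=3):
    """Monte Carlo estimate of A's ruin probability."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_trials):
        x = a
        while 0 < x < a + b:        # play until one player is bust
            x += 1 if rng.random() < p else -1
        ruined += (x == 0)
    return ruined / n_trials
```

For example, `ruin_probability(3, 7, 0.5)` gives $7/10$, and the simulated estimate should agree with the formula to within Monte Carlo error.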

unrestricted random walk
why is an unrestricted random walk a Markov chain? Because $X_{n+1} = X_n + Z_{n+1}$ with the steps $Z_i$ i.i.d., the next position depends on the past only through the current position.

general random walk on the line
allowing steps of non-integer sizes
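A sketch of such a walk with continuous steps (standard Gaussian steps are an illustrative choice; any i.i.d. step distribution gives a general random walk):

```python
import random

def general_walk(n_steps, seed=4):
    """General random walk on the real line: partial sums of i.i.d.
    continuous steps (standard Gaussian here, as an illustrative choice)."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(0.0, 1.0)   # step sizes need not be integers
        path.append(x)
    return path
```

With zero-mean unit-variance steps, $E(X_n) = 0$ and $V(X_n) = n$, mirroring the integer-step results above.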