Description:

  • A memoryless stochastic process over a set of states
  • Like a stochastic successor function: given the current state, the process draws the next state from a fixed distribution
  • Suppose whenever the process is in state i, there is a fixed probability P_ij that it will next move from state i to state j.
    • That is, P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i) = P_ij for all states i_0, ..., i_{n-1}, i, j and all n ≥ 0
    • Meaning the future state depends only on the current state; it is independent of the past states
    • Must hold true for every possible state history

As a matrix:

  • Represented with a matrix P of one-step transition probabilities, each entry P_ij giving the probability of moving from state i to state j
    • ex: for a 2-state chain, P = [[P_00, P_01], [P_10, P_11]]
    • The sum of each row is always 1, as a state must have an outcome
  • Must be a square n × n matrix, where n is the number of possible states
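A minimal sketch of the matrix representation, using made-up probabilities for a hypothetical 2-state weather chain, checking the row-sum property and sampling a step using only the current state:

```python
import random

# Hypothetical 2-state weather chain: state 0 = rain, state 1 = no rain.
# P[i][j] is the one-step probability of moving from state i to state j.
P = [
    [0.7, 0.3],  # from rain
    [0.4, 0.6],  # from no rain
]

# Each row must sum to 1: the chain must land in *some* state.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12

def step(state, P, rng=random):
    """Sample the next state using only the current state (Markov property)."""
    u = rng.random()
    cum = 0.0
    for j, p in enumerate(P[state]):
        cum += p
        if u < cum:
            return j
    return len(P) - 1
```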

Transforming a process to a Markov Chain:

  • If a future state depends on the last 2 states, combine the last two states into 1 combined state (this gives more states: n states become up to n² combined states)
  • However, each combined state must carry the same information at every step, so we pair the current state with the previous one (or with more earlier states, if needed)
  • ex: rain yesterday, rain today to predict raining tomorrow
    • X00: rain ytd + rain today
    • X01: rain ytd + no rain today
    • so from X00 the next combined state is (rain today, rain tomorrow); e.g. the transition X00 → X00 has probability P(rain tomorrow | rain ytd + rain today)
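The rain example above can be sketched in code. Assuming made-up conditional probabilities of rain (the numbers are illustrative, not from the source), build the 4-state combined chain, encoding 0 = rain and 1 = no rain:

```python
# Hypothetical conditional probabilities of rain tomorrow given the
# last two days (made-up numbers for illustration):
p_rain = {
    (0, 0): 0.7,  # rain ytd, rain today
    (0, 1): 0.5,  # rain ytd, no rain today
    (1, 0): 0.4,  # no rain ytd, rain today
    (1, 1): 0.2,  # no rain either day
}

# Combined states: (yesterday, today). A transition out of (y, t) can only
# go to (t, tomorrow) -- the new "yesterday" must equal the old "today".
states = [(0, 0), (0, 1), (1, 0), (1, 1)]
P = [[0.0] * 4 for _ in states]
for i, (y, t) in enumerate(states):
    for j, (y2, t2) in enumerate(states):
        if y2 != t:
            continue  # impossible transition: history would be inconsistent
        p = p_rain[(y, t)]
        P[i][j] = p if t2 == 0 else 1 - p

# Still a valid transition matrix: every row sums to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```

Note how half of each row is structurally zero: the combined chain keeps the state count consistent by forbidding transitions whose histories disagree.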

Chapman-Kolmogorov Equation
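The Chapman-Kolmogorov equations state that the n+m-step transition probabilities satisfy P^(n+m)_ij = Σ_k P^(n)_ik · P^(m)_kj; equivalently, the n-step transition matrix is the n-th power of the one-step matrix. A minimal numerical check with a made-up matrix:

```python
# Chapman-Kolmogorov: P^(n+m)_ij = sum_k P^(n)_ik * P^(m)_kj,
# i.e. the n-step matrix is the n-th power of the one-step matrix.
P = [[0.7, 0.3],
     [0.4, 0.6]]  # made-up one-step transition matrix

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def power(P, n):
    # Start from the identity matrix and multiply by P, n times.
    R = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        R = matmul(R, P)
    return R

# P^(2+3) should equal P^(2) @ P^(3), entry by entry.
lhs = power(P, 5)
rhs = matmul(power(P, 2), power(P, 3))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```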

State in Markov Chain

Reducing Markov chain:

  • Irreducible:
    • means there is only 1 class (every state can reach every other state, so all states communicate)
  • Reducible: more than one class, i.e. some pair of states cannot reach each other
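Irreducibility can be checked mechanically by testing reachability between all pairs of states. A minimal sketch, using BFS over the directed graph that has an edge i → j whenever P_ij > 0:

```python
from collections import deque

def reachable(P, start):
    """States reachable from `start` via edges with positive probability."""
    seen = {start}
    q = deque([start])
    while q:
        i = q.popleft()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                q.append(j)
    return seen

def is_irreducible(P):
    # One class: every state reaches every other state.
    n = len(P)
    return all(len(reachable(P, i)) == n for i in range(n))

# Both states communicate -> one class -> irreducible:
assert is_irreducible([[0.5, 0.5], [0.5, 0.5]])
# State 1 is absorbing and never reaches state 0 -> reducible:
assert not is_irreducible([[0.5, 0.5], [0.0, 1.0]])
```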

Long-run Proportion of Markov Chain

Limiting probability:

  • Equal to the long-run proportion of the Markov chain, if the limit exists
  • Therefore, after a large number of steps, the state at any given step doesn't depend on the initial state
  • Except for a periodic Markov chain
    • ex: the 2-state chain with P_01 = P_10 = 1, as P^n doesn't converge to a long-run proportion for each state
    • Doesn't have a limiting probability
    • Can only return to a state in an exact multiple of the period
      • in this case, every 2 steps
    • The state at any step depends on the first state
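The periodic exception can be checked directly. For the two-state chain that deterministically alternates, the matrix powers cycle instead of converging, so no limiting probability exists:

```python
# Two-state chain that alternates deterministically: period 2.
P = [[0.0, 1.0],
     [1.0, 0.0]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# P^n alternates between P and the identity, so P^n never converges
# and there is no limiting probability.
P2 = matmul(P, P)
P3 = matmul(P2, P)
assert P2 == [[1.0, 0.0], [0.0, 1.0]]  # back to the start after 2 steps
assert P3 == P                          # ...and so on, forever

# The long-run *proportion* of time in each state is still (1/2, 1/2),
# even though the limiting probability does not exist.
```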

Some applications:

Time Reversible Markov Chain

  • Consider an ergodic Markov chain having transition probabilities P_ij and stationary probabilities π_i
    • A reversible Markov chain is like having a “time machine” for this little world. It considers the process to be “fair” in both directions.
  • We trace the sequence of states going backward in time, that is, starting at time n, consider the sequence of states X_n, X_{n-1}, X_{n-2}, ...
  • This sequence of states is itself a Markov chain with transition probabilities defined by Q_ij = π_j P_ji / π_i
    • The opposite direction of P_ij: the probability that the chain came from state j, given it is now in state i
  • The reverse process is also a Markov chain
  • The Markov chain is time reversible if Q_ij = P_ij for all i, j
    • Equivalently, π_i P_ij = π_j P_ji for all i, j
    • for all states i and j, the rate at which the process goes from i to j is equal to the rate at which it goes from j to i
  • We can model an undirected connected graph with a time reversible Markov chain:
    • If at any time the particle resides at node i, then it will next move to node j with probability P_ij = a_ij / Σ_k a_ik
      • where a_ij is 1 if there is an arc between i and j, 0 otherwise
      • meaning the probability of moving from i to j is 1 / (number of arcs connected to i)
    • Combining π_i P_ij = π_j P_ji with Σ_i π_i = 1 and the degrees d_i = Σ_k a_ik
      • we have π_i = d_i / Σ_j d_j = d_i / (2 × number of arcs)
  • theorem: A stationary Markov chain for which P_ij = 0 whenever P_ji = 0 is time reversible if and only if, starting in state i, any path back to i has the same probability as the reversed path.
    • That is, P_{i,i_1} P_{i_1,i_2} ⋯ P_{i_k,i} = P_{i,i_k} P_{i_k,i_{k-1}} ⋯ P_{i_1,i} for all states i_1, ..., i_k
  • Consider an irreducible Markov chain with transition probabilities P_ij.
    • If we can find positive numbers π_i, summing to one, and a transition probability matrix Q = [Q_ij] such that π_i P_ij = π_j Q_ji
    • then the Q_ij are the transition probabilities of the reversed chain and the π_i are the stationary probabilities both for the original and the reversed chain
      • i.e. reversing the reversed chain gives back the original chain
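The random-walk-on-a-graph model can be checked numerically. A minimal sketch with a made-up 4-node graph, verifying both that π_i = d_i / (2|E|) is stationary and that detailed balance (time reversibility) holds:

```python
# Random walk on an undirected connected graph: from node i, move to a
# uniformly chosen neighbour. Claim: pi_i = d_i / (2 * |E|) is stationary
# and the chain is time reversible: pi_i * P_ij = pi_j * P_ji for all i, j.

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]  # made-up edge list
n = 4
deg = [0] * n
adj = [[0] * n for _ in range(n)]
for u, v in edges:
    adj[u][v] = adj[v][u] = 1  # a_ij = 1 iff there is an arc
    deg[u] += 1
    deg[v] += 1

# P_ij = a_ij / d_i  and  pi_i = d_i / (2|E|)
P = [[adj[i][j] / deg[i] for j in range(n)] for i in range(n)]
pi = [deg[i] / (2 * len(edges)) for i in range(n)]

# pi is stationary: (pi P)_j = pi_j for every j.
for j in range(n):
    assert abs(sum(pi[i] * P[i][j] for i in range(n)) - pi[j]) < 1e-12

# Detailed balance: pi_i P_ij = a_ij / (2|E|), which is symmetric in i, j.
for i in range(n):
    for j in range(n):
        assert abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < 1e-12
```

The check works because π_i P_ij reduces to a_ij / (2|E|), and the adjacency matrix of an undirected graph is symmetric.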