Markov Chains

Chapter 2 • Section 2-9

"Predicting the future, one probability at a time! 🔮"

🎲 The Setup

A Markov Chain models a system that jumps between states over time.

State Vector ($S_k$)

A vector of probabilities summing to 1; entry $i$ is the probability the system is in state $i$ at step $k$.

Transition Matrix ($P$)

Entry $P_{ij}$ is the probability of moving from state $j$ to state $i$. Each column sums to 1 (column-stochastic).

$$ S_{k+1} = P S_k $$

☀️ Weather Simulator

Let's model the weather!
States: Sunny (0) and Rainy (1).
If Sunny, 80% chance the next day is Sunny (20% Rainy).
If Rainy, 60% chance the next day is Rainy (40% Sunny).

$$ P = \begin{bmatrix} 0.8 & 0.4 \\ 0.2 & 0.6 \end{bmatrix} $$
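A minimal simulation sketch in plain Python (the `step` helper is my own name, not from the text) that iterates $S_{k+1} = P S_k$ starting from a certainly-Sunny day:

```python
# Transition matrix P (columns sum to 1): state 0 = Sunny, state 1 = Rainy.
P = [[0.8, 0.4],
     [0.2, 0.6]]

def step(P, S):
    """One step of the chain: the matrix-vector product S_{k+1} = P S_k."""
    return [sum(P[i][j] * S[j] for j in range(len(S))) for i in range(len(P))]

S = [1.0, 0.0]  # Day 0: certainly Sunny
for day in range(10):
    S = step(P, S)
print(S)  # after 10 days, very close to the steady state [2/3, 1/3]
```

Try a different starting vector (e.g. certainly Rainy): the same limit appears.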

[Interactive demo: start on a Sunny day (Day 0) and step forward one day at a time.]
Notice: Over time, it settles into a pattern!

The Steady State

Does the system settle down? Often, yes!

Finding the Equilibrium

We look for a vector $S$ that doesn't change:

$$ PS = S \implies (I - P)S = \vec{0} $$

This is just finding the eigenvector for $\lambda = 1$! Don't forget to scale it so entries sum to 1.
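For the weather matrix above, $(I - P)S = \vec{0}$ reduces to $0.2\,s_1 = 0.4\,s_2$, so $s_1 = 2 s_2$; scaling the entries to sum to 1 gives $S = (2/3,\ 1/3)$. A quick sanity check in plain Python:

```python
# Steady state of the weather chain, found by solving (I - P)S = 0 by hand:
# row 1 gives 0.2*s1 - 0.4*s2 = 0, i.e. s1 = 2*s2; scale so s1 + s2 = 1.
P = [[0.8, 0.4],
     [0.2, 0.6]]
S = [2/3, 1/3]

# Check that S is an eigenvector for eigenvalue 1, i.e. PS = S:
PS = [sum(P[i][j] * S[j] for j in range(2)) for i in range(2)]
print(PS)  # ≈ [0.6667, 0.3333], the same as S
```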

Regular Matrices

A stochastic matrix $P$ is regular if some power $P^k$ contains only strictly positive entries.

Theorem: If $P$ is regular, there is a unique steady state vector $S$, and $S_k \to S$ for every initial state vector $S_0$.

Example: The Wolf Pack

A wolf pack hunts in 3 regions ($R_1, R_2, R_3$). Transition matrix:

$$ P = \begin{bmatrix} 0.5 & 0.25 & 0.25 \\ 0 & 0.5 & 0.25 \\ 0.5 & 0.25 & 0.5 \end{bmatrix} $$

Is it regular? Yes, calculate $P^2$ and you'll see all entries are positive.
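A quick regularity check in plain Python (squaring $P$ by hand works just as well):

```python
# Regularity check: compute P^2 and confirm every entry is strictly positive.
P = [[0.5, 0.25, 0.25],
     [0.0, 0.5,  0.25],
     [0.5, 0.25, 0.5]]

P2 = [[sum(P[i][k] * P[k][j] for k in range(3)) for j in range(3)]
      for i in range(3)]
all_positive = all(entry > 0 for row in P2 for entry in row)
print(all_positive)  # True: P is regular
```

The only zero in $P$ is at row 2, column 1, and $(P^2)_{21} = 0.25 \cdot 0.5 = 0.125 > 0$.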

Steady State $S$ (where $PS=S$):

$$ S = \begin{bmatrix} 3/9 \\ 2/9 \\ 4/9 \end{bmatrix} \approx \begin{bmatrix} 0.33 \\ 0.22 \\ 0.44 \end{bmatrix} $$
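Rather than solving the linear system, a sketch using power iteration (repeatedly applying $P$, which for a regular chain converges to the unique steady state from any start):

```python
# Power iteration on the wolf-pack chain: apply P over and over.
P = [[0.5, 0.25, 0.25],
     [0.0, 0.5,  0.25],
     [0.5, 0.25, 0.5]]

S = [1.0, 0.0, 0.0]  # arbitrary starting distribution: pack surely in R1
for _ in range(100):
    S = [sum(P[i][j] * S[j] for j in range(3)) for i in range(3)]
print([round(s, 4) for s in S])  # → [0.3333, 0.2222, 0.4444]
```

Any other starting distribution lands on the same limit, as the theorem promises.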

🧠 Prediction Time

If $P = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ (Deterministic Flip), what happens?
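To check your prediction, a quick simulation (spoiler in the comments!):

```python
# The flip matrix swaps the two states every step, so the state vector
# oscillates forever (unless it starts at the 50/50 vector).
P = [[0, 1],
     [1, 0]]

S = [1.0, 0.0]
history = []
for _ in range(4):
    S = [sum(P[i][j] * S[j] for j in range(2)) for i in range(2)]
    history.append(S)
print(history)  # alternates [0, 1], [1, 0], ... — no convergence
```

Note that $P$ is not regular: every power of $P$ still contains zeros, so the theorem above does not apply.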