Markov Chains
Chapter 2 • Section 2-9
"Predicting the future, one probability at a time! 🔮"
🎲 The Setup
A Markov Chain models a system that jumps between states over time, where the next state depends only on the current state, not on the history.
State Vector ($S_k$)
A vector of probabilities summing to 1. Represents the state at step $k$.
Transition Matrix ($P$)
Entry $p_{ij}$ is the probability of moving from state $j$ to state $i$. Each column sums to 1 (column-stochastic).
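A minimal sketch of these two definitions, assuming NumPy and the column-stochastic convention above; the matrix values are made up for illustration:

```python
import numpy as np

# Hypothetical 2-state chain to illustrate the definitions.
# P[i, j] = probability of jumping from state j to state i,
# so each COLUMN of P sums to 1 (column-stochastic convention).
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])
assert np.allclose(P.sum(axis=0), 1.0)

S0 = np.array([0.3, 0.7])   # a state vector: entries sum to 1
S1 = P @ S0                 # next state: S_{k+1} = P * S_k
print(S1, S1.sum())         # the result is again a probability vector
```

Note that multiplying a column-stochastic matrix by a probability vector always yields another probability vector, so the entries of $S_k$ sum to 1 at every step.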
☀️ Weather Simulator
Let's model the weather!
States: Sunny (0) and Rainy (1).
If Sunny, 80% chance next day is Sunny.
If Rainy, 60% chance next day is Rainy.
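The weather model above can be simulated directly. This is a sketch using only the standard library; the state encoding (0 = Sunny, 1 = Rainy) and the helper name `step` are my own choices:

```python
import random

# State 0 = Sunny, state 1 = Rainy.
# From Sunny, stay Sunny with prob 0.8; from Rainy, stay Rainy with prob 0.6.
STAY = {0: 0.8, 1: 0.6}

def step(state: int, rng: random.Random) -> int:
    """Advance the chain by one day."""
    return state if rng.random() < STAY[state] else 1 - state

rng = random.Random(42)   # fixed seed so the run is reproducible
state, days = 0, 10_000   # start Sunny, simulate 10,000 days
sunny = 0
for _ in range(days):
    state = step(state, rng)
    sunny += (state == 0)

print(f"Fraction of sunny days: {sunny / days:.3f}")
```

Solving $PS = S$ by hand for this chain gives a steady state of 2/3 Sunny, 1/3 Rainy, and a long simulated run should hover near that fraction.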
[Interactive simulator: shows the current day (starting Sunny) and a running day count.]
Notice: Over time, it settles into a pattern!
The Steady State
Does the system settle down? Often, yes!
Finding the Equilibrium
We look for a vector $S$ that doesn't change from one step to the next: $PS = S$.
This is just finding the eigenvector for $\lambda = 1$! Don't forget to scale it so entries sum to 1.
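The eigenvector recipe can be sketched with NumPy, using the Sunny/Rainy matrix from earlier as the example:

```python
import numpy as np

# Weather chain: columns are (from Sunny, from Rainy).
P = np.array([[0.8, 0.4],
              [0.2, 0.6]])

# The steady state is the eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P)
i = np.argmin(np.abs(vals - 1.0))   # locate lambda = 1
v = np.real(vecs[:, i])
S = v / v.sum()                     # rescale so the entries sum to 1
print(S)                            # steady state: 2/3 Sunny, 1/3 Rainy
```

The rescaling step matters because `eig` returns unit-length eigenvectors (possibly with flipped sign), not probability vectors.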
Regular Matrices
A stochastic matrix $P$ is regular if some power $P^k$ contains only strictly positive entries.
Theorem: If $P$ is regular, a unique steady state vector exists.
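Regularity can be checked numerically by raising $P$ to successive powers. A sketch, where the cutoff `max_power = 50` is an arbitrary heuristic of mine and the `absorbing` example matrix is made up:

```python
import numpy as np

def is_regular(P: np.ndarray, max_power: int = 50) -> bool:
    """Return True if some power P^k (k <= max_power) is strictly positive."""
    Pk = P.copy()
    for _ in range(max_power):
        if np.all(Pk > 0):
            return True
        Pk = Pk @ P
    return False

weather = np.array([[0.8, 0.4],
                    [0.2, 0.6]])    # all entries positive: regular at k = 1
absorbing = np.array([[1.0, 0.5],
                      [0.0, 0.5]])  # the (2,1) entry stays 0 in every power
print(is_regular(weather), is_regular(absorbing))
```

A `False` from this sketch is only conclusive up to `max_power`; the theorem itself puts no such cap on $k$.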
Example: The Wolf Pack
A wolf pack hunts in 3 regions ($R_1, R_2, R_3$). Transition matrix:
Is it regular? Yes, calculate $P^2$ and you'll see all entries are positive.
Steady State $S$ (where $PS=S$):
🧠 Prediction Time: Win $20!
If $P = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ (Deterministic Flip), what happens?
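One way to check your prediction is to iterate the flip a few steps, assuming we start with certainty in the first state:

```python
import numpy as np

# The deterministic flip: the state vector swaps its components every step.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

S = np.array([1.0, 0.0])   # start in state 1 with certainty
for k in range(4):
    S = P @ S
    print(k + 1, S)        # alternates [0, 1], [1, 0], [0, 1], ...
```

$P$ is stochastic but not regular: its powers alternate between $P$ and $I$, so no power is strictly positive, and the chain oscillates forever instead of converging (unless you start at the fixed point $[0.5, 0.5]^T$).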