1 What is a Random Variable?
A Random Variable is actually a misnomer—it's not a variable, and it's not random! It is a function that maps outcomes from a sample space \(\Omega\) to real numbers \(\mathbb{R}\).
Think of it as a "measurement" of an outcome. For example, if you flip a coin 3 times, the outcome might be \( (H, H, T) \). A random variable \( X \) could be "the number of Heads". In this case, \( X((H, H, T)) = 2 \).
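The "measurement" view can be made concrete in code. Below is a minimal sketch (the names `X` and `omega` are illustrative, not from any library) where the random variable is literally a Python function from outcomes to numbers:

```python
from itertools import product

def X(outcome):
    """Random variable: maps an outcome (a tuple of flips) to a real number,
    here the number of Heads."""
    return outcome.count("H")

# Sample space Omega: all 2^3 = 8 outcomes of three coin flips.
omega = list(product("HT", repeat=3))

print(X(("H", "H", "T")))  # 2
print(len(omega))          # 8
```

Note that nothing here is random: `X` is an ordinary, deterministic function. The randomness lives in which outcome from `omega` actually occurs.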
Discrete Random Variables
Takes on a countable number of distinct values (e.g., 0, 1, 2, ...).
- Number of heads in 10 flips
- Number of students in a class
- Roll of a die
Continuous Random Variables
Takes on an uncountably infinite number of values in an interval.
- Height of a person
- Time until a bus arrives
- Temperature of a room
2 Probability Mass Function (PMF)
For a discrete random variable \( X \), the Probability Mass Function (PMF), denoted as \( p_X(x) \), gives the probability that \( X \) takes on a specific value \( x \).
Definition
$$ p_X(x) = P(X = x) $$

Properties
- \( 0 \le p_X(x) \le 1 \) for all \( x \)
- \( \sum_x p_X(x) = 1 \)
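Both properties are easy to check numerically. Here is a small sketch using a fair six-sided die as the assumed example distribution:

```python
# PMF of a fair die: each value 1..6 has probability 1/6.
pmf = {x: 1 / 6 for x in range(1, 7)}

# Property 1: every probability lies in [0, 1].
assert all(0 <= p <= 1 for p in pmf.values())

# Property 2: the probabilities sum to 1 (up to floating-point tolerance).
assert abs(sum(pmf.values()) - 1) < 1e-12

print(pmf[3])  # P(X = 3)
```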
3 Cumulative Distribution Function (CDF)
The Cumulative Distribution Function (CDF), denoted as \( F_X(x) \), gives the probability that \( X \) is less than or equal to \( x \):
$$ F_X(x) = P(X \le x) $$
The CDF is always a non-decreasing step function for discrete random variables. It starts at 0 and goes up to 1.
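For a discrete random variable, the CDF at \( x \) is just the running total of the PMF over all values up to \( x \). A minimal sketch (the four-value distribution is an assumed example):

```python
# Assumed example: a uniform distribution over the values 1..4.
values = [1, 2, 3, 4]
probs = [0.25, 0.25, 0.25, 0.25]

def F(x):
    """CDF: P(X <= x). Sums the PMF over every value at or below x."""
    return sum(p for v, p in zip(values, probs) if v <= x)

print(F(2))    # 0.5
print(F(0))    # 0.0  (below all values)
print(F(10))   # 1.0  (above all values)
```

Between consecutive values the sum does not change, which is exactly why the CDF is a step function: it jumps by \( p_X(v) \) at each value \( v \) and is flat in between.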
4 Expectation (Expected Value)
The Expectation (or mean) of a random variable \( X \), denoted as \( E[X] \), is the weighted average of all possible values that \( X \) can take, with each value weighted by its probability:
$$ E[X] = \sum_{x} x \, p_X(x) $$
It represents the "center of mass" of the distribution.
Properties of Expectation
Linearity
Expectation is a linear operation:
\( E[aX + b] = a E[X] + b \)
Sum of RVs
The expectation of a sum is the sum of expectations:
\( E[X + Y] = E[X] + E[Y] \)
Function of an RV (LOTUS)
Law of the Unconscious Statistician:
\( E[g(X)] = \sum_{x} g(x) p_X(x) \)
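All three properties can be verified directly from the PMF. The sketch below uses a fair die as the assumed distribution and \( g(x) = x^2 \) as the assumed function for LOTUS:

```python
# Fair die PMF (assumed example).
pmf = {x: 1 / 6 for x in range(1, 7)}

# E[X] = sum over x of x * p_X(x); for a fair die this is 3.5.
E_X = sum(x * p for x, p in pmf.items())

# LOTUS with g(x) = x^2: E[g(X)] = sum over x of g(x) * p_X(x).
E_X_sq = sum(x**2 * p for x, p in pmf.items())  # 91/6, approx 15.17

# Linearity: E[2X + 1] equals 2*E[X] + 1.
E_lin = sum((2 * x + 1) * p for x, p in pmf.items())
assert abs(E_lin - (2 * E_X + 1)) < 1e-9

print(E_X)  # approx 3.5
```

Note that LOTUS computes \( E[g(X)] \) without ever finding the PMF of \( g(X) \) itself, which is why the statistician is "unconscious" of it.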
5 The St. Petersburg Paradox
Consider a game where a fair coin is tossed until it comes up Heads. If it comes up Heads on the \( n \)-th toss, you win \( 2^n \) dollars.
- Probability of Heads on 1st toss: \( 1/2 \), Prize: $2
- Probability of Heads on 2nd toss: \( 1/4 \), Prize: $4
- Probability of Heads on 3rd toss: \( 1/8 \), Prize: $8
How much would you pay to play this game?
Expected Value Calculation
$$ E[X] = \sum_{n=1}^{\infty} 2^n \cdot \frac{1}{2^n} = \sum_{n=1}^{\infty} 1 = 1 + 1 + 1 + \dots = \infty $$

The expected payout is infinite! Yet, most people wouldn't pay more than a few dollars to play. This discrepancy between the infinite expected value and the finite practical value is the paradox.
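The divergence is visible term by term: every toss contributes exactly \$1 to the expectation. A short sketch (the helper names are illustrative) simulates one round of the game and computes the expectation of a version truncated at \( k \) tosses:

```python
import random

def play(rng):
    """One round: flip until Heads; win 2**n dollars if Heads is on toss n."""
    n = 1
    while rng.random() < 0.5:  # Tails with probability 1/2, keep flipping
        n += 1
    return 2 ** n

def truncated_expectation(k):
    """Expected payout if the game is capped at k tosses.
    Each term (2**n) * (1/2)**n equals exactly 1, so the sum is k."""
    return sum((2 ** n) * (0.5 ** n) for n in range(1, k + 1))

print(truncated_expectation(10))   # 10.0 -- grows without bound as k grows

rng = random.Random(0)
payouts = [play(rng) for _ in range(10_000)]
print(sum(payouts) / len(payouts))  # a modest finite average in practice
```

The simulation illustrates the paradox: the sample average over many rounds stays small, because the astronomically large payouts that drive the expectation to infinity are astronomically rare.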