1 Joint Distributions in \( n \) Dimensions
For \( n \) continuous random variables \( X_1, X_2, \dots, X_n \), the joint PDF \( f_{X_1,\dots,X_n}(x_1,\dots,x_n) \) satisfies:
$$ f_{X_1,\dots,X_n}(x_1,\dots,x_n) \geq 0, \qquad \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_{X_1,\dots,X_n}(x_1,\dots,x_n) \, dx_1 \cdots dx_n = 1 $$
Independence: The variables are independent if the joint PDF factors into the product of marginal PDFs:
$$ f_{X_1,\dots,X_n}(x_1,\dots,x_n) = \prod_{i=1}^{n} f_{X_i}(x_i) $$
I.I.D. (Independent and Identically Distributed): The variables are i.i.d. if they are independent and share the same marginal distribution \( F_X(x) \).
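For example, if \( X_1 \) and \( X_2 \) are i.i.d. Exponential(\(\lambda\)), independence gives the factored joint PDF
$$ f_{X_1,X_2}(x_1, x_2) = \lambda e^{-\lambda x_1} \cdot \lambda e^{-\lambda x_2} = \lambda^2 e^{-\lambda(x_1 + x_2)}, \quad x_1, x_2 \geq 0. $$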
2 Sums of Random Variables
Let \( Y = \sum_{i=1}^n X_i \). Linearity of expectation always holds, whether or not the variables are dependent:
$$ E[Y] = \sum_{i=1}^n E[X_i] $$
Variance adds up only if the variables are uncorrelated (independence is sufficient):
$$ \text{Var}(Y) = \sum_{i=1}^n \text{Var}(X_i) $$
In general, \( \text{Var}(Y) = \sum_{i=1}^n \text{Var}(X_i) + 2 \sum_{i < j} \text{Cov}(X_i, X_j) \).
As an illustration of the Central Limit Theorem, consider summing \( n \) independent Uniform(0,1) random variables: as \( n \) increases, the distribution of the sum approaches a Normal distribution.
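A minimal simulation sketch of this effect, assuming NumPy and Matplotlib are available (the sample size and the values of \( n \) below are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
samples = 100_000

# Sum n independent Uniform(0,1) draws; larger n looks increasingly Normal.
for n in (1, 2, 10, 30):
    sums = rng.uniform(0, 1, size=(samples, n)).sum(axis=1)
    plt.hist(sums, bins=100, density=True, histtype='step', label=f'n = {n}')

plt.xlabel('sum of n Uniform(0,1) variables')
plt.legend()
plt.show()
```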
3 Moment Generating Functions (MGF)
The MGF of a random variable \( X \) is defined as \( M_X(s) = E[e^{sX}] \). Provided it is finite in an open interval around \( s = 0 \), it uniquely determines the distribution.
- Moments: \( E[X^k] = \frac{d^k}{ds^k} M_X(s) \big|_{s=0} \)
- Sum of Independent RVs: \( M_{X+Y}(s) = M_X(s) \cdot M_Y(s) \)
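As a quick check of the moment property, here is a small SymPy sketch that differentiates the Exponential(\(\lambda\)) MGF (listed under Common MGFs below) at \( s = 0 \):

```python
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)
M = lam / (lam - s)  # MGF of Exponential(lam), valid for s < lam

# k-th moment = k-th derivative of the MGF, evaluated at s = 0
E1 = sp.diff(M, s, 1).subs(s, 0)  # E[X]   -> 1/lam
E2 = sp.diff(M, s, 2).subs(s, 0)  # E[X^2] -> 2/lam**2
print(sp.simplify(E1), sp.simplify(E2))
```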
Common MGFs
- Exponential(\(\lambda\)): \( M_X(s) = \frac{\lambda}{\lambda - s}, \quad s < \lambda \)
- Poisson(\(\lambda\)): \( M_X(s) = e^{\lambda(e^s - 1)} \)
- Binomial(\(n, p\)): \( M_X(s) = (pe^s + 1-p)^n \)
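For example, if \( X \sim \text{Poisson}(\lambda_1) \) and \( Y \sim \text{Poisson}(\lambda_2) \) are independent, multiplying their MGFs gives
$$ M_{X+Y}(s) = e^{\lambda_1(e^s - 1)} \cdot e^{\lambda_2(e^s - 1)} = e^{(\lambda_1 + \lambda_2)(e^s - 1)}, $$
which is the Poisson(\(\lambda_1 + \lambda_2\)) MGF; by uniqueness, \( X + Y \sim \text{Poisson}(\lambda_1 + \lambda_2) \).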
4 Random Vectors & Covariance Matrix
A random vector \( \mathbf{X} = [X_1, \dots, X_n]^T \) has a mean vector \( \boldsymbol{\mu} = E[\mathbf{X}] \) and a covariance matrix \( C_X \):
$$ C_X = E\left[ (\mathbf{X} - \boldsymbol{\mu})(\mathbf{X} - \boldsymbol{\mu})^T \right], \qquad (C_X)_{ij} = \text{Cov}(X_i, X_j) $$
The covariance matrix is always symmetric and positive semi-definite.
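A minimal NumPy sketch that estimates a covariance matrix from data and checks both properties (the data here is arbitrary standard-normal noise):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 3))  # 1000 samples of a 3-dimensional random vector

C = np.cov(X, rowvar=False)                      # sample covariance matrix
print(np.allclose(C, C.T))                       # symmetric: True
print(np.all(np.linalg.eigvalsh(C) >= -1e-12))   # all eigenvalues >= 0: PSD
```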
5 Multivariate Normal Distribution
A random vector \( \mathbf{X} \) is Multivariate Normal if every linear combination of its components is Normal. With mean vector \( \boldsymbol{\mu} \) and invertible covariance matrix \( C_X \), its PDF is:
$$ f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} |C_X|^{1/2}} \exp\left( -\frac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^T C_X^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right) $$
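One standard way to sample from this distribution uses the Cholesky factorization \( C_X = LL^T \): if \( \mathbf{z} \) has i.i.d. N(0,1) components, then \( \boldsymbol{\mu} + L\mathbf{z} \) is Multivariate Normal with mean \( \boldsymbol{\mu} \) and covariance \( C_X \). A minimal sketch (the mean and covariance below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
C = np.array([[2.0, 0.8],
              [0.8, 1.0]])            # symmetric positive definite

L = np.linalg.cholesky(C)             # C = L @ L.T
z = rng.standard_normal((10_000, 2))  # rows of i.i.d. N(0, 1) components
x = mu + z @ L.T                      # rows are MVN(mu, C) samples

print(x.mean(axis=0))                 # ~ mu
print(np.cov(x, rowvar=False))        # ~ C
```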
6 Solved Example: The Circle Problem
Problem: \( N \) people sit around a round table (\( N > 5 \)). Each person tosses a fair coin. Anyone whose outcome is different from both their neighbors receives a gift. Let \( X \) be the number of people who receive gifts. Find \( E[X] \).
Solution using Indicator Variables
Let \( I_i \) be an indicator variable for the \( i \)-th person receiving a gift. $$ X = I_1 + I_2 + \dots + I_N $$
For person \( i \) to receive a gift, their coin must be different from both neighbor \( i-1 \) and neighbor \( i+1 \).
Writing the outcomes as (neighbor \( i-1 \), person \( i \), neighbor \( i+1 \)), the winning patterns are (H, T, H) and (T, H, T).
Since the three tosses are independent, \( P(I_i = 1) = (1/2)^3 + (1/2)^3 = 1/8 + 1/8 = 1/4 \).
By Linearity of Expectation: $$ E[X] = \sum_{i=1}^N E[I_i] = \sum_{i=1}^N \frac{1}{4} = \frac{N}{4} $$
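A quick Monte Carlo sketch of this answer (the table size \( N \) and the trial count below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
N, trials = 10, 100_000

coins = rng.integers(0, 2, size=(trials, N))  # one row of fair coin flips per trial
left  = np.roll(coins,  1, axis=1)            # neighbor i-1 (wraps around the table)
right = np.roll(coins, -1, axis=1)            # neighbor i+1 (wraps around the table)
gifts = (coins != left) & (coins != right)    # differs from both neighbors

print(gifts.sum(axis=1).mean())  # ~ N / 4 = 2.5
```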
Check Your Understanding
1. Variance of Sum
If \( X \) and \( Y \) are independent, what is \( \text{Var}(X - Y) \)?
2. MGF of Sum
If \( X \) and \( Y \) are independent, what is \( M_{X+Y}(s) \)?