Multiple Random Variables

Scaling up to \( n \) dimensions. Random Vectors, Moment Generating Functions, and the Multivariate Normal.

1 Joint Distributions in \( n \) Dimensions

For \( n \) continuous random variables \( X_1, X_2, \dots, X_n \), the joint PDF satisfies:

$$ P((X_1, \dots, X_n) \in A) = \int \dots \int_A f_{X_1 \dots X_n}(x_1, \dots, x_n) \, dx_1 \dots dx_n $$

Independence: The variables are independent if the joint PDF factors into the product of marginal PDFs:

$$ f_{X_1 \dots X_n}(x_1, \dots, x_n) = f_{X_1}(x_1) \cdot f_{X_2}(x_2) \cdots f_{X_n}(x_n) $$

I.I.D. (Independent and Identically Distributed): The variables are i.i.d. if they are independent and all share the same marginal distribution \( F_X(x) \).
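
Independence is easy to probe numerically: for independent variables, joint probabilities should factor into products of marginals. A minimal Monte Carlo sketch (assuming numpy is available; the thresholds a and b are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# Two i.i.d. Uniform(0, 1) variables.
x1 = rng.uniform(0, 1, n_samples)
x2 = rng.uniform(0, 1, n_samples)

a, b = 0.3, 0.7  # arbitrary thresholds for the event {X1 < a, X2 < b}

joint = np.mean((x1 < a) & (x2 < b))         # P(X1 < a, X2 < b)
product = np.mean(x1 < a) * np.mean(x2 < b)  # P(X1 < a) * P(X2 < b)

print(joint, product)  # both should be close to a * b = 0.21
```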

2 Sums of Random Variables

Let \( Y = \sum_{i=1}^n X_i \). Linearity of expectation always holds:

$$ E[Y] = \sum_{i=1}^n E[X_i] $$

In general, the variance of a sum includes cross terms:

$$ \text{Var}(Y) = \sum_{i=1}^n \text{Var}(X_i) + 2 \sum_{i<j} \text{Cov}(X_i, X_j) $$

If the \( X_i \) are pairwise uncorrelated (in particular, if they are independent), the covariance terms vanish and the variance simply adds up: \( \text{Var}(Y) = \sum_{i=1}^n \text{Var}(X_i) \).
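
To see the covariance term at work, here is a small sketch (assuming numpy) that estimates \( \text{Var}(X_1 + X_2) \) for correlated Gaussian samples and compares it with the formula above; the particular covariance values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
cov = np.array([[1.0, 0.5],
                [0.5, 2.0]])  # Var(X1)=1, Var(X2)=2, Cov(X1,X2)=0.5
samples = rng.multivariate_normal(mean=[0, 0], cov=cov, size=500_000)

y = samples[:, 0] + samples[:, 1]
print(np.var(y))            # empirical Var(X1 + X2)
print(1.0 + 2.0 + 2 * 0.5)  # formula: Var(X1) + Var(X2) + 2 Cov = 4
```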

A classic illustration: sum \( n \) independent Uniform(0,1) random variables. As \( n \) increases, the distribution of the sum approaches a Normal distribution (the Central Limit Theorem).
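
A minimal simulation of this effect (assuming numpy), checking the sum's sample mean and variance against the CLT prediction \( \text{Normal}(n/2, n/12) \):

```python
import numpy as np

rng = np.random.default_rng(2)

for n in (1, 2, 10, 50):
    # Sum n independent Uniform(0,1) draws, repeated many times.
    sums = rng.uniform(0, 1, size=(200_000, n)).sum(axis=1)
    # The CLT predicts mean n/2 and variance n/12.
    print(n, sums.mean(), sums.var(), n / 2, n / 12)
```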

3 Moment Generating Functions (MGF)

The MGF of a random variable \( X \) is defined as \( M_X(s) = E[e^{sX}] \). When it is finite in a neighborhood of \( s = 0 \), it uniquely determines the distribution.

  • Moments: \( E[X^k] = \frac{d^k}{ds^k} M_X(s) \big|_{s=0} \)
  • Sum of Independent RVs: \( M_{X+Y}(s) = M_X(s) \cdot M_Y(s) \)
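
As a check of the moment formula, one can differentiate an MGF symbolically. A sketch assuming sympy is available, using the Exponential(\(\lambda\)) MGF listed below:

```python
import sympy as sp

s, lam = sp.symbols('s lambda', positive=True)
M = lam / (lam - s)  # MGF of Exponential(lambda), valid for s < lambda

# k-th moment = k-th derivative of the MGF evaluated at s = 0
for k in (1, 2):
    moment = sp.diff(M, s, k).subs(s, 0)
    print(k, sp.simplify(moment))  # 1/lambda, then 2/lambda**2
```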

Common MGFs

  • Exponential(\(\lambda\)): \( M_X(s) = \frac{\lambda}{\lambda - s}, \quad s < \lambda \)
  • Poisson(\(\lambda\)): \( M_X(s) = e^{\lambda(e^s - 1)} \)
  • Binomial(\(n, p\)): \( M_X(s) = (pe^s + 1-p)^n \)
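
Since \( M_X(s) = E[e^{sX}] \) is just an expectation, each closed form can be spot-checked by Monte Carlo. A sketch for the Poisson case (assuming numpy; \( \lambda = 2 \) and \( s = 0.3 \) are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, s = 2.0, 0.3

x = rng.poisson(lam, size=1_000_000)
print(np.mean(np.exp(s * x)))         # Monte Carlo estimate of E[e^{sX}]
print(np.exp(lam * (np.exp(s) - 1)))  # closed-form Poisson MGF
```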

4 Random Vectors & Covariance Matrix

A random vector \( \mathbf{X} = [X_1, \dots, X_n]^T \) has a mean vector \( E[\mathbf{X}] \) and a covariance matrix \( C_X \):

$$ C_X = E[(\mathbf{X} - E[\mathbf{X}])(\mathbf{X} - E[\mathbf{X}])^T] $$

The covariance matrix is always symmetric and positive semi-definite.
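
A short sketch (assuming numpy) that estimates \( C_X \) directly from the outer-product definition and confirms both properties on a sample; the example covariance values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
true_cov = np.array([[2.0, 0.8],
                     [0.8, 1.0]])
X = rng.multivariate_normal([0, 0], true_cov, size=200_000)  # rows = samples

# C_X = E[(X - E[X])(X - E[X])^T], estimated by averaging outer products
centered = X - X.mean(axis=0)
C = centered.T @ centered / len(X)

print(C)                                        # close to true_cov
print(np.allclose(C, C.T))                      # symmetric
print(np.all(np.linalg.eigvalsh(C) >= -1e-10))  # PSD up to numerical error
```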

5 Multivariate Normal Distribution

A random vector \( \mathbf{X} \) is Multivariate Normal if every linear combination of its components is (univariate) Normal. When the covariance matrix \( \Sigma = C_X \) is invertible, its PDF is:

$$ f_\mathbf{X}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left( -\frac{1}{2} (\mathbf{x} - \mathbf{\mu})^T \Sigma^{-1} (\mathbf{x} - \mathbf{\mu}) \right) $$

where \( \mathbf{\mu} = E[\mathbf{X}] \).
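
As a numerical sanity check, the formula can be evaluated directly and compared against scipy.stats.multivariate_normal (a standard SciPy API); the mean, covariance, and test point below are arbitrary:

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])
x = np.array([0.5, 0.0])  # arbitrary test point

n = len(mu)
d = x - mu
pdf_manual = np.exp(-0.5 * d @ np.linalg.inv(Sigma) @ d) / (
    (2 * np.pi) ** (n / 2) * np.linalg.det(Sigma) ** 0.5
)

print(pdf_manual)
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # should match
```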

6 Solved Example: The Circle Problem

Problem: \( N \) people sit around a round table (\( N > 5 \)). Each person tosses a fair coin. Anyone whose outcome is different from both their neighbors receives a gift. Let \( X \) be the number of people who receive gifts. Find \( E[X] \).

Solution using Indicator Variables

Let \( I_i \) be an indicator variable for the \( i \)-th person receiving a gift. $$ X = I_1 + I_2 + \dots + I_N $$

For person \( i \) to receive a gift, their coin must be different from neighbor \( i-1 \) AND neighbor \( i+1 \).
Possible winning patterns for the triple (neighbor \( i-1 \), person \( i \), neighbor \( i+1 \)): (H, T, H) or (T, H, T).
Each pattern involves three independent fair coins, so \( P(I_i = 1) = (1/2)^3 + (1/2)^3 = 1/8 + 1/8 = 1/4 \).

By Linearity of Expectation (which does not require the \( I_i \) to be independent, and indeed neighboring indicators are not): $$ E[X] = \sum_{i=1}^N E[I_i] = \sum_{i=1}^N \frac{1}{4} = \frac{N}{4} $$
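
The result is easy to verify by simulation. A minimal sketch assuming numpy; \( N = 10 \) is an arbitrary table size:

```python
import numpy as np

rng = np.random.default_rng(5)
N, trials = 10, 100_000

coins = rng.integers(0, 2, size=(trials, N))  # 0/1 coin flips around the table
left = np.roll(coins, 1, axis=1)              # neighbor i-1 (wraps around)
right = np.roll(coins, -1, axis=1)            # neighbor i+1 (wraps around)
gifts = (coins != left) & (coins != right)    # differs from both neighbors

print(gifts.sum(axis=1).mean())  # empirical E[X]
print(N / 4)                     # theoretical value: 2.5
```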

Check Your Understanding

1. Variance of Sum

If \( X \) and \( Y \) are independent, what is \( \text{Var}(X - Y) \)?

2. MGF of Sum

If \( X \) and \( Y \) are independent, what is \( M_{X+Y}(s) \)?
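
Both answers can be checked numerically. A minimal sketch (assuming numpy; standard normal \( X \) and \( Y \) are an arbitrary choice) that prints the quantities to compare against your formulas:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)

# 1. Variance of the difference of independent variables
print(np.var(x - y))  # compare against your formula

# 2. MGF of a sum of independent variables, evaluated at s = 0.5
s = 0.5
print(np.mean(np.exp(s * (x + y))))                     # E[e^{s(X+Y)}]
print(np.mean(np.exp(s * x)) * np.mean(np.exp(s * y)))  # product of MGFs
```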