Law of Large Numbers

Understanding how the sample mean converges to the true expected value as the sample size increases.

1 Sample Mean

Let \( X_1, X_2, \dots, X_n \) be \( n \) independent and identically distributed (i.i.d.) random variables, each with mean \( \mu \) and variance \( \sigma^2 \). The sample mean \( \overline{X}_n \) is defined as:

$$ \overline{X}_n = \frac{X_1 + X_2 + \dots + X_n}{n} $$

Properties of the sample mean:

  • Expectation: \( E[\overline{X}_n] = E[X_i] = \mu \)
  • Variance: \( \text{Var}(\overline{X}_n) = \frac{\text{Var}(X_i)}{n} = \frac{\sigma^2}{n} \) (derived below)
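
Both properties follow from linearity of expectation and, for the variance, from the independence of the \( X_i \):

$$ E[\overline{X}_n] = \frac{1}{n} \sum_{i=1}^{n} E[X_i] = \frac{n\mu}{n} = \mu, \qquad \text{Var}(\overline{X}_n) = \frac{1}{n^2} \sum_{i=1}^{n} \text{Var}(X_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n} $$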

Note that as \( n \) increases, the variance of the sample mean decreases, meaning the sample mean clusters more tightly around the true mean \( \mu \).

2 Weak Law of Large Numbers (WLLN)

The Weak Law of Large Numbers states that for any \( \epsilon > 0 \), the probability that the sample mean deviates from the true mean by more than \( \epsilon \) goes to zero as \( n \to \infty \).

$$ \lim_{n \to \infty} P(|\overline{X}_n - \mu| \ge \epsilon) = 0 $$

This is known as convergence in probability. When \( \sigma^2 \) is finite, the result follows from Chebyshev's inequality, which gives \( P(|\overline{X}_n - \mu| \ge \epsilon) \le \frac{\sigma^2}{n \epsilon^2} \to 0 \) as \( n \to \infty \).

Interpretation: If you take a large enough sample, the sample average will be very close to the true expected value with high probability.
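
To make the statement concrete, the sketch below (an illustrative addition, not part of the original page) uses NumPy to estimate \( P(|\overline{X}_n - \mu| \ge \epsilon) \) by Monte Carlo for fair die rolls, where \( \mu = 3.5 \); the estimated probability shrinks toward zero as \( n \) grows.

```python
import numpy as np

# Monte Carlo estimate of P(|sample mean - mu| >= eps) for fair die rolls.
# True mean of a fair six-sided die: mu = 3.5.
rng = np.random.default_rng(0)
mu, eps, trials = 3.5, 0.1, 1_000

for n in [10, 100, 1_000, 10_000]:
    # Draw `trials` independent samples of size n, one sample per row.
    rolls = rng.integers(1, 7, size=(trials, n))      # integers 1..6
    sample_means = rolls.mean(axis=1)
    # Fraction of trials in which the sample mean misses mu by at least eps.
    p_deviate = np.mean(np.abs(sample_means - mu) >= eps)
    print(f"n = {n:>6}: estimated P(|Xbar_n - mu| >= {eps}) ~ {p_deviate:.3f}")
```

The decreasing estimates illustrate the limit above; they are not a proof, which is what the Chebyshev bound provides.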

Interactive Simulation: Convergence of Sample Mean

Simulate rolling a fair die \( n \) times. The true mean is 3.5. Observe how the sample mean converges to 3.5 as \( n \) increases.
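
The interactive widget is not reproduced here; as a stand-in, the sketch below (an assumed offline setup using NumPy and Matplotlib) rolls a fair die \( n \) times and plots the running sample mean against the true mean of 3.5.

```python
import numpy as np
import matplotlib.pyplot as plt

# Roll a fair six-sided die n times and track the running sample mean.
rng = np.random.default_rng(42)
n = 5_000
rolls = rng.integers(1, 7, size=n)                      # integers 1..6
running_mean = np.cumsum(rolls) / np.arange(1, n + 1)   # mean after each roll

plt.plot(running_mean, label="running sample mean")
plt.axhline(3.5, color="red", linestyle="--", label="true mean (3.5)")
plt.xlabel("number of rolls n")
plt.ylabel("sample mean")
plt.title("Convergence of the sample mean to 3.5")
plt.legend()
plt.show()
```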