Joint Continuous Random Variables

Extending probability density to multiple dimensions. From joint PDFs to Bivariate Normal Distributions.

1 Joint Probability Density Function

Two random variables \( X \) and \( Y \) are jointly continuous if there exists a non-negative function \( f_{XY}(x,y) \) such that for any (measurable) set \( A \subseteq \mathbb{R}^2 \):

$$ P((X,Y) \in A) = \iint_A f_{XY}(x,y) \, dx \, dy $$

The function \( f_{XY}(x,y) \) is called the Joint PDF. It must satisfy the normalization condition:

$$ \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x,y) \, dx \, dy = 1 $$
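As a quick numerical sanity check, a candidate joint PDF can be integrated over its support with SciPy to confirm normalization. The density \( f_{XY}(x,y) = 4xy \) on the unit square (the same one used in the exercises below) serves as the illustration:

```python
from scipy import integrate

# Illustrative joint PDF: f(x, y) = 4xy on the unit square (0,1) x (0,1),
# zero elsewhere. Note that dblquad passes the inner variable (y) first.
total, err = integrate.dblquad(lambda y, x: 4 * x * y, 0, 1, 0, 1)

print(round(total, 6))  # -> 1.0
```

The integral evaluates to 1, so \( 4xy \) is a valid joint PDF on this support.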

2 Marginal PDFs

We can recover the individual PDFs of \( X \) and \( Y \) (called marginal PDFs) by integrating out the other variable.

Marginal PDF of X

$$ f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y) \, dy $$

Marginal PDF of Y

$$ f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y) \, dx $$
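A marginal can be computed numerically by integrating out the other variable, as a sketch using the same illustrative density \( f_{XY}(x,y) = 4xy \) on the unit square (for which the closed form is \( f_X(x) = 2x \)):

```python
from scipy import integrate

def f_xy(x, y):
    # Illustrative joint PDF on the unit square, zero elsewhere
    return 4 * x * y

def f_x(x):
    # Marginal of X: integrate y out over its support (0, 1)
    val, _ = integrate.quad(lambda y: f_xy(x, y), 0, 1)
    return val

# Matches the closed form f_X(x) = 2x
print(f_x(0.5))  # -> 1.0
```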

3 Conditional PDF & Independence

The conditional PDF of \( X \) given \( Y=y \) is defined as:

$$ f_{X|Y}(x|y) = \frac{f_{XY}(x,y)}{f_Y(y)}, \quad \text{if } f_Y(y) > 0 $$

Two continuous random variables \( X \) and \( Y \) are independent if and only if their joint PDF factors into the product of their marginals:

$$ f_{XY}(x,y) = f_X(x) \cdot f_Y(y) $$
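The factorization criterion can be checked pointwise in code. A minimal sketch, again assuming the density \( 4xy \) on the unit square: compute both marginals numerically and compare the joint to their product at a few points.

```python
from scipy import integrate

def f_xy(x, y):
    return 4 * x * y  # joint PDF on the unit square

def f_x(x):
    return integrate.quad(lambda y: f_xy(x, y), 0, 1)[0]  # closed form: 2x

def f_y(y):
    return integrate.quad(lambda x: f_xy(x, y), 0, 1)[0]  # closed form: 2y

# The joint equals the product of the marginals at every tested point
for (x, y) in [(0.2, 0.7), (0.5, 0.5), (0.9, 0.1)]:
    assert abs(f_xy(x, y) - f_x(x) * f_y(y)) < 1e-9
```

A numerical spot check like this cannot prove independence (that requires the factorization to hold everywhere), but a single failing point is enough to disprove it.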

4 Bivariate Normal Distribution

The Bivariate Normal Distribution is determined by means \( \mu_X, \mu_Y \), variances \( \sigma_X^2, \sigma_Y^2 \), and the correlation coefficient \( \rho \).

$$ f_{XY}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left( -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-\mu_X)^2}{\sigma_X^2} + \frac{(y-\mu_Y)^2}{\sigma_Y^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} \right] \right) $$
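The formula above can be cross-checked against SciPy's covariance-matrix parameterization of the multivariate normal; the particular parameter values below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative parameters (assumed values)
mu_x, mu_y = 1.0, -2.0
sig_x, sig_y = 2.0, 0.5
rho = 0.6

def bvn_pdf(x, y):
    """Bivariate normal PDF written directly from the formula above."""
    zx = (x - mu_x) / sig_x
    zy = (y - mu_y) / sig_y
    norm_const = 1.0 / (2 * np.pi * sig_x * sig_y * np.sqrt(1 - rho**2))
    expo = -(zx**2 + zy**2 - 2 * rho * zx * zy) / (2 * (1 - rho**2))
    return norm_const * np.exp(expo)

# Same density via the mean vector / covariance matrix parameterization
cov = [[sig_x**2, rho * sig_x * sig_y],
       [rho * sig_x * sig_y, sig_y**2]]
rv = multivariate_normal(mean=[mu_x, mu_y], cov=cov)

print(np.isclose(bvn_pdf(1.5, -1.8), rv.pdf([1.5, -1.8])))  # -> True
```

The off-diagonal entry \( \rho\,\sigma_X\sigma_Y \) of the covariance matrix is exactly \( \text{Cov}(X,Y) \), connecting this form to Section 5.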

Interactive Bivariate Normal Explorer

[Interactive figure: a heatmap (top view) of the joint density alongside a sample scatter plot. Adjust the correlation coefficient \( \rho \) to see how it affects the shape of the joint distribution and the scatter of points.]
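The scatter panel of the explorer can be reproduced in a few lines of NumPy; this sketch draws correlated samples for an assumed \( \rho = 0.8 \) and confirms the empirical correlation matches:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8  # correlation to explore (assumed value)

# Standard bivariate normal with the chosen correlation
cov = [[1.0, rho],
       [rho, 1.0]]
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=50_000)

# The empirical correlation should be close to the rho we set
emp_rho = np.corrcoef(samples[:, 0], samples[:, 1])[0, 1]
print(round(emp_rho, 2))  # close to 0.8
```

Plotting `samples[:, 0]` against `samples[:, 1]` reproduces the tilted elliptical cloud the explorer shows: as \( \rho \to \pm 1 \) the cloud collapses toward a line.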

5 Covariance & Correlation

Covariance

Measure of linear association between two variables.

$$ \text{Cov}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)] $$ $$ = E[XY] - E[X]E[Y] $$

Correlation Coefficient

Normalized version of covariance, always between -1 and 1.

$$ \rho_{XY} = \frac{\text{Cov}(X,Y)}{\sigma_X \sigma_Y} $$
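Both identities are easy to verify by Monte Carlo. A sketch with an assumed linear relationship `y = 0.5 x + noise`, checking the shortcut formula for covariance and the normalization against NumPy's built-in estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 100_000)
y = 0.5 * x + rng.normal(0, 1, 100_000)  # linearly associated with x

# Cov(X, Y) = E[XY] - E[X]E[Y]  (shortcut formula)
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)

# rho = Cov(X, Y) / (sigma_X * sigma_Y), always in [-1, 1]
rho = cov_xy / (np.std(x) * np.std(y))

# Agrees with numpy's built-in correlation estimator
print(np.isclose(rho, np.corrcoef(x, y)[0, 1]))  # -> True
```

For this construction the theoretical value is \( \rho = 0.5/\sqrt{1.25} \approx 0.447 \), and the sample estimate lands close to it.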

6 Method of Transformations

If we have joint RVs \( (X,Y) \) and transform them to \( (Z,W) = g(X,Y) \), we can find the joint PDF of \( (Z,W) \) using the Jacobian of the inverse transformation.

$$ f_{ZW}(z,w) = f_{XY}(h_1(z,w), h_2(z,w)) \cdot |J| $$

Where \( J \) is the determinant of the Jacobian matrix of the inverse transformation \( x=h_1(z,w), y=h_2(z,w) \):

$$ J = \det \begin{bmatrix} \frac{\partial x}{\partial z} & \frac{\partial x}{\partial w} \\ \frac{\partial y}{\partial z} & \frac{\partial y}{\partial w} \end{bmatrix} $$
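As a worked sketch (the specific transformation is chosen for illustration), take \( X, Y \) i.i.d. standard normal and \( (Z, W) = (X+Y,\, X-Y) \). The inverse is \( x = (z+w)/2 \), \( y = (z-w)/2 \), whose Jacobian matrix is constant, so \( |J| = 1/2 \); the resulting joint density can be checked against the known result that \( Z \) and \( W \) are independent \( N(0, 2) \):

```python
import numpy as np
from scipy.stats import norm

# Inverse map x = (z + w)/2, y = (z - w)/2: constant Jacobian matrix
# rows are (dx/dz, dx/dw) and (dy/dz, dy/dw)
J = np.linalg.det(np.array([[0.5, 0.5],
                            [0.5, -0.5]]))  # -0.5, so |J| = 1/2

def f_xy(x, y):
    return norm.pdf(x) * norm.pdf(y)  # X, Y iid standard normal

def f_zw(z, w):
    # Transformation formula: plug the inverse into f_XY, scale by |J|
    return f_xy((z + w) / 2, (z - w) / 2) * abs(J)

# Known result: Z and W are independent N(0, 2), so the joint factors
z0, w0 = 0.7, -1.2
expected = norm.pdf(z0, scale=np.sqrt(2)) * norm.pdf(w0, scale=np.sqrt(2))
print(np.isclose(f_zw(z0, w0), expected))  # -> True
```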

7 Convolution (Sum of RVs)

If \( Z = X + Y \), the PDF of \( Z \) is found by integrating the joint PDF along the line \( x + y = z \): \( f_Z(z) = \int_{-\infty}^{\infty} f_{XY}(x, z-x) \, dx \). If \( X \) and \( Y \) are independent, this reduces to the convolution of their marginal PDFs:

$$ f_Z(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z-x) \, dx $$

Key Result: The sum of two independent Normal RVs is also Normal. $$ X \sim N(\mu_X, \sigma_X^2),\; Y \sim N(\mu_Y, \sigma_Y^2) \text{ independent} \implies X+Y \sim N(\mu_X+\mu_Y,\, \sigma_X^2+\sigma_Y^2) $$
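This key result can be confirmed numerically: evaluate the convolution integral at a point and compare with the closed-form normal density (the particular means and standard deviations below are illustrative assumptions):

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# X ~ N(1, 2^2), Y ~ N(-3, 1.5^2), independent (illustrative parameters)
mu_x, sig_x = 1.0, 2.0
mu_y, sig_y = -3.0, 1.5

def f_z(z):
    """Convolution integral f_Z(z) = int f_X(x) f_Y(z - x) dx, done numerically."""
    integrand = lambda x: norm.pdf(x, mu_x, sig_x) * norm.pdf(z - x, mu_y, sig_y)
    return integrate.quad(integrand, -np.inf, np.inf)[0]

# Should match the closed form N(mu_x + mu_y, sig_x^2 + sig_y^2)
z0 = 0.5
closed = norm.pdf(z0, mu_x + mu_y, np.sqrt(sig_x**2 + sig_y**2))
print(np.isclose(f_z(z0), closed))  # -> True
```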

Check Your Understanding

1. Independence Check

If the joint PDF is \( f_{XY}(x,y) = 4xy \) for \( 0 < x < 1, 0 < y < 1 \), are \( X \) and \( Y \) independent?

2. Correlation & Independence

If \( \rho(X,Y) = 0 \), does this imply \( X \) and \( Y \) are independent?