Singular Value Decomposition

Chapter 8 • Section 8-5

"The pinnacle of linear algebra. 🏔️"

👑 The King of Factorizations

Every matrix $A$ (even rectangular ones!) can be factored as:

$$ A = U \Sigma V^T $$
  • $U$: Orthogonal matrix (Left Singular Vectors)
  • $\Sigma$: Diagonal matrix with non-negative entries (Singular Values $\sigma_i$)
  • $V$: Orthogonal matrix (Right Singular Vectors)

This tells us that any linear transformation is just a rotation or reflection ($V^T$), followed by an axis-aligned stretch ($\Sigma$), followed by another rotation or reflection ($U$).
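A quick sanity check in NumPy (using the matrix from the worked example below): `np.linalg.svd` returns the three factors, and we can verify both the reconstruction and the orthogonality of $U$ and $V$.

```python
import numpy as np

# The matrix from the worked example below; any shape works.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])

U, s, Vt = np.linalg.svd(A)            # full SVD: U is 3x3, Vt is 2x2
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)   # embed the singular values in an m x n matrix

print(np.allclose(A, U @ Sigma @ Vt))     # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))    # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(2)))  # True: V is orthogonal
```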

SVD Existence Theorem

Let $A$ be an $m \times n$ matrix with rank $r$. Then there exist an $m \times n$ matrix $\Sigma$ as defined above, whose first $r$ diagonal entries are $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0$ (all other entries zero), and orthogonal matrices $U$ ($m \times m$) and $V$ ($n \times n$) such that $A = U \Sigma V^T$.

The singular values $\sigma_i$ are the square roots of the eigenvalues of $A^T A$.

Eigenvalue Theorem

For any $m \times n$ matrix $A$, the matrices $A^T A$ and $A A^T$ have the same nonzero eigenvalues.

This ensures that the singular values are well-defined regardless of which product we use.
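A small numerical illustration (the random matrix is chosen arbitrarily): the two products share their nonzero eigenvalues, and the larger one just picks up extra zeros.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))

# eigvalsh handles symmetric matrices; sort descending for comparison.
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 2 eigenvalues
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 3 eigenvalues

# A A^T has the same nonzero eigenvalues plus one extra (near-)zero.
print(np.allclose(eig_AtA, eig_AAt[:2]))   # True
print(eig_AAt[-1])                         # ~0
```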

🧮 Computing the SVD

To find $A = U \Sigma V^T$:

  1. Find the eigenvalues $\lambda_i$ of $A^T A$, ordered $\lambda_1 \ge \lambda_2 \ge \dots \ge 0$. The singular values are $\sigma_i = \sqrt{\lambda_i}$.
  2. Find corresponding orthonormal eigenvectors $\vec{v}_i$ of $A^T A$. These form the columns of $V$.
  3. Compute $\vec{u}_i = \frac{1}{\sigma_i} A \vec{v}_i$ for $i = 1, \dots, r$. These form the first $r$ columns of $U$.
  4. If needed, extend $\{\vec{u}_1, \dots, \vec{u}_r\}$ to an orthonormal basis for $\mathbb{R}^m$ to fill out the remaining columns of $U$ (see the sketch below).
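Here is a minimal NumPy sketch of steps 1–3 (`svd_via_eigen` is an illustrative name, not a library routine); in practice you would call `np.linalg.svd`, which avoids forming $A^T A$ explicitly:

```python
import numpy as np

def svd_via_eigen(A, tol=1e-12):
    """Steps 1-3 above. Educational sketch only: production code should
    call np.linalg.svd, which never forms A^T A explicitly."""
    # Steps 1-2: eigen-decompose the symmetric matrix A^T A.
    lam, V = np.linalg.eigh(A.T @ A)           # ascending order
    order = np.argsort(lam)[::-1]              # reorder to descending
    lam, V = lam[order], V[:, order]
    sigma = np.sqrt(np.clip(lam, 0.0, None))   # singular values

    # Step 3: u_i = (1/sigma_i) A v_i for the nonzero singular values.
    r = int(np.sum(sigma > tol))               # numerical rank
    U_r = (A @ V[:, :r]) / sigma[:r]
    return U_r, sigma[:r], V

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
U_r, sigma, V = svd_via_eigen(A)
print(sigma)                                                  # [sqrt(3), 1]
print(np.allclose(A, U_r @ np.diag(sigma) @ V[:, :len(sigma)].T))  # True
```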

Example

Let $A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{bmatrix}$.

1. $A^T A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$. Eigenvalues: 3, 1. Singular values: $\sqrt{3}, 1$.

2. Eigenvectors of $A^T A$: $\vec{v}_1 = \frac{1}{\sqrt{2}}(1, 1)$ (for $\lambda_1 = 3$), $\vec{v}_2 = \frac{1}{\sqrt{2}}(-1, 1)$ (for $\lambda_2 = 1$).

3. $\vec{u}_1 = \frac{1}{\sqrt{3}} A \vec{v}_1 = \frac{1}{\sqrt{6}}(2, 1, 1)$. $\vec{u}_2 = \frac{1}{1} A \vec{v}_2 = \frac{1}{\sqrt{2}}(0, 1, -1)$.

4. Extend $U$: Find $\vec{u}_3$ orthogonal to $\vec{u}_1, \vec{u}_2$. $\vec{u}_3 = \frac{1}{\sqrt{3}}(-1, 1, 1)$.
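We can check the hand computation numerically:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])

U = np.column_stack([
    np.array([2.0, 1.0, 1.0]) / np.sqrt(6),    # u1
    np.array([0.0, 1.0, -1.0]) / np.sqrt(2),   # u2
    np.array([-1.0, 1.0, 1.0]) / np.sqrt(3),   # u3 (the extension)
])
Sigma = np.array([[np.sqrt(3.0), 0.0],
                  [0.0,          1.0],
                  [0.0,          0.0]])
V = np.column_stack([
    np.array([1.0, 1.0]) / np.sqrt(2),    # v1
    np.array([-1.0, 1.0]) / np.sqrt(2),   # v2
])

print(np.allclose(A, U @ Sigma @ V.T))   # True: the factorization checks out
```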

🧩 The Four Fundamental Subspaces

SVD perfectly reveals the geometry of the four subspaces.

Row Space & Null Space (in $V$)

  • The first $r$ columns of $V$ span the Row Space of $A$.
  • The last $n-r$ columns of $V$ span the Null Space of $A$.

Column Space & Left Null Space (in $U$)

  • The first $r$ columns of $U$ span the Column Space (Image) of $A$.
  • The last $m-r$ columns of $U$ span the Left Null Space of $A$.
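In code, all four bases fall directly out of one call to `np.linalg.svd` (a sketch reusing the example matrix):

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
U, s, Vt = np.linalg.svd(A)        # full SVD
r = int(np.sum(s > 1e-12))         # rank (here r = 2)

row_space  = Vt[:r].T      # first r columns of V
null_space = Vt[r:].T      # last n - r columns of V (empty here: r = n)
col_space  = U[:, :r]      # first r columns of U
left_null  = U[:, r:]      # last m - r columns of U

# Sanity checks: A annihilates its null space, A^T its left null space.
print(np.allclose(A @ null_space, 0))    # True (vacuously, since r = n)
print(np.allclose(A.T @ left_null, 0))   # True
```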

❄️ Polar Decomposition

Just as a complex number $z$ can be written as $z = re^{i\theta}$ (scale $\times$ rotation), any square matrix $A$ can be factored as:

$$ A = Q S $$
  • $Q$: Orthogonal matrix (Rotation/Reflection).
  • $S$: Positive semi-definite symmetric matrix (Stretching).

We can read this off from the SVD, since $V^T V = I$: $A = U \Sigma V^T = (U V^T)(V \Sigma V^T) = Q S$.
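A minimal sketch of this construction (the matrix is an arbitrary square example):

```python
import numpy as np

# Polar decomposition from the SVD: A = (U V^T)(V Sigma V^T) = Q S.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                     # orthogonal factor (rotation/reflection)
S = Vt.T @ np.diag(s) @ Vt     # symmetric positive semi-definite factor

print(np.allclose(A, Q @ S))                    # True
print(np.allclose(Q.T @ Q, np.eye(2)))          # True: Q is orthogonal
print(np.allclose(S, S.T))                      # True: S is symmetric
print(np.all(np.linalg.eigvalsh(S) >= -1e-12))  # True: S is PSD
```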

🚀 Applications

Generalized Inverse

For matrices with no ordinary inverse (rectangular or rank-deficient), we can define the Moore–Penrose pseudoinverse $A^+$:

$$ A^+ = V \Sigma^+ U^T $$

where $\Sigma^+$ is the $n \times m$ matrix obtained by taking reciprocals of the non-zero singular values and transposing (zero entries stay zero).
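A sketch building $A^+$ from the SVD and checking it against NumPy's built-in `np.linalg.pinv`:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
U, s, Vt = np.linalg.svd(A)

# Build Sigma^+ (n x m): invert the nonzero singular values, flip the shape.
Sigma_plus = np.zeros((A.shape[1], A.shape[0]))
for i, sv in enumerate(s):
    if sv > 1e-12:
        Sigma_plus[i, i] = 1.0 / sv

A_plus = Vt.T @ Sigma_plus @ U.T
print(np.allclose(A_plus, np.linalg.pinv(A)))   # True: matches NumPy's pinv
print(np.allclose(A @ A_plus @ A, A))           # True: a Moore-Penrose identity
```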

Unit Ball Mapping

The image of the unit ball $\{\vec{x} : \|\vec{x}\| \le 1\}$ under $A$ is an ellipsoid.

  • The principal semi-axes point along the left singular vectors $\vec{u}_i$ (the semi-axis vectors are $\sigma_i \vec{u}_i$).
  • The semi-axis lengths are the singular values $\sigma_i$.
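As a numerical illustration, we can sample the unit circle, map it through the example matrix, and recover the extreme semi-axis lengths:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
U, s, Vt = np.linalg.svd(A)

# Sample the unit circle in R^2 and push it through A.
theta = np.linspace(0.0, 2.0 * np.pi, 10_000)
circle = np.vstack([np.cos(theta), np.sin(theta)])   # unit vectors x
image = A @ circle                                   # points A x (an ellipse in R^3)

lengths = np.linalg.norm(image, axis=0)
print(lengths.max(), s[0])    # longest semi-axis:  ~sigma_1 = sqrt(3)
print(lengths.min(), s[-1])   # shortest semi-axis: ~sigma_2 = 1
```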

SVD for Image Compression

Truncating to the $k$ largest singular values, $A_k = \sum_{i=1}^{k} \sigma_i \vec{u}_i \vec{v}_i^T$, gives the best rank-$k$ approximation of $A$ (the Eckart–Young theorem). Treating a grayscale image as a matrix of pixel intensities, a smaller $k$ means more compression but less detail. This is the basis of SVD image compression!

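A minimal sketch of rank-$k$ truncation (`rank_k_approx` is an illustrative helper, and the "image" here is a synthetic matrix rather than a real photo):

```python
import numpy as np

def rank_k_approx(A, k):
    """Best rank-k approximation of A (Eckart-Young): keep only the k
    largest singular values and their singular vectors."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

# Storing the rank-k factors costs k * (m + n + 1) numbers instead of m * n.
rng = np.random.default_rng(0)
img = np.outer(np.arange(64.0), np.ones(64)) + rng.standard_normal((64, 64))

for k in (1, 5, 20):
    err = np.linalg.norm(img - rank_k_approx(img, k)) / np.linalg.norm(img)
    print(f"k={k:>2}: relative error {err:.3f}")
```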

🧠 Knowledge Check

If $A$ is a symmetric positive definite matrix, how are its singular values related to its eigenvalues?