System Analysis & Efficiency

Conditioning & Banded Systems

Understanding accuracy limits, the condition number, and optimizing for special matrix structures.

🎯 Lecture Goals

  • 1 Accuracy: How do we know if Gaussian elimination (GE) results are good? (Condition Number).
  • 2 Efficiency: Can we beat $\mathcal{O}(n^3)$ for special systems? (Tridiagonal, Banded).

Key Insight

A small residual ($r = b - A\hat{x}$) does NOT always mean a small error ($e = x_{\text{true}} - \hat{x}$).

The Condition Number is the bridge.

1. Geometric Singularity

Visualizing linear systems as intersecting lines helps us understand ill-conditioning.

Robust System

Lines intersect cleanly at a large angle.

2x + y = 6
-0.5x + y = 1
rank(A) = 2

Ill-Conditioned

Lines are nearly parallel. Tiny changes to the lines move the intersection point dramatically.

2x + y = 6
(2+δ)x + y = 6+δ

Numerically indistinguishable from singular.
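A quick NumPy experiment makes this concrete (a sketch; the perturbation size $\delta = 10^{-10}$ is an illustrative choice). Nudging the right-hand side of the nearly-parallel system by about $10^{-10}$ moves the intersection point by order one:

import numpy as np

delta = 1e-10                                   # tiny tilt between the two lines (illustrative)
A = np.array([[2.0,         1.0],
              [2.0 + delta, 1.0]])
b = np.array([6.0, 6.0 + delta])

x1 = np.linalg.solve(A, b)                      # intersection of the nearly-parallel lines
x2 = np.linalg.solve(A, b + np.array([0.0, delta]))   # nudge one right-hand side slightly

print("original solution :", x1)                # approximately [1, 4]
print("perturbed solution:", x2)                # approximately [2, 2]: a huge jump
print("kappa(A)          :", np.linalg.cond(A)) # roughly 1e11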

2. Condition Number $\kappa(A)$

The sensitivity of the solution $x$ to perturbations in $b$ or $A$ is bounded by the condition number.

$\kappa(A) = \|A\| \|A^{-1}\|$

Range: $[1, \infty)$

Perturbation Theory

If we perturb $b$ by $\delta b$, how much does $x$ change?

$\frac{\|\delta x\|}{\|x\|} \le \kappa(A) \frac{\|\delta b\|}{\|b\|}$

$\kappa(A)$ acts as an error amplifier.
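A minimal NumPy check of this bound (the test matrix, seed, and perturbation size are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
b = rng.standard_normal(5)
db = 1e-8 * rng.standard_normal(5)              # small perturbation of the right-hand side

x = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x

lhs = np.linalg.norm(dx) / np.linalg.norm(x)
rhs = np.linalg.cond(A) * np.linalg.norm(db) / np.linalg.norm(b)
print(f"relative change in x        : {lhs:.2e}")
print(f"kappa * relative change in b: {rhs:.2e}")   # lhs should not exceed rhs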

Digits of Precision

How many trustworthy digits $d$ do we have?

$d \approx |\log_{10}(\epsilon_{\text{mach}})| - \log_{10}(\kappa(A))$

Example: $\epsilon_{\text{mach}} \approx 10^{-16}$. If $\kappa(A) = 10^{10}$, you have only $\approx 6$ correct digits!
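The same bookkeeping in code, applied to the nearly-parallel system from the geometric example (a sketch; np.finfo(float).eps plays the role of $\epsilon_{\text{mach}}$):

import numpy as np

eps_mach = np.finfo(float).eps                        # about 2.2e-16
A = np.array([[2.0, 1.0], [2.0 + 1e-10, 1.0]])        # nearly-parallel system from earlier

kappa = np.linalg.cond(A)                             # roughly 1e11 for this matrix
digits = abs(np.log10(eps_mach)) - np.log10(kappa)
print(f"kappa(A)           = {kappa:.2e}")
print(f"trustworthy digits ~ {digits:.1f}")           # only about 4 to 5 digits survive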

3. Computational Stability

Backward Stability

"[Backward stable algorithms] give exactly the right answer to nearly the right question." — Trefethen & Bau

Gaussian Elimination with Partial Pivoting is backward stable. It solves $(A+E)\hat{x} = b$ where $\|E\|$ is small relative to $\|A\|$.

The Residual Trap

We can calculate the residual $r = b - A\hat{x}$, but it only checks whether the equation balances, not whether $\hat{x}$ is close to the true solution.

$\frac{\|\hat{x} - x\|}{\|x\|} \le \kappa(A) \frac{\|r\|}{\|b\|}$

WARNING: If $\kappa(A)$ is large, a tiny residual can still mean a massive error in $x$.
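A classic demonstration uses the Hilbert matrix, here via SciPy's hilbert helper (the size $n = 12$ is an illustrative choice): the residual sits near machine precision while the error in $\hat{x}$ is many orders of magnitude larger.

import numpy as np
from scipy.linalg import hilbert

n = 12
A = hilbert(n)                                  # kappa(A) is on the order of 1e16
x_true = np.ones(n)
b = A @ x_true                                  # manufacture a problem with a known solution

x_hat = np.linalg.solve(A, b)

rel_residual = np.linalg.norm(b - A @ x_hat) / np.linalg.norm(b)
rel_error = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"kappa(A)          = {np.linalg.cond(A):.2e}")
print(f"relative residual = {rel_residual:.2e}")   # tiny: GE with pivoting is backward stable
print(f"relative error    = {rel_error:.2e}")      # far larger: kappa amplifies it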

4. MATLAB's Backslash Operator

In MATLAB or Python (NumPy), we rarely implement these algorithms by hand; instead we call smart "poly-algorithms".

x = A \ b % The Magic Operator

What happens under the hood?

1 Is $A$ triangular? → Use Forward/Back substitution.
2 Is $A$ Symmetric Positive Definite? → Try Cholesky Factorization.
3 Otherwise → Use LU Factorization (Gaussian Elimination) with Partial Pivoting.
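A rough Python/SciPy sketch of the same dispatch idea (the function backslash_sketch and its structure are illustrative, not MATLAB's actual implementation):

import numpy as np
from scipy.linalg import (solve_triangular, cho_factor, cho_solve,
                          lu_factor, lu_solve, LinAlgError)

def backslash_sketch(A, b):
    """Illustrative poly-algorithm dispatch, loosely mimicking backslash."""
    if np.allclose(A, np.tril(A)):                  # lower triangular?
        return solve_triangular(A, b, lower=True)   # forward substitution
    if np.allclose(A, np.triu(A)):                  # upper triangular?
        return solve_triangular(A, b, lower=False)  # back substitution
    if np.allclose(A, A.T):                         # symmetric: try Cholesky
        try:
            return cho_solve(cho_factor(A), b)      # works only if positive definite
        except LinAlgError:
            pass                                    # not positive definite: fall through
    lu, piv = lu_factor(A)                          # general case: LU with partial pivoting
    return lu_solve((lu, piv), b)

# quick check against the general-purpose solver
A = np.array([[4.0, 1.0], [1.0, 3.0]])              # symmetric positive definite
b = np.array([1.0, 2.0])
print(backslash_sketch(A, b), np.linalg.solve(A, b))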
5. Special Systems: Tridiagonal

Many physics and engineering problems connect only neighbors (e.g., springs, heat flow). This creates sparse, banded matrices.

\[ A = \begin{bmatrix} d_1 & c_1 & 0 & \dots \\ a_1 & d_2 & c_2 & \dots \\ 0 & a_2 & d_3 & \dots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix} \]

Storage Advantage

Full matrix: $n^2$ doubles.
Tridiagonal: $3n-2$ doubles.
Huge savings for large $n$.

Speed Advantage

Standard GE: $\mathcal{O}(n^3)$
Tridiagonal GE: $\mathcal{O}(n)$ (Linear time!)

Thomas Algorithm (Tridiagonal Solver)

for i = 2:n
    m = a(i-1) / d(i-1);
    d(i) = d(i) - m * c(i-1);
    b(i) = b(i) - m * b(i-1);
end
% Back substitution follows similarly...
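A self-contained Python sketch of the full algorithm, including the back substitution (the function name thomas_solve and the test problem are illustrative; like any pivot-free Thomas solver it assumes a well-behaved, e.g. diagonally dominant, matrix):

import numpy as np

def thomas_solve(a, d, c, b):
    """Solve a tridiagonal system. a: sub-diagonal (n-1), d: diagonal (n),
    c: super-diagonal (n-1), b: right-hand side (n). O(n) work and storage."""
    d, b = d.astype(float).copy(), b.astype(float).copy()
    n = len(d)
    for i in range(1, n):                           # forward elimination
        m = a[i-1] / d[i-1]
        d[i] -= m * c[i-1]
        b[i] -= m * b[i-1]
    x = np.empty(n)
    x[-1] = b[-1] / d[-1]                           # back substitution
    for i in range(n - 2, -1, -1):
        x[i] = (b[i] - c[i] * x[i+1]) / d[i]
    return x

# small check against the dense O(n^3) solver
n = 6
a = np.full(n - 1, -1.0); c = np.full(n - 1, -1.0); d = np.full(n, 2.0)
b = np.arange(1.0, n + 1)
A = np.diag(d) + np.diag(a, -1) + np.diag(c, 1)     # full matrix built only to verify the result
print(np.allclose(thomas_solve(a, d, c, b), np.linalg.solve(A, b)))   # True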

🧠 Final Challenge +20 XP

You solve $Ax=b$ and get a tiny residual $r \approx 10^{-16}$. However, $\kappa(A) \approx 10^{14}$. Is your solution accurate?

6. Interactive Playground

Investigate Ill-Conditioned Matrices and the Hilbert Matrix. You can run MATLAB code via Octave Online or Python code directly in your browser.

MATLAB / Octave (Online)

Copy & Paste into Terminal:

Python (Client-Side)
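One possible starting point for the investigation, assuming a NumPy-capable environment (the loop bounds and the manual Hilbert construction are illustrative choices):

import numpy as np

# kappa of the n x n Hilbert matrix H[i, j] = 1 / (i + j + 1) grows explosively with n
for n in range(4, 13, 2):
    H = 1.0 / (np.arange(1, n + 1)[:, None] + np.arange(n))
    kappa = np.linalg.cond(H)
    digits = abs(np.log10(np.finfo(float).eps)) - np.log10(kappa)
    print(f"n = {n:2d}   kappa = {kappa:9.2e}   trustworthy digits ~ {max(digits, 0.0):4.1f}")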
