Approximation

Roots of Polynomials & Interpolation

Finding zeros of complex polynomials and fitting curves to data points.


Roots of Polynomials

Complications

  • Repeated Roots: Slow convergence of iterative methods (e.g., Newton's method degrades to linear convergence at a double root).
  • Complex Roots: Require complex arithmetic.
  • Sensitivity: Roots can be highly sensitive to small changes in coefficients (Conditioning).
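The sensitivity issue can be demonstrated numerically. Below is a Python/NumPy sketch (the slides use MATLAB, so `np.poly`/`np.roots` stand in for MATLAB's `poly`/`roots`) using the classic Wilkinson-style example $(x-1)(x-2)\cdots(x-10)$: a relative perturbation of $10^{-6}$ in one coefficient moves the roots by far more than $10^{-6}$.

```python
import numpy as np

# Conditioning demo: tiny coefficient change, large root change.
exact_roots = np.arange(1, 11)
c = np.poly(exact_roots)        # coefficients of (x-1)(x-2)...(x-10)

c_perturbed = c.copy()
c_perturbed[1] *= 1 + 1e-6      # relative change of 1e-6 in the x^9 coefficient

moved = np.roots(c_perturbed)

# For each exact root, distance to the nearest perturbed root.
dist = np.abs(moved.reshape(-1, 1) - exact_roots.reshape(1, -1))
print("max root shift:", dist.min(axis=0).max())   # orders of magnitude above 1e-6
```

The shift is large because the roots' condition numbers with respect to the coefficients are huge, even though the roots themselves are well separated.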
[Figure: Polynomial Roots Plot]

Companion Matrix Method

MATLAB's roots function doesn't use Newton's method. It finds the eigenvalues of a special matrix!

\[ c_1 x^n + \dots + c_n x + c_{n+1} = 0 \implies \text{Eigenvalues of } A \] \[ A = \begin{bmatrix} 0 & 1 & 0 & \dots \\ 0 & 0 & 1 & \dots \\ \vdots & & & \ddots \\ -c_{n+1}/c_1 & -c_n/c_1 & \dots & -c_2/c_1 \end{bmatrix} \]
% MATLAB Example
c = [1 -3 2];   % x^2 - 3x + 2
r = roots(c);   % Returns [2; 1]
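The same companion-matrix construction can be carried out by hand; here is a Python/NumPy sketch (an assumed translation of the MATLAB example, not MATLAB's actual internals) that builds the matrix $A$ from the slide and recovers the roots as its eigenvalues:

```python
import numpy as np

# x^2 - 3x + 2: coefficients [c_1, c_2, c_3] in descending powers.
c = np.array([1.0, -3.0, 2.0])
n = len(c) - 1

# Companion matrix as on the slide: ones on the superdiagonal,
# last row = [-c_{n+1}/c_1, -c_n/c_1, ..., -c_2/c_1].
A = np.zeros((n, n))
A[:-1, 1:] = np.eye(n - 1)
A[-1, :] = -c[:0:-1] / c[0]     # trailing coefficients reversed, scaled by c_1

roots = np.sort(np.linalg.eigvals(A).real)
print(roots)                    # eigenvalues of A = roots of the polynomial: 1 and 2
```

The characteristic polynomial of $A$ is exactly the original polynomial (up to scaling by $c_1$), which is why its eigenvalues are the roots.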

Interpolation: The Goal

Given $n+1$ points $(x_0, y_0), \dots, (x_n, y_n)$, find a polynomial $p(x)$ of degree $n$ such that $p(x_i) = y_i$.

Uniqueness Theorem

If points $x_0, \dots, x_n$ are distinct, there is a unique polynomial $p(x)$ of degree at most $n$ that interpolates the data.


Monomial Basis

The "obvious" approach: $p(x) = a_0 + a_1 x + \dots + a_n x^n$.

Vandermonde Matrix

\[ \begin{bmatrix} 1 & x_0 & x_0^2 & \dots \\ 1 & x_1 & x_1^2 & \dots \\ \vdots & \vdots & \vdots & \ddots \\ 1 & x_n & x_n^2 & \dots \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ \vdots \end{bmatrix} = \begin{bmatrix} y_0 \\ y_1 \\ \vdots \end{bmatrix} \]

Problem: This matrix is often severely ill-conditioned — its condition number grows rapidly with $n$, so the computed coefficients lose accuracy.
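A Python/NumPy sketch of the monomial-basis approach (assumed data points for illustration): `np.vander(..., increasing=True)` produces exactly the rows $[1, x_i, x_i^2, \dots]$ shown above, and the condition number of the system grows quickly with $n$ for equally spaced points.

```python
import numpy as np

# Solve the Vandermonde system V a = y for the monomial coefficients.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 0.0, 5.0])

V = np.vander(x, increasing=True)   # rows [1, x_i, x_i^2, x_i^3]
a = np.linalg.solve(V, y)           # coefficients a_0, ..., a_n
print(np.allclose(V @ a, y))        # interpolation conditions hold: True

# The conditioning problem: equally spaced points on [0, 1].
for n in (5, 10, 15):
    xs = np.linspace(0.0, 1.0, n + 1)
    print(n, np.linalg.cond(np.vander(xs, increasing=True)))
```

The printed condition numbers grow by orders of magnitude as $n$ increases, which is why the monomial basis is avoided for all but small problems.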


Lagrange Interpolation

Construct the polynomial as a sum of basis functions $\ell_k(x)$ where $\ell_k(x_i) = 1$ if $i=k$ and $0$ otherwise.

\[ p(x) = \sum_{k=0}^n y_k \ell_k(x), \quad \text{where } \ell_k(x) = \prod_{i \ne k} \frac{x - x_i}{x_k - x_i} \]

Pros

  • Conceptually simple.
  • No system of equations to solve.

Cons

  • Expensive to evaluate ($\mathcal{O}(n^2)$ per evaluation point).
  • Hard to add a new data point (every basis function $\ell_k$ must be rebuilt).
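The formula above translates directly into code. A Python sketch (assumed data points for illustration) that evaluates $p(x) = \sum_k y_k \ell_k(x)$ term by term:

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolant at x: O(n^2) work per point."""
    p = 0.0
    for k in range(len(xs)):
        # Basis function l_k(x) = prod_{i != k} (x - x_i) / (x_k - x_i)
        lk = 1.0
        for i in range(len(xs)):
            if i != k:
                lk *= (x - xs[i]) / (xs[k] - xs[i])
        p += ys[k] * lk
    return p

xs = [0.0, 1.0, 2.0]
ys = [1.0, 3.0, 2.0]

# l_k(x_i) is 1 at i = k and 0 otherwise, so p reproduces the data:
print([lagrange_eval(xs, ys, xi) for xi in xs])   # [1.0, 3.0, 2.0]
print(lagrange_eval(xs, ys, 0.5))                 # 2.375
```

Note the nested loops: each evaluation point costs $\mathcal{O}(n^2)$ operations, and adding a point $(x_{n+1}, y_{n+1})$ changes every $\ell_k$ — the two cons listed above.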

Newton Form & Divided Differences

The "Smart" approach. Uses a basis that allows for easy updates and efficient evaluation (Horner's method).

\[ p_n(x) = a_0 + a_1(x-x_0) + a_2(x-x_0)(x-x_1) + \dots \]

Divided Difference Table

Recursive calculation of coefficients $a_k$.

$x_0$   $y_0$
            $f[x_0, x_1] = \dfrac{y_1 - y_0}{x_1 - x_0}$
$x_1$   $y_1$

Interactive Example

Points: $(1,3), (1.5, 3.25), (0,3), (2, 1.67)$

x     f[·]    f[·,·]    f[·,·,·]   f[·,·,·,·]
1     3
1.5   3.25    0.5
0     3       0.1667    0.3333
2     1.67   −0.665    −1.6633    −1.9967

Coefficients are the top diagonal: $a_0 = 3$, $a_1 = 0.5$, $a_2 = 0.3333$, $a_3 = -1.9967$.
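The table can be built and the Newton form evaluated in a few lines; here is a Python/NumPy sketch using the example's points (the in-place update overwrites each column of the table, keeping only the top diagonal):

```python
import numpy as np

xs = np.array([1.0, 1.5, 0.0, 2.0])
ys = np.array([3.0, 3.25, 3.0, 1.67])

# Divided differences: after pass j, coef[k] holds f[x_{k-j}, ..., x_k];
# at the end, coef[k] = f[x_0, ..., x_k] -- the table's top diagonal.
coef = ys.copy()
for j in range(1, len(xs)):
    coef[j:] = (coef[j:] - coef[j-1:-1]) / (xs[j:] - xs[:-j])

def newton_eval(xs, coef, x):
    """Horner-style nested evaluation of the Newton form: O(n) per point."""
    p = coef[-1]
    for k in range(len(coef) - 2, -1, -1):
        p = p * (x - xs[k]) + coef[k]
    return p

print(coef)   # a_0, a_1, a_2, a_3: 3, 0.5, 0.3333, -1.9967 as in the table
print([newton_eval(xs, coef, xi) for xi in xs])   # reproduces the y data
```

Adding a fifth point would only append one row to the table and one coefficient $a_4$ — the earlier coefficients are untouched, unlike in the Lagrange form.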

🧠 Final Challenge +20 XP

Why is the Newton form often preferred over Lagrange for interpolation?