Numerical Differentiation
Estimating derivatives from discrete data: Taylor series, finite differences, and Richardson extrapolation.
Motivation
Simulating Physics
Consider a particle system (e.g., game physics). Newton's second law $\vf = \frac{d(m\vv)}{dt}$ relates force to the rate of change of momentum.
- Position: $\vp_i = [x_i, y_i, z_i]$
- Velocity: $\vv_i = \vp'_i$ (Derivative of position)
- Acceleration: $\vv'_i = \frac{1}{m_i} \vf_i$ (Derivative of velocity)
We need to compute these derivatives numerically!
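For instance, if a particle's position is sampled at a fixed timestep, differentiating the samples recovers velocity and acceleration. A minimal sketch using the central difference formula derived below, assuming positions are stored as an $n \times 3$ NumPy array (the 60 Hz timestep and parabolic trajectory are fabricated just to have data):

```python
import numpy as np

dt = 1.0 / 60.0                        # assumed 60 Hz sampling rate
t = np.arange(0.0, 1.0, dt)[:, None]   # column vector of sample times
# Fabricated trajectory: x = t, y = 2t, z = -4.9 t^2 (free fall in z)
p = np.hstack([t, 2 * t, -4.9 * t**2])

# Central differences in time: v_i ≈ (p_{i+1} - p_{i-1}) / (2 dt)
v = (p[2:] - p[:-2]) / (2 * dt)
# Differentiate again for acceleration; expect roughly [0, 0, -9.8]
a = (v[2:] - v[:-2]) / (2 * dt)
print(a[0])
```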
Problem Statement
Given function values at evenly spaced points ($f(x)$, $f(x+h)$, $f(x-h)$), how do we approximate $f'(x)$?
Taylor Series Foundation
Expanding $f$ about $x$ in steps of $\pm h$:
$f(x+h) = f(x) + h f'(x) + \frac{h^2}{2} f''(x) + \frac{h^3}{6} f'''(x) + \mathcal{O}(h^4)$
$f(x-h) = f(x) - h f'(x) + \frac{h^2}{2} f''(x) - \frac{h^3}{6} f'''(x) + \mathcal{O}(h^4)$
We can manipulate these series to isolate $f'(x)$.
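One way to sanity-check these manipulations is symbolic expansion. A sketch using SymPy (the symbol names and the choice of the forward quotient are our own):

```python
import sympy as sp

x, h = sp.symbols('x h')
f = sp.Function('f')

# Expand the forward difference quotient in powers of h around h = 0.
forward = (f(x + h) - f(x)) / h
print(sp.series(forward, h, 0, 3))
# The output is equivalent to f'(x) + (h/2) f''(x) + (h^2/6) f'''(x) + O(h^3),
# confirming the forward difference has an O(h) leading error.
```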
Finite Difference Formulas
Forward Difference
$f'(x) \approx \frac{f(x+h)-f(x)}{h}$
Error: $\mathcal{O}(h)$
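As a one-line sketch (the function name and default step size are our own choices):

```python
def forward_diff(f, x, h=1e-6):
    # O(h) approximation: uses only the current and next sample.
    return (f(x + h) - f(x)) / h
```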
Backward Difference
$f'(x) \approx \frac{f(x)-f(x-h)}{h}$
Error: $\mathcal{O}(h)$
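The backward variant is identical in spirit; a matching sketch:

```python
def backward_diff(f, x, h=1e-6):
    # O(h) approximation using only past samples, so it works in real time,
    # e.g., estimating velocity when the next frame's position is unknown.
    return (f(x) - f(x - h)) / h
```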
Central Difference
$f'(x) \approx \frac{f(x+h)-f(x-h)}{2h}$
Error: $\mathcal{O}(h^2)$
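A matching sketch of the central formula:

```python
def central_diff(f, x, h=1e-6):
    # O(h^2) approximation: symmetric samples on both sides of x.
    return (f(x + h) - f(x - h)) / (2 * h)
```

Note that the central formula needs a sample on each side of $x$, so it cannot be evaluated at the boundary of the data; the one-sided formulas above cover that case.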
Why is Central Difference Better?
Subtracting the backward expansion from the forward one cancels the even-order terms, including $\frac{h^2}{2} f''(x)$. After dividing by $2h$, the leading error is $\frac{h^2}{6} f'''(x)$, so the error is $\mathcal{O}(h^2)$, which shrinks much faster than $\mathcal{O}(h)$ as $h \to 0$.
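The difference in convergence order is easy to observe numerically. A sketch comparing the two formulas on $f(x) = \sin x$ (the test point and step sizes are arbitrary choices):

```python
import math

f, x, exact = math.sin, 1.0, math.cos(1.0)
for h in (1e-1, 5e-2, 2.5e-2):
    fwd = (f(x + h) - f(x)) / h            # O(h): error roughly halves
    ctr = (f(x + h) - f(x - h)) / (2 * h)  # O(h^2): error roughly quarters
    print(f"h={h:.4f}  forward err={abs(fwd - exact):.2e}  "
          f"central err={abs(ctr - exact):.2e}")
```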
Richardson Extrapolation
Can we get even better accuracy? Yes, by combining results computed with step sizes $h$ and $h/2$.
The Idea
Let $\phi(h)$ be the central difference approximation. We know:
$\phi(h) = f'(x) + c_2 h^2 + c_4 h^4 + \cdots$
$\phi(h/2) = f'(x) + c_2 \frac{h^2}{4} + c_4 \frac{h^4}{16} + \cdots$
Combine to Eliminate Error
Multiply the second expansion by 4, subtract the first, and divide by 3; the $h^2$ terms cancel:
$f'(x) \approx \frac{4\phi(h/2) - \phi(h)}{3}$
Error Order: $\mathcal{O}(h^4)$
Going Further
We can repeat this process! Combining $\mathcal{O}(h^4)$ estimates gives an $\mathcal{O}(h^6)$ estimate, and so on.
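A sketch of the repeated scheme, assuming $\phi$ has an error expansion in even powers of $h$ (true for the central difference); the function name and table layout are our own:

```python
import math

def richardson(phi, h, levels=3):
    # T[i][0] = phi(h / 2^i); column j cancels the h^(2j) error term via
    # T[i][j] = (4^j * T[i][j-1] - T[i-1][j-1]) / (4^j - 1).
    T = [[phi(h / 2**i)] for i in range(levels)]
    for j in range(1, levels):
        for i in range(j, levels):
            T[i].append((4**j * T[i][j-1] - T[i-1][j-1]) / (4**j - 1))
    return T[-1][-1]  # highest-order estimate, O(h^(2*levels))

# Example: differentiate sin at x = 1, starting from a coarse h = 0.5.
phi = lambda h: (math.sin(1 + h) - math.sin(1 - h)) / (2 * h)
print(richardson(phi, 0.5), "vs exact", math.cos(1.0))
```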
🧠 Knowledge Check
If you halve the step size $h$, by what factor does the error decrease for the Central Difference method?