1. Why Numerical Calculus? 🤔
Slides 1-4: The Motivation
- Black Box Functions: You can't analytically differentiate a simulation or experiment.
- Complexity: Formulas like $\frac{d}{dx}(x \sin(x^2)\ln(x))$ are error-prone by hand.
- Discrete Data: Real-world data arrives as discrete points, not closed-form equations.
🤖
"Computers are bad at limits ($h \to 0$), but excellent at loops."
2. Numerical Differentiation
Slides 5-9: Forward Difference
$$ f'(x) \approx \frac{f(x+h) - f(x)}{h} $$
Looks simple, but it suffers from the Step Size Dilemma: too large an $h$ gives a poor approximation (truncation error); too small an $h$ amplifies round-off noise.
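The dilemma is easy to see in code. A minimal sketch (function and variable names are mine, not from the slides), differentiating $\sin$ at $x = 1$ where the exact answer is $\cos(1)$:

```python
import math

def forward_diff(f, x, h):
    """Forward-difference estimate of f'(x); O(h) truncation error."""
    return (f(x + h) - f(x)) / h

x, exact = 1.0, math.cos(1.0)
for h in (1e-1, 1e-4, 1e-12):
    est = forward_diff(math.sin, x, h)
    # Error first shrinks with h, then grows again once round-off dominates.
    print(f"h = {h:.0e}  error = {abs(est - exact):.2e}")
```

Running this shows the error improving from $h = 10^{-1}$ to $h = 10^{-4}$, then degrading at $h = 10^{-12}$, where the subtraction $f(x+h) - f(x)$ loses almost all significant digits.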
3. Central Difference 🎯
Slides 10-13
Look both ways! Evaluating symmetrically on both sides of $x$ cancels the leading error term, improving accuracy.
$$ f'(x) \approx \frac{f(x+h) - f(x-h)}{2h} $$
Error drops from $O(h)$ to $O(h^2)$.
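The same sketch with a symmetric stencil (names are mine) shows the accuracy gain at identical step size:

```python
import math

def central_diff(f, x, h):
    """Central-difference estimate of f'(x); O(h^2) truncation error."""
    return (f(x + h) - f(x - h)) / (2 * h)

# At h = 0.1, the central error (~1e-3) beats the forward error (~4e-2)
# for f = sin at x = 1, without any extra cleverness.
h = 0.1
central_err = abs(central_diff(math.sin, 1.0, h) - math.cos(1.0))
forward_err = abs((math.sin(1.0 + h) - math.sin(1.0)) / h - math.cos(1.0))
print(central_err, forward_err)
```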
4. Richardson Extrapolation 🚀
Slides 14-17
Combine two estimates taken at different step sizes to cancel the leading error term, turning two mediocre answers into one accurate one.
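Applied to the central difference, one Richardson step cancels the $O(h^2)$ term and leaves an $O(h^4)$ estimate. A minimal sketch under that assumption (function names are mine):

```python
import math

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson(f, x, h):
    """One Richardson step: combine estimates at h and h/2.

    Halving h divides the O(h^2) error by 4, so the weighted
    combination (4*D(h/2) - D(h)) / 3 cancels it exactly.
    """
    d_h = central_diff(f, x, h)
    d_half = central_diff(f, x, h / 2)
    return (4 * d_half - d_h) / 3  # leading error is now O(h^4)
```

For $f = \sin$ at $x = 1$ with $h = 0.1$, this takes the error from roughly $10^{-3}$ down to below $10^{-5}$.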
5. Numerical Integration 🧱
Slides 31-40: Trapezoidal Rule
Connect the dots with straight lines and sum the areas of the resulting trapezoids.
$$ \int_a^b f(x) dx \approx \frac{h}{2} [f(x_0) + 2f(x_1) + \dots + 2f(x_{n-1}) + f(x_n)] $$
5.1 Trapezoidal Implementation
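A minimal sketch of the composite rule above (the function name is mine, not from the slides):

```python
def trapezoid(f, a, b, n):
    """Composite trapezoidal rule over [a, b] with n equal subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))   # endpoints get weight 1/2
    for i in range(1, n):
        total += f(a + i * h)     # interior points get weight 1
    return h * total

# Example: integral of x^2 over [0, 1] is exactly 1/3.
print(trapezoid(lambda x: x * x, 0.0, 1.0, 100))
```

The rule is exact for linear functions, and for smooth $f$ the error shrinks as $O(h^2)$.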
Slides 31-40 (Cont.)
5.2 Simpson's Rule 🍩
Slides 31-40 (Cont.): Simpson's Rule
Connect the dots with parabolas instead of straight lines. Each pair of subintervals is weighted 1, 4, 1.
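In the composite form, the 1-4-1 pattern overlaps so interior points alternate weights 4 and 2. A minimal sketch (names are mine):

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule over [a, b]; n must be even."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    total = f(a) + f(b)                       # endpoints: weight 1
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)  # interior: 4, 2, 4, ...
    return h / 3 * total

# Example: integral of sin over [0, pi] is exactly 2.
print(simpson(math.sin, 0.0, math.pi, 10))
```

Because each parabola matches $f$ at three points, the rule is exact for polynomials up to degree 3 and its error shrinks as $O(h^4)$.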
6. Advanced: Gaussian & SciPy 🧠
Slides 43-49: Gaussian Quadrature
Pick smart points $x_i$ and weights $w_i$: an $n$-point Gaussian rule is exact for all polynomials up to degree $2n - 1$.
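NumPy ships the Gauss-Legendre nodes and weights, so a sketch only needs the affine map from the reference interval $[-1, 1]$ to $[a, b]$ (the wrapper function is mine, not from the slides):

```python
import numpy as np

def gauss_legendre(f, a, b, n):
    """n-point Gauss-Legendre quadrature on [a, b]."""
    x, w = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
    t = 0.5 * (b - a) * x + 0.5 * (b + a)      # map nodes to [a, b]
    return 0.5 * (b - a) * np.sum(w * f(t))    # Jacobian of the map is (b-a)/2

# Just 3 points integrate any quintic exactly, e.g. x^5 on [0, 1] -> 1/6.
print(gauss_legendre(lambda t: t**5, 0.0, 1.0, 3))
```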
SciPy `quad`
The industry standard: adaptive, fast, and robust.
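Typical usage: `scipy.integrate.quad` takes a callable and the limits, and returns both the value and an estimate of the absolute error.

```python
import numpy as np
from scipy.integrate import quad

# Integral of sin over [0, pi] is exactly 2; quad adapts its
# subdivision automatically and reports an error estimate.
value, err = quad(np.sin, 0, np.pi)
print(value, err)
```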