📚 Syllabus
Chapter 1: Introduction
➡️ Lecture 1: Introduction to Numerical Methods
Numerical vs Symbolic Calculation, Core Concepts, and History.
➡️ Lecture 2: Experiments, Matlab & Randomness
Matlab environment, Big-O notation, Convergence, and Random Number Generators.
➡️ Lecture 3: Numerical Representation
Floating Point, Base Conversion, Rounding vs Truncation, and IEEE-754.
➡️ Lecture 4: IEEE Floating Point Standard
Precision, Machine Epsilon, Cancellation Error, and the Pentium Bug.
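Machine epsilon and cancellation error are easy to demonstrate; the snippet below is a small illustrative Python sketch (not course material) showing the double-precision epsilon and a badly cancelled evaluation of (1 - cos x)/x².

```python
import math

# Machine epsilon for IEEE-754 double precision: the gap between 1.0
# and the next representable float (about 2.22e-16).
eps = 1.0
while 1.0 + eps / 2 > 1.0:
    eps /= 2
print("machine epsilon:", eps, "  2**-52:", 2**-52)

# Cancellation: (1 - cos(x)) / x**2 -> 0.5 as x -> 0, but for tiny x the
# subtraction 1 - cos(x) wipes out all significant digits.
x = 1e-8
naive = (1 - math.cos(x)) / x**2          # suffers catastrophic cancellation
stable = 2 * math.sin(x / 2)**2 / x**2    # algebraically equivalent, stable
print("naive:", naive, "  stable:", stable)
```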
➡️ Lecture 5: Matrix, Vector Operations
Cost analysis, basic solution schemes, and why we never invert a matrix.
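The "never invert a matrix" point is about cost and accuracy: solving Ax = b directly via a factorization is cheaper and generally more accurate than forming A⁻¹ and multiplying. A minimal NumPy sketch of the comparison (illustrative only; the test matrix and size are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)  # comfortably nonsingular test matrix
b = rng.standard_normal(n)

# Preferred: solve Ax = b directly (LU factorization with pivoting under the hood).
x_solve = np.linalg.solve(A, b)

# Discouraged: form the explicit inverse, then multiply; more flops and,
# for ill-conditioned A, usually a larger residual.
x_inv = np.linalg.inv(A) @ b

print(np.linalg.norm(A @ x_solve - b), np.linalg.norm(A @ x_inv - b))
```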
➡️ Lecture 5b: Linear Algebra Recall
Vectors, Norms, Matrix Views, Independence, and Special Matrices.
➡️ Lecture 6: Gaussian Elimination
Solving Triangular Systems, Gaussian Elimination, LU Factorization, and the assumptions behind naive (no-pivoting) elimination.
➡️ Lecture 7: Gaussian Elimination with Pivoting
Partial Pivoting, Full Pivoting, Scaled Partial Pivoting, and Condition Number.
➡️ Lecture 8: Conditioning & Banded Systems
Error Analysis, Condition Number, Backward Stability, and Tridiagonal Solvers.
➡️ Lecture 9: Banded, LU, and Cholesky
Tridiagonal Systems, LU Decomposition, and Cholesky Factorization for SPD matrices.
➡️ Lecture 10: Sparse Matrices & Iterative Methods
Sparse Formats (COO, CSR), LSA, and Iterative Solvers.
➡️ Lecture 11: Rootfinding
Bisection, Newton's Method, and Convergence Analysis.
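For illustration, a minimal Python sketch of bisection (the function name and tolerance are my own, not the lecture's):

```python
def bisect(f, a, b, tol=1e-12):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f must change sign on [a, b]"
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:      # root lies in [a, m]
            b, fb = m, fm
        else:                 # root lies in [m, b]
            a, fa = m, fm
    return 0.5 * (a + b)

# Example: root of x^2 - 2 on [1, 2] is sqrt(2); halving the bracket each
# step gives linear convergence, about one extra bit of accuracy per step.
print(bisect(lambda x: x * x - 2, 1.0, 2.0))
```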
➡️ Lecture 12: Bisection & Newton's Method
Detailed Algorithms, Convergence Speed, and Secant Method.
➡️ Lecture 13: The Secant Method
Derivative-Free Rootfinding, Convergence Rate, and fzero.
➡️ Lecture 14: Roots of Polynomials & Interpolation
Finding zeros of complex polynomials and fitting curves to data points.
➡️ Lecture 15: Interpolation & Splines
Runge's Phenomenon, Chebyshev Nodes, and Cubic Splines.
➡️ Lecture 16: Splines & Bézier Curves
Approximation vs interpolation, Natural Splines derivation, and Bézier curves.
➡️ Lecture 17: Newton-Cotes Integration
Riemann Sums, Trapezoid Rule, and Simpson's Rule.
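A small illustrative sketch (my own, not course code) of the composite trapezoid and Simpson rules, applied to ∫₀^π sin x dx = 2:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on n equal subintervals (error O(h^2))."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

def simpson(f, a, b, n):
    """Composite Simpson's rule; n must be even (error O(h^4))."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return h * s / 3

# Exact value is 2; Simpson converges much faster for this smooth integrand.
print(trapezoid(math.sin, 0, math.pi, 16), simpson(math.sin, 0, math.pi, 16))
```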
➡️ Lecture 18: Gauss Quadrature
Optimal node placement, Legendre Polynomials, and Degree of Precision.
➡️ Lecture 19: Numerical Differentiation
Finite Difference formulas, Taylor Series, and Richardson Extrapolation.
➡️ Lecture 20: Iterative Methods
Jacobi, Gauss-Seidel, and Conjugate Gradients for solving Ax=b.
➡️ Lecture 21: Iterative Methods, Google & Monte Carlo
Conjugate Gradients, PageRank, and Monte Carlo integration.
➡️ Lecture 22: Google, Markov Chains, Power Method, SVD
PageRank implementation, Power Method for eigenvalues, and SVD decomposition.
➡️ Lecture 23: Least Squares and SVD
Curve Fitting, Normal Equations, QR, and SVD approaches to least squares.
➡️ Lecture 24: Monte Carlo Methods
Integration, Probability, Randomness, and Central Limit Theorem implications.
➡️ Lecture 25: Randomness
Probability, Monte Carlo Integration, and Random Number Generators (LCG).
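A minimal LCG sketch in Python; the multiplier and modulus below are the classic Park-Miller "minimal standard" constants, used purely as an illustration:

```python
def lcg(seed, a=16807, m=2**31 - 1):
    """Park-Miller 'minimal standard' LCG: x_{k+1} = a * x_k mod m."""
    x = seed
    while True:
        x = (a * x) % m
        yield x / m          # roughly uniform sample in (0, 1)

# Crude Monte Carlo estimate of pi using pairs of LCG samples.
g = lcg(seed=12345)
n = 100_000
inside = sum((next(g)**2 + next(g)**2) <= 1.0 for _ in range(n))
print(4 * inside / n)
```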
➡️ Lecture 26: Fast Fourier Transform (FFT)
DFT, Recursive FFT Algorithm, Signal Processing, and Complexity Analysis.
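A compact recursive radix-2 FFT sketch (illustrative only; assumes the input length is a power of two), showing the O(n log n) divide-and-conquer structure:

```python
import cmath

def fft(x):
    """Recursive radix-2 FFT of a sequence whose length is a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])                      # DFT of even-indexed samples
    odd = fft(x[1::2])                       # DFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

# Example: the FFT of an impulse [1, 0, 0, ...] is all ones.
print(fft([1, 0, 0, 0, 0, 0, 0, 0]))
```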