Linear Algebra Recall
Vectors, Norms, Matrix Views, and Special Matrices. A deep dive into the building blocks.
1 Vector Operations
Basic Arithmetic
Operations are element-by-element.
- Addition: $c = a + b \iff c_i = a_i + b_i$
- Scaling: $b = \sigma a \iff b_i = \sigma a_i$
- Transpose: $u = [1, 2, 3] \implies u^T = \begin{bmatrix}1\\2\\3\end{bmatrix}$
- Linear Combination: $z = \alpha u + \beta v$
Inner Product ($u \cdot v$)
Scalar result. Geometric interpretation:
$u^T v = \sum_i u_i v_i = \|u\|_2 \|v\|_2 \cos\theta$, where $\theta$ is the angle between $u$ and $v$.
Commutative: $u^T v = v^T u$.
Outer Product ($u v^T$)
Matrix result (Rank-1). Element $A_{ij} = u_i v_j$.
MATLAB Implementation
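The original code listing is not reproduced here; the following is a minimal MATLAB sketch of the operations above, with the vectors `u` and `v` chosen purely for illustration.

```matlab
% Example vectors (illustrative values only)
u = [1; 2; 3];
v = [4; 5; 6];

c = u + v;            % addition: c_i = u_i + v_i
b = 2 * u;            % scaling by sigma = 2
ut = u';              % transpose: column vector -> row vector
z = 0.5*u + 1.5*v;    % linear combination

s = u' * v;           % inner product (scalar result)
A = u * v';           % outer product (3x3 matrix, rank 1)
```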
2 Vector Norms
Norms allow us to compare the "size" or "magnitude" of vectors, similar to absolute value for scalars.
- $L_1$ Norm (Manhattan Distance): sum of absolute values, $\|x\|_1 = \sum_i |x_i|$.
- $L_2$ Norm (Euclidean Distance): standard geometric length, $\|x\|_2 = \sqrt{x^T x}$.
- $L_\infty$ Norm (Max Norm): magnitude of the largest element, $\|x\|_\infty = \max_i |x_i|$.
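As a quick illustration, MATLAB's built-in `norm` computes all three; the vector `x` below is just an example:

```matlab
x = [3; -4; 0];
n1   = norm(x, 1);     % L1 norm:   |3| + |-4| + |0| = 7
n2   = norm(x, 2);     % L2 norm:   sqrt(9 + 16) = 5
ninf = norm(x, Inf);   % Linf norm: max(|x_i|) = 4
```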
Comparing Vectors
To check whether $y \approx z$, we compare the relative error, measured in a norm, against a tolerance:
$\frac{\|y - z\|}{\|z\|} \le \text{tol}$
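A minimal MATLAB sketch of this check, with the vectors and the tolerance `tol` chosen arbitrarily for illustration:

```matlab
z = [1; 2; 3];
y = z + 1e-10*[1; -1; 1];            % y is a tiny perturbation of z
tol = 1e-8;                          % illustrative tolerance
relerr = norm(y - z) / norm(z);      % relative error in the 2-norm
isClose = relerr <= tol;             % true here: y approximately equals z
```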
3 Matrix Views
Understanding matrix-vector multiplication $Ax=b$ in different ways is crucial for intuition.
Column View
"Linear Combination of Columns"
The vector $b$ is a linear combination of the columns of $A$, with the entries of $x$ as weights. This reveals the range (column space) of $A$.
Row View
"Dot Product with Rows"
Each element $b_i$ is the dot product of the $i$-th row of $A$ with $x$. Good for hand calculations.
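A short MATLAB sketch contrasting the two views for an example $2 \times 2$ matrix (values chosen only for illustration); both reproduce `A*x`:

```matlab
A = [1 2; 3 4];
x = [5; 6];

b_col = x(1)*A(:,1) + x(2)*A(:,2);   % column view: combine the columns of A
b_row = [A(1,:)*x; A(2,:)*x];        % row view: dot each row of A with x
b     = A*x;                         % both agree with this: [17; 39]
```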
4 Independence & Rank
Linear Independence
A set of vectors $\{v_1, \dots, v_n\}$ is linearly independent if the only solution to
$\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_n v_n = 0$
is the trivial solution $\alpha_1 = \dots = \alpha_n = 0$.
Matrix Rank
The rank of a matrix $A$ is the number of linearly independent columns (equivalently, rows). It measures the dimension of the range of $A$.
- $\operatorname{rank}(A) \le \min(m, n)$ for an $m \times n$ matrix.
- Full Rank: $\operatorname{rank}(A) = \min(m, n)$, the maximum possible.
- Rank Deficient: $\operatorname{rank}(A) < \min(m, n)$; the columns are linearly dependent.
Numerical Rank is Tricky!
Tiny floating-point rounding errors can make a singular (rank-deficient) matrix appear non-singular, so numerical rank is determined by counting singular values above a tolerance.
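A minimal MATLAB sketch, using an example matrix whose second column is an exact multiple of the first (values are illustrative):

```matlab
v1 = [1; 2; 3];
v2 = 2*v1;                 % exactly dependent on v1
A  = [v1, v2];

r = rank(A);               % returns 1: only one independent column
% rank(A) counts singular values above a default tolerance,
% which guards against tiny floating-point errors.
```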
5 Special Matrices
Identity ($I$)
Like multiplying by 1. $AI = A$. Created with `eye(n)`.
Diagonal
Non-zeros only on main diagonal. Created with `diag(v)`.
Symmetric
Mirror image across the diagonal: $A = A^T$. $A^T A$ is always symmetric.
Orthogonal ($Q$)
Columns are orthonormal, so $Q^T Q = I$. Preserves norms: $\|Qx\|_2 = \|x\|_2$.
Tridiagonal
Common in 1D Finite Difference problems. Sparse storage.
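A short MATLAB sketch constructing these matrices and checking the norm-preservation property of an orthogonal $Q$; the sizes and values are illustrative, and `Q` is obtained from a QR factorization of a random matrix.

```matlab
n = 4;
I = eye(n);                      % identity
D = diag([1 2 3 4]);             % diagonal
B = rand(n);  S = B'*B;          % B'*B is always symmetric
[Q, ~] = qr(rand(n));            % orthogonal factor from a QR factorization
T = diag(2*ones(n,1)) ...        % tridiagonal: main diagonal
  + diag(-ones(n-1,1), 1) ...    % superdiagonal
  + diag(-ones(n-1,1), -1);      % subdiagonal

x = rand(n, 1);
norm(Q*x, 2) - norm(x, 2)        % ~0: Q preserves the 2-norm
```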