Vectors & Matrices

The fundamental building blocks of linear algebra and modern computation.

Vectors

A vector is an ordered list of numbers representing magnitude and direction. In ℝⁿ, vectors can represent points, displacements, or abstract data.

Vector: v = (v₁, v₂, …, vₙ)
Magnitude: ‖v‖ = √(v₁² + v₂² + … + vₙ²)
Dot product: u·v = u₁v₁ + u₂v₂ + … + uₙvₙ
cos θ = (u·v)/(‖u‖·‖v‖)

The magnitude formula generalizes the distance formula from coordinate geometry. The dot product connects to trigonometric functions via the angle between vectors.
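The formulas above translate directly into code. A minimal pure-Python sketch (function names are illustrative, not from any particular library):

```python
import math

def dot(u, v):
    # Dot product: u·v = u₁v₁ + u₂v₂ + … + uₙvₙ
    return sum(a * b for a, b in zip(u, v))

def magnitude(v):
    # ‖v‖ = √(v·v), the distance formula generalized to n dimensions
    return math.sqrt(dot(v, v))

def angle_between(u, v):
    # θ recovered from cos θ = (u·v)/(‖u‖·‖v‖)
    return math.acos(dot(u, v) / (magnitude(u) * magnitude(v)))

u, v = (3, 4), (4, 3)
print(magnitude(u))  # 5.0
print(dot(u, v))     # 24
```

For example, perpendicular vectors such as (1, 0) and (0, 1) have dot product 0, so angle_between returns π/2.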

Matrices

A matrix is a rectangular array of numbers. An m×n matrix has m rows and n columns.

Identity matrix: I (1s on diagonal, 0s elsewhere)
Transpose: (Aᵀ)ᵢⱼ = Aⱼᵢ
Symmetric: A = Aᵀ

Special matrices: diagonal, upper/lower triangular, symmetric, orthogonal (QᵀQ = I). The identity matrix I is the multiplicative identity: AI = IA = A for any compatible A, just as 1·x = x for numbers.
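Representing a matrix as a list of rows, the transpose and identity definitions are a few lines each. A sketch (names are illustrative):

```python
def transpose(A):
    # (Aᵀ)ᵢⱼ = Aⱼᵢ: rows become columns
    return [list(col) for col in zip(*A)]

def is_symmetric(A):
    # Symmetric means A = Aᵀ
    return A == transpose(A)

def identity(n):
    # 1s on the diagonal, 0s elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```

For instance, is_symmetric([[1, 2], [2, 3]]) is True, while any matrix with unequal off-diagonal entries fails the check.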

Matrix Operations

  1. Addition: (A + B)ᵢⱼ = Aᵢⱼ + Bᵢⱼ (same dimensions required)
  2. Scalar multiplication: (cA)ᵢⱼ = c·Aᵢⱼ
  3. Matrix multiplication: (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ (inner dimensions must match)
  4. Inverse: AA⁻¹ = A⁻¹A = I (only for square, non-singular matrices)

Matrix multiplication is not commutative: AB ≠ BA in general. However, it is associative: (AB)C = A(BC). Associativity is what makes composition of linear transformations work: applying B then A to a vector x is the single matrix (AB) applied to x.
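The multiplication rule (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ, and the failure of commutativity, can be seen in a short sketch:

```python
def matmul(A, B):
    # (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ; inner dimensions must match
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]  # swaps columns (right mult.) or rows (left mult.)
print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]]  (AB != BA)
```

Multiplying by B on the right swaps A's columns; on the left it swaps A's rows, which is why the two products differ.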

Solving Linear Systems

Linear systems Ax = b can be solved via:

  1. Gaussian elimination (row reduction to echelon form, then back-substitution)
  2. LU decomposition (factor A, then solve two triangular systems)
  3. Matrix inverse: x = A⁻¹b (when A is square and invertible)
  4. Cramer's rule (via determinants; practical only for small systems)

These generalize the substitution and elimination methods for two-variable systems to any number of variables. In regression analysis, the normal equations XᵀXβ = Xᵀy have the solution β̂ = (XᵀX)⁻¹Xᵀy.
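Gaussian elimination can be sketched in a few lines: reduce the augmented matrix [A | b] to upper-triangular form, then back-substitute. This is a teaching sketch with partial pivoting, not a production solver:

```python
def solve(A, b):
    # Solve Ax = b by Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate entries below the pivot
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the upper-triangular system
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```

For the system 2x + y = 3, x + 3y = 4, solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0]) returns x = y = 1.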

Determinants

2×2: det(A) = ad − bc, for A with rows (a, b) and (c, d)
3×3: cofactor expansion along any row/column
Properties: det(AB) = det(A)·det(B), det(Aᵀ) = det(A)

The determinant encodes whether a matrix is invertible (det ≠ 0), the scaling factor of the associated transformation, and the signed volume of the parallelepiped formed by column vectors.
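Cofactor expansion along the first row gives a direct (if exponentially slow) recursive definition, fine for small matrices. A sketch:

```python
def det(A):
    # Determinant by cofactor expansion along the first row
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total
```

For a 2×2 matrix this reduces to ad − bc, e.g. det([[1, 2], [3, 4]]) = −2; a nonzero result confirms the matrix is invertible.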