Orthogonality—the generalization of perpendicularity—is one of the most powerful concepts in linear algebra. Orthonormal bases simplify nearly every computation and provide the foundation for Fourier analysis, quantum mechanics, and signal processing.
A set of vectors $\{v_1, \ldots, v_k\}$ in an inner product space is orthogonal if every pair of distinct vectors is orthogonal:

$$\langle v_i, v_j \rangle = 0 \quad \text{for all } i \neq j.$$

A set $\{e_1, \ldots, e_k\}$ is orthonormal if it is orthogonal and each vector has unit norm:

$$\langle e_i, e_j \rangle = \delta_{ij}.$$

The symbol $\delta_{ij}$ is the Kronecker delta, equal to $1$ when $i = j$ and $0$ otherwise. It compactly expresses the orthonormality condition.
In $\mathbb{R}^n$, the standard basis $\{e_1, \ldots, e_n\}$, where $e_i$ has a $1$ in the $i$-th coordinate and $0$s elsewhere, is orthonormal: $\langle e_i, e_j \rangle = \delta_{ij}$.
The set $\{(1, 1), (1, -1)\}$ in $\mathbb{R}^2$ is orthogonal (the inner product is $1 \cdot 1 + 1 \cdot (-1) = 0$) but not orthonormal (each vector has length $\sqrt 2$, not $1$).

To make it orthonormal, normalize each vector: $\frac{1}{\sqrt 2}(1, 1)$ and $\frac{1}{\sqrt 2}(1, -1)$.
An orthogonal set of nonzero vectors is linearly independent.
Suppose $c_1 v_1 + \cdots + c_k v_k = 0$ for orthogonal nonzero $v_1, \ldots, v_k$.

Take the inner product with $v_j$:

$$0 = \langle c_1 v_1 + \cdots + c_k v_k,\; v_j \rangle = c_j \|v_j\|^2.$$

Since $v_j \neq 0$, we have $\|v_j\|^2 > 0$, so $c_j = 0$. This holds for all $j$.
An orthogonal set of nonzero vectors in an $n$-dimensional space has at most $n$ vectors.
The set $\{(1, 1, 1),\, (1, -1, 0),\, (1, 1, -2)\}$ is orthogonal: all three pairwise inner products are $0$.

To make it orthonormal, normalize: $\frac{1}{\sqrt 3}(1, 1, 1)$, $\frac{1}{\sqrt 2}(1, -1, 0)$, $\frac{1}{\sqrt 6}(1, 1, -2)$.
If $\{v_1, \ldots, v_k\}$ is orthogonal and $v = c_1 v_1 + \cdots + c_k v_k$, then:

$$c_j = \frac{\langle v, v_j \rangle}{\|v_j\|^2}.$$

Taking the inner product of $v$ with $v_j$:

$$\langle v, v_j \rangle = c_j \langle v_j, v_j \rangle = c_j \|v_j\|^2,$$

since $\langle v_i, v_j \rangle = 0$ for $i \neq j$.

For orthonormal sets, the denominator $\|e_j\|^2 = 1$, so $c_j = \langle v, e_j \rangle$. This is why orthonormal sets are preferred for computation.
On $C[-1, 1]$ with $\langle f, g \rangle = \int_{-1}^{1} f(x) g(x)\, dx$, show $f(x) = 1$ and $g(x) = x$ are orthogonal:

$$\langle 1, x \rangle = \int_{-1}^{1} x\, dx = 0.$$

But $1$ and $x^2$ are NOT orthogonal:

$$\langle 1, x^2 \rangle = \int_{-1}^{1} x^2\, dx = \frac{2}{3} \neq 0.$$
If $v_1, \ldots, v_k$ are pairwise orthogonal:

$$\|v_1 + \cdots + v_k\|^2 = \|v_1\|^2 + \cdots + \|v_k\|^2.$$

Expand the norm squared:

$$\left\| \sum_i v_i \right\|^2 = \sum_{i, j} \langle v_i, v_j \rangle = \sum_i \|v_i\|^2,$$

since $\langle v_i, v_j \rangle = 0$ for $i \neq j$.
For orthogonal vectors $u = (3, 0)$ and $v = (0, 4)$:

$$\|u + v\|^2 = \|u\|^2 + \|v\|^2 = 9 + 16 = 25, \quad \text{so } \|u + v\| = 5.$$

This is the classic 3-4-5 right triangle!
A square matrix $Q$ is orthogonal if its columns form an orthonormal set, equivalently:

$$Q^{\mathsf T} Q = I, \quad \text{i.e., } Q^{-1} = Q^{\mathsf T}.$$

For complex matrices, the analogous concept is unitary: $U^* U = I$.
The 2D rotation matrix is orthogonal:

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$

Verify:

$$R_\theta^{\mathsf T} R_\theta = \begin{pmatrix} \cos^2\theta + \sin^2\theta & 0 \\ 0 & \sin^2\theta + \cos^2\theta \end{pmatrix} = I.$$
Orthogonal matrices preserve inner products ($\langle Qu, Qv \rangle = \langle u, v \rangle$), and hence norms, distances, and angles.
They represent rotations and reflections—rigid motions of space.
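To make this concrete, here is a minimal NumPy sketch (the angle and test vectors are arbitrary illustrative choices, not values from the text) checking that a rotation matrix has orthonormal columns and preserves norms and inner products:

```python
import numpy as np

theta = 0.7  # an arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns are orthonormal: Q^T Q should be the identity.
print(np.allclose(Q.T @ Q, np.eye(2)))   # True

# Rotations preserve norms and inner products.
u, v = np.array([3.0, 4.0]), np.array([-1.0, 2.0])
print(np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u)))  # True
print(np.isclose((Q @ u) @ (Q @ v), u @ v))                  # True
```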
If $Q$ is orthogonal, then $\det Q = \pm 1$:

$$(\det Q)^2 = \det(Q^{\mathsf T}) \det(Q) = \det(Q^{\mathsf T} Q) = \det I = 1.$$

If $\det Q = 1$, $Q$ is a rotation. If $\det Q = -1$, $Q$ includes a reflection.
The reflection across the line $y = x$ in $\mathbb{R}^2$:

$$F = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$

Verify: $F^{\mathsf T} F = I$, $\det F = -1$ (reflection, not rotation).
A complex matrix $U$ is unitary if $U^* U = I$, where $U^* = \overline{U}^{\mathsf T}$ is the conjugate transpose.
Unitary matrices are the complex analog of orthogonal matrices.
The matrix $U = \frac{1}{\sqrt 2}\begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix}$ is unitary: direct computation gives $U^* U = I$.
The set of $n \times n$ orthogonal matrices forms a group $O(n)$. The subgroup with $\det Q = 1$ is $SO(n)$ (special orthogonal group, pure rotations).
An orthonormal basis is an orthonormal set that is also a basis (spans the space).
Every finite-dimensional inner product space has an orthonormal basis.
The Gram-Schmidt process (next module) provides an explicit algorithm to construct orthonormal bases from any basis.
The set $\left\{ \frac{1}{\sqrt 2}(1, 1),\; \frac{1}{\sqrt 2}(1, -1) \right\}$ is an orthonormal basis for $\mathbb{R}^2$: the vectors are orthogonal, have unit length, and span the space.
Let $\{e_1, \ldots, e_n\}$ be an orthonormal basis. For any $v$:

$$v = \sum_{i=1}^{n} \langle v, e_i \rangle e_i.$$

Since $\{e_i\}$ is a basis, $v = \sum_i c_i e_i$ for unique scalars $c_i$.

Taking the inner product with $e_k$:

$$\langle v, e_k \rangle = \sum_i c_i \langle e_i, e_k \rangle = c_k.$$
With an orthonormal basis, finding coefficients is trivial—just compute inner products! For a general basis, you'd need to solve a linear system (invert a matrix).
Express $v = (3, 1)$ in the orthonormal basis $\{e_1, e_2\}$ where $e_1 = \frac{1}{\sqrt 2}(1, 1)$, $e_2 = \frac{1}{\sqrt 2}(1, -1)$:

$$c_1 = \langle v, e_1 \rangle = \frac{4}{\sqrt 2} = 2\sqrt 2, \qquad c_2 = \langle v, e_2 \rangle = \frac{2}{\sqrt 2} = \sqrt 2.$$

Verify: $2\sqrt 2\, e_1 + \sqrt 2\, e_2 = (2, 2) + (1, -1) = (3, 1)$ ✓
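A short NumPy check of this computation (a sketch using the vectors from the example above):

```python
import numpy as np

# Orthonormal basis of R^2 from the example above.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([1.0, -1.0]) / np.sqrt(2)
v = np.array([3.0, 1.0])

# Fourier coefficients are just inner products -- no system to solve.
c1, c2 = v @ e1, v @ e2
print(c1, c2)                              # 2*sqrt(2), sqrt(2)

# Reconstruction: v = c1*e1 + c2*e2.
print(np.allclose(c1 * e1 + c2 * e2, v))   # True
```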
The set $\left\{ e_1 = \frac{1}{\sqrt 2},\; e_2 = \sqrt{\tfrac{3}{2}}\, x \right\}$ is orthonormal in $C[-1, 1]$: $\langle e_1, e_2 \rangle = 0$ by odd symmetry, and each has unit norm.

Express $f(x) = 2 + 3x$: $c_1 = \langle f, e_1 \rangle = 2\sqrt 2$ and $c_2 = \langle f, e_2 \rangle = \sqrt 6$, so $f = 2\sqrt 2\, e_1 + \sqrt 6\, e_2$.
If $\{e_1, \ldots, e_n\}$ is orthonormal, its Gram matrix is the identity:

$$G_{ij} = \langle e_i, e_j \rangle = \delta_{ij}, \quad \text{so } G = I.$$

The matrix $Q$ whose columns are the orthonormal basis vectors satisfies $Q^{\mathsf T} Q = I$. To change from standard coordinates to orthonormal coordinates: $[v]_{\mathcal B} = Q^{\mathsf T} v$.
With the orthonormal basis $\{e_1, e_2\}$ above as the columns of $Q$, transforming $v = (3, 1)$ gives $[v]_{\mathcal B} = Q^{\mathsf T} v = (2\sqrt 2, \sqrt 2)$, matching the Fourier coefficients.
If $\{e_1, \ldots, e_n\}$ is orthonormal and $T$ is a linear operator, the matrix representation $A$ has entries:

$$A_{ij} = \langle T e_j, e_i \rangle.$$

In orthonormal coordinates, the adjoint $T^*$ has matrix $A^*$ (conjugate transpose). $T$ is self-adjoint iff $A = A^*$ (Hermitian).
Transform $v$ from the standard basis to the orthonormal basis $\{e_1, e_2\}$ where $e_1 = \frac{1}{\sqrt 2}(1, 1)$, $e_2 = \frac{1}{\sqrt 2}(1, -1)$:

For $v = (2, 4)$:

$$[v]_{\mathcal B} = Q^{\mathsf T} v = \left( \langle v, e_1 \rangle,\; \langle v, e_2 \rangle \right) = \left( 3\sqrt 2,\; -\sqrt 2 \right).$$
Every vector has a unique representation in any orthonormal basis. The coefficients are determined entirely by and the basis.
| Property | General Basis | Orthonormal Basis |
|---|---|---|
| Find coefficients | Solve linear system | Inner products |
| Gram matrix | G (positive definite) | I (identity) |
| Norm formula | √(cᵀGc) | √(Σ\|cᵢ\|²) |
| Inner product | aᵀGb | Σaᵢb̄ᵢ |
For orthonormal basis $\{e_i\}$ and $v = \sum_i c_i e_i$:

$$\|v\|^2 = \sum_i |c_i|^2 = \sum_i |\langle v, e_i \rangle|^2.$$
Parseval's identity says the norm of a vector equals the norm of its coefficient vector. Orthonormal bases preserve distances!
For any orthonormal set $\{e_1, \ldots, e_k\}$ (not necessarily a basis) and any vector $v$:

$$\sum_{i=1}^{k} |\langle v, e_i \rangle|^2 \leq \|v\|^2.$$

Let $r = v - \sum_i \langle v, e_i \rangle e_i$ (the residual after projecting onto $\operatorname{span}\{e_i\}$).

Then $\langle r, e_j \rangle = 0$ for all $j$, so $r$ is orthogonal to each $e_j$. By the Pythagorean theorem, $\|v\|^2 = \|r\|^2 + \sum_i |\langle v, e_i \rangle|^2 \geq \sum_i |\langle v, e_i \rangle|^2$.

Bessel's inequality becomes Parseval's equality iff $\{e_i\}$ is a complete orthonormal basis.
In $\mathbb{R}^3$, let $\{e_1, e_2\} = \{(1, 0, 0), (0, 1, 0)\}$ (not a basis, just 2 vectors).

For $v = (1, 2, 3)$: $|\langle v, e_1 \rangle|^2 + |\langle v, e_2 \rangle|^2 = 1 + 4 = 5$.

Bessel: $5 \leq \|v\|^2 = 14$ ✓
For $v = (1, 2, 3)$ in $\mathbb{R}^3$ with the standard orthonormal basis: $c_i = v_i$, so $\sum_i |c_i|^2 = 1 + 4 + 9 = 14$.

Parseval holds: $\|v\|^2 = 14$ ✓
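The following NumPy sketch contrasts Bessel and Parseval on the example vectors above (illustrative values only):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
e1, e2 = np.array([1.0, 0, 0]), np.array([0, 1.0, 0])

# Bessel: a partial orthonormal set captures at most ||v||^2 of "energy".
partial = (v @ e1)**2 + (v @ e2)**2
print(partial, v @ v)              # 5.0 <= 14.0

# Parseval: with the full standard basis, the sums agree exactly.
full = sum((v @ e)**2 for e in np.eye(3))
print(np.isclose(full, v @ v))     # True
```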
For an orthonormal basis $\{e_1, e_2\}$ with $c_1 = 3$ and $c_2 = 4$: $\|v\|^2 = 9 + 16 = 25$, so $\|v\| = 5$.
In signal processing, Parseval's identity says the total energy in time domain equals total energy in frequency domain. The Fourier coefficients represent energy at each frequency.
For a function $f \in L^2[-\pi, \pi]$ with Fourier coefficients $c_n = \langle f, e_n \rangle$ relative to the orthonormal basis $e_n(x) = e^{inx}/\sqrt{2\pi}$:

$$\int_{-\pi}^{\pi} |f(x)|^2\, dx = \sum_{n=-\infty}^{\infty} |c_n|^2.$$

This is Parseval for the Fourier orthonormal basis.
Let $\{e_1, \ldots, e_k\}$ be orthonormal. The vector $\hat v = \sum_{i=1}^{k} \langle v, e_i \rangle e_i$ is the best approximation to $v$ in $W = \operatorname{span}\{e_1, \ldots, e_k\}$:

$$\|v - \hat v\| \leq \|v - w\| \quad \text{for all } w \in W.$$

For any $w = \sum_i a_i e_i$ in the span:

$$\|v - w\|^2 = \|v\|^2 - \sum_i |\langle v, e_i \rangle|^2 + \sum_i |a_i - \langle v, e_i \rangle|^2.$$

This is minimized when $a_i = \langle v, e_i \rangle$, giving $w = \hat v$.

The best approximation $\hat v$ is the orthogonal projection of $v$ onto the subspace spanned by $\{e_i\}$. The residual $v - \hat v$ is orthogonal to the subspace.
Approximate $v = (1, 2, 3)$ using only $e_1 = (1, 0, 0)$, $e_2 = (0, 1, 0)$:

$$\hat v = \langle v, e_1 \rangle e_1 + \langle v, e_2 \rangle e_2 = (1, 2, 0).$$

Error by Bessel: $\|v - \hat v\|^2 = \|v\|^2 - (1 + 4) = 14 - 5 = 9$, so the error is $3$.
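Here is a small NumPy sketch of best approximation by projection, using the same vectors as the example (illustrative code, not a library API):

```python
import numpy as np

# Project v onto span{e1, e2}; the projection is the best approximation.
v = np.array([1.0, 2.0, 3.0])
E = np.eye(3)[:2]                  # rows e1 = (1,0,0), e2 = (0,1,0)

coeffs = E @ v                     # inner products <v, e_i>
v_hat = coeffs @ E                 # sum of c_i * e_i  -> (1, 2, 0)
residual = v - v_hat

# The residual is orthogonal to the subspace, and the error matches Bessel.
print(np.allclose(E @ residual, 0))   # True
print(np.linalg.norm(residual))       # 3.0
```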
For an orthonormal basis $\{e_i\}$ with $u = \sum_i a_i e_i$, $v = \sum_i b_i e_i$:

$$\langle u, v \rangle = \sum_i a_i \overline{b_i}.$$

This extends Parseval from norms to general inner products.
Best approximation via orthogonal projection is the mathematical foundation of least squares: to "solve" an inconsistent system $Ax = b$, minimize $\|Ax - b\|$ by projecting $b$ orthogonally onto the column space of $A$.
For orthonormal basis $\{e_i\}$, if $u = \sum_i a_i e_i$ and $v = \sum_i b_i e_i$:

$$\langle u, v \rangle = \sum_i a_i \overline{b_i}.$$
In orthonormal coordinates, the inner product becomes the standard dot product! This is why orthonormal bases are so convenient—they reduce all inner products to simple sums.
In an orthonormal basis, if $a$ and $b$ are the coefficient vectors of $u$ and $v$: $\langle u, v \rangle = \sum_i a_i \overline{b_i}$, the standard dot product of the coefficient vectors.

In $\mathbb{R}^3$ with the standard orthonormal basis, for $u = (1, 2, 3)$ and $v = (4, 5, 6)$: $\langle u, v \rangle = 1 \cdot 4 + 2 \cdot 5 + 3 \cdot 6 = 32$.
Orthonormal coordinates preserve norms: if $v = \sum_i c_i e_i$, then:

$$\|v\| = \sqrt{\sum_i |c_i|^2}.$$

The norm of $v$ equals the Euclidean norm of its coefficient vector.

The coordinate map is an isometry (distance-preserving map) between the inner product space and $\mathbb{R}^n$ (or $\mathbb{C}^n$).

Distance between $u = \sum_i a_i e_i$ and $v = \sum_i b_i e_i$ (orthonormal $\{e_i\}$):

$$d(u, v) = \|u - v\| = \sqrt{\sum_i |a_i - b_i|^2}.$$

In orthonormal coordinates, the angle formula simplifies:

$$\cos\theta = \frac{\langle u, v \rangle}{\|u\|\,\|v\|} = \frac{\sum_i a_i b_i}{\sqrt{\sum_i a_i^2}\,\sqrt{\sum_i b_i^2}}.$$
Find the angle between $u = (1, 0)$ and $v = (1, 1)$ in the standard orthonormal basis:

$$\cos\theta = \frac{\langle u, v \rangle}{\|u\|\,\|v\|} = \frac{1}{1 \cdot \sqrt 2} = \frac{1}{\sqrt 2}.$$

So $\theta = \pi/4$ (that is, $45°$).
In machine learning, cosine similarity is defined as:

$$\operatorname{sim}(u, v) = \frac{\langle u, v \rangle}{\|u\|\,\|v\|}.$$
This measures direction similarity regardless of magnitude, ranging from -1 (opposite) to +1 (same direction).
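A minimal implementation sketch (the function name `cosine_similarity` is our own choice, not a standard API):

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between u and v: <u, v> / (||u|| ||v||)."""
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
print(cosine_similarity(a, b))    # 0.7071... = 1/sqrt(2), a 45 degree angle
print(cosine_similarity(a, -a))   # -1.0: opposite directions
```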
The map $T: V \to \mathbb{R}^n$ defined by $T(v) = (\langle v, e_1 \rangle, \ldots, \langle v, e_n \rangle)$ is a linear isometry: it is linear and satisfies $\langle T(u), T(v) \rangle = \langle u, v \rangle$, hence $\|T(v)\| = \|v\|$.
Any orthonormal set in a finite-dimensional inner product space can be extended to an orthonormal basis.
Let $\{e_1, \ldots, e_k\}$ be orthonormal with $k < n = \dim V$.

Since $\{e_1, \ldots, e_k\}$ is linearly independent, extend it to a basis $\{e_1, \ldots, e_k, w_{k+1}, \ldots, w_n\}$.
Apply Gram-Schmidt to the new vectors, orthogonalizing against all previous ones.
Start with $\{e_1\} = \{(1, 0, 0)\}$ in $\mathbb{R}^3$. This is already unit length.

We can extend to $\{e_1, e_2, e_3\}$ where $e_2 = (0, 1, 0)$ and $e_3 = (0, 0, 1)$.
Any orthonormal set with $n$ vectors in an $n$-dimensional space is automatically a basis.
Extend $\left\{ \frac{1}{\sqrt 3}(1, 1, 1) \right\}$ to an orthonormal basis of $\mathbb{R}^3$.

Step 1: Find a vector orthogonal to $(1, 1, 1)$, e.g., $e_2 = \frac{1}{\sqrt 2}(1, -1, 0)$.

Step 2: Add $e_3 = \frac{1}{\sqrt 6}(1, 1, -2)$, which is orthogonal to both.
Verify all pairwise orthogonality and unit norms.
If $\{e_1, \ldots, e_k\}$ is an orthonormal basis for a subspace $W$, extending it to an orthonormal basis $\{e_1, \ldots, e_n\}$ of the whole space gives:

$$W^\perp = \operatorname{span}\{e_{k+1}, \ldots, e_n\}.$$

Any $v \in W^\perp$ is orthogonal to $e_1, \ldots, e_k$, so its expansion $v = \sum_i \langle v, e_i \rangle e_i$ involves only $e_{k+1}, \ldots, e_n$.

Conversely, any linear combination of $e_{k+1}, \ldots, e_n$ is orthogonal to $W$.
In $\mathbb{R}^3$, let $W = \operatorname{span}\{(1, 0, 0)\}$. Extend to the basis $\{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}$:

Then $W^\perp = \operatorname{span}\{(0, 1, 0), (0, 0, 1)\}$ (the yz-plane).
For a subspace $W$ of $V$:

$$\dim W + \dim W^\perp = \dim V.$$

Every finite-dimensional inner product space decomposes as:

$$V = W \oplus W^\perp.$$

Every $v \in V$ can be uniquely written as $v = w + w^\perp$ with $w \in W$, $w^\perp \in W^\perp$.
Let $W = \operatorname{span}\{(1, 1)\} \subset \mathbb{R}^2$. Then $W^\perp$ consists of vectors $(x, y)$ with $x + y = 0$:

$$W^\perp = \operatorname{span}\{(1, -1)\}.$$

Decompose $v = (3, 1)$: $w = \frac{\langle v, (1, 1) \rangle}{\|(1, 1)\|^2}(1, 1) = (2, 2)$ and $w^\perp = v - w = (1, -1)$.
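A NumPy sketch of this decomposition (values taken from the example above):

```python
import numpy as np

# Decompose v = w + w_perp with respect to W = span{(1, 1)}.
v = np.array([3.0, 1.0])
u = np.array([1.0, 1.0])           # spans W (not unit length)

w = (v @ u) / (u @ u) * u          # projection of v onto W -> (2, 2)
w_perp = v - w                     # component in W-perp    -> (1, -1)

print(w, w_perp)
print(np.isclose(w @ w_perp, 0))   # True: the two parts are orthogonal
```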
For any subspace $W$:

$$(W^\perp)^\perp = W.$$

Clearly $W \subseteq (W^\perp)^\perp$. By dimension counting:

$$\dim (W^\perp)^\perp = \dim V - \dim W^\perp = \dim W.$$

So $(W^\perp)^\perp = W$.
For subspaces $U, W$:

$$(U + W)^\perp = U^\perp \cap W^\perp, \qquad (U \cap W)^\perp = U^\perp + W^\perp.$$
These are analogs of De Morgan's laws for subspaces.
On $L^2[-\pi, \pi]$, the set $\left\{ \frac{1}{\sqrt{2\pi}} \right\} \cup \left\{ \frac{\cos nx}{\sqrt \pi},\, \frac{\sin nx}{\sqrt \pi} : n \geq 1 \right\}$ is orthonormal.

This leads to Fourier series: any function $f \in L^2[-\pi, \pi]$ can be expanded as:

$$f(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos nx + b_n \sin nx \right).$$
On $L^2[-1, 1]$, the Legendre polynomials $P_0 = 1$, $P_1 = x$, $P_2 = \frac{1}{2}(3x^2 - 1), \ldots$ (properly normalized as $\tilde P_n = \sqrt{\frac{2n+1}{2}}\, P_n$) form an orthonormal basis.

Different inner products (weight functions) give different orthogonal polynomial families: Legendre (weight $1$ on $[-1, 1]$), Chebyshev (weight $\frac{1}{\sqrt{1 - x^2}}$ on $[-1, 1]$), Hermite (weight $e^{-x^2}$ on $\mathbb{R}$), and Laguerre (weight $e^{-x}$ on $[0, \infty)$).
Show $P_1(x) = x$ and $P_2(x) = \frac{1}{2}(3x^2 - 1)$ are orthogonal on $[-1, 1]$:

$$\int_{-1}^{1} x \cdot \frac{1}{2}(3x^2 - 1)\, dx = 0,$$

since the integrand is an odd function integrated over a symmetric interval.
For $f(x) = x$ on $[-\pi, \pi]$, the Fourier sine coefficients are:

$$b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} x \sin nx\, dx = \frac{2(-1)^{n+1}}{n}.$$

So $x = 2 \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} \sin nx$ for $-\pi < x < \pi$.
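To check the closed form numerically, one can integrate with SciPy (a sketch assuming SciPy is installed; the loop bound is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import quad

# Check b_n = (1/pi) * integral of x*sin(nx) over [-pi, pi]
# against the closed form 2*(-1)^(n+1)/n.
for n in range(1, 5):
    b_n, _ = quad(lambda x: x * np.sin(n * x) / np.pi, -np.pi, np.pi)
    print(n, b_n, 2 * (-1)**(n + 1) / n)   # the two values agree
```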
The Fourier system is complete in $L^2[-\pi, \pi]$: for any $f$ with $\int_{-\pi}^{\pi} |f|^2\, dx < \infty$, the partial Fourier sums converge to $f$ in norm:

$$\left\| f - \sum_{|n| \leq N} \langle f, e_n \rangle e_n \right\| \to 0 \quad \text{as } N \to \infty.$$
Approximate $f(x) = e^x$ on $[-1, 1]$ using Legendre polynomials:

$$f \approx \sum_{n=0}^{N} c_n \tilde P_n, \quad \text{where } c_n = \langle f, \tilde P_n \rangle = \int_{-1}^{1} e^x \tilde P_n(x)\, dx.$$
Classical orthogonal polynomials satisfy a three-term recurrence relation:

$$p_{n+1}(x) = (A_n x + B_n)\, p_n(x) - C_n\, p_{n-1}(x).$$
This allows efficient computation without explicit integration.
Legendre polynomials satisfy:

$$(n+1)\, P_{n+1}(x) = (2n+1)\, x\, P_n(x) - n\, P_{n-1}(x).$$

With $P_0 = 1$, $P_1 = x$: the recurrence gives $P_2 = \frac{1}{2}(3x^2 - 1)$, $P_3 = \frac{1}{2}(5x^3 - 3x)$, and so on.
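A sketch of this recurrence in NumPy (the helper `legendre` is our own name; SciPy also ships these polynomials, e.g. in `scipy.special`):

```python
import numpy as np

def legendre(n: int, x: np.ndarray) -> np.ndarray:
    """Evaluate P_n(x) via the three-term recurrence
    (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}, starting from P_0 = 1, P_1 = x."""
    p_prev, p = np.ones_like(x), x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

x = np.linspace(-1, 1, 5)
print(legendre(2, x))              # matches (3x^2 - 1)/2
print((3 * x**2 - 1) / 2)
```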
In infinite-dimensional Hilbert spaces: an orthonormal basis is an orthonormal set whose span is dense (countable in the separable case), every vector expands as $v = \sum_i \langle v, e_i \rangle e_i$ with convergence in norm, and Parseval's identity extends to infinite sums.
Wavelet bases provide orthonormal systems with localization in both time and frequency; the Haar basis is the simplest example.
Used in signal processing, image compression (JPEG 2000), and denoising.
In a Hilbert space $H$, every continuous linear functional $\varphi$ has the form:

$$\varphi(v) = \langle v, w \rangle$$

for a unique $w \in H$. This is the Riesz representation theorem: it connects functionals to vectors via the inner product.
Orthogonal = perpendicular. Orthonormal = perpendicular AND unit length. Always normalize!
⟨0, v⟩ = 0 for all v, but 0 cannot be in an orthogonal set (we require nonzero vectors for linear independence).
Parseval (equality) only holds for complete orthonormal bases. For partial sets, use Bessel (inequality).
cₖ = ⟨v, eₖ⟩, NOT ⟨eₖ, v⟩ (matters for complex spaces due to conjugate symmetry).
For orthogonal (not orthonormal) sets, the coefficient formula has a denominator: cⱼ = ⟨v, vⱼ⟩/||vⱼ||².
To verify orthogonality of n vectors, check all n(n-1)/2 pairs. Missing one pair invalidates the whole set.
An orthogonal matrix has orthonormal columns (not just orthogonal). The term "orthogonal matrix" is slightly misleading.
Parseval requires the orthonormal set to be a basis for the ENTIRE space, not just a subspace.
Before using orthonormal basis properties:
| Theorem | Statement | Condition |
|---|---|---|
| Linear Independence | Orthogonal nonzero → independent | Vectors nonzero |
| Fourier Coefficients | cₖ = ⟨v, eₖ⟩ | Orthonormal basis |
| Parseval | \|\|v\|\|² = Σ\|cᵢ\|² | Complete ON basis |
| Bessel | Σ\|⟨v,eᵢ⟩\|² ≤ \|\|v\|\|² | Any ON set |
| Extension | ON set → ON basis | Finite dimension |
Orthonormal bases are the "natural" coordinate systems for inner product spaces: coefficients, norms, inner products, distances, and angles all reduce to elementary formulas in the coefficients.

If $Q$ is the matrix with the orthonormal basis vectors as columns, then $Q^{\mathsf T} Q = I$ and:

$$[v]_{\mathcal B} = Q^{\mathsf T} v, \qquad v = Q\, [v]_{\mathcal B}.$$

Coordinate change is simply matrix-vector multiplication by $Q^{\mathsf T}$.
Every matrix $A$ with linearly independent columns can be written as $A = QR$, where $Q$ has orthonormal columns and $R$ is upper triangular and invertible.

This is the Gram-Schmidt process in matrix form!
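In NumPy this factorization is available directly; a minimal sketch (the matrix `A` is an arbitrary example):

```python
import numpy as np

# Any full-column-rank A factors as A = QR with orthonormal columns in Q.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)             # "reduced" QR: Q is 3x2, R is 2x2

print(np.allclose(Q.T @ Q, np.eye(2)))   # True: columns of Q are orthonormal
print(np.allclose(Q @ R, A))             # True: the factorization reproduces A
print(np.allclose(np.triu(R), R))        # True: R is upper triangular
```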
The Fourier orthonormal basis transforms signals between time and frequency domains: the coefficient $\langle f, e_n \rangle$ measures how much of the signal oscillates at frequency $n$.
Quantum states live in Hilbert spaces with orthonormal bases: a state $|\psi\rangle = \sum_n c_n |n\rangle$ yields outcome $n$ with probability $|c_n|^2$, and the normalization $\sum_n |c_n|^2 = 1$ is Parseval's identity.
Orthogonal transformations are preferred for numerical stability: they preserve norms, so they do not amplify rounding errors (their condition number is $1$).
The DCT (Discrete Cosine Transform) uses an orthonormal cosine basis for JPEG:

$$X_k = \alpha_k \sum_{n=0}^{N-1} x_n \cos\!\left[ \frac{\pi}{N}\left( n + \tfrac{1}{2} \right) k \right].$$

High-frequency coefficients (large $k$) are often small and can be discarded.
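A sketch using SciPy's orthonormal DCT (`norm='ortho'`); the signal and the threshold are arbitrary illustrative choices:

```python
import numpy as np
from scipy.fft import dct, idct

x = np.array([8.0, 6.0, 7.0, 5.0, 3.0, 0.0, 9.0, 2.0])

# Orthonormal DCT-II: with norm='ortho' the transform preserves energy.
X = dct(x, norm='ortho')
print(np.isclose(np.sum(x**2), np.sum(X**2)))   # True (Parseval)

# Crude compression: zero out the smallest coefficients, then invert.
X_compressed = np.where(np.abs(X) > 1.0, X, 0.0)
x_approx = idct(X_compressed, norm='ortho')
print(np.round(x_approx, 2))                    # close to the original signal
```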
Orthonormal transformations are fundamental in 3D graphics: rotating a model must preserve its shape, i.e., all lengths and angles.

Rotation by $\theta$ around the z-axis:

$$R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

Columns form an orthonormal basis; $R_z^{\mathsf T} R_z = I$, $\det R_z = 1$.
Orthogonality appears across mathematics and applications for three main reasons: (1) coefficients are trivial to compute as inner products, with no matrix inversion needed; (2) the Gram matrix is the identity, simplifying all computations; (3) Parseval's identity gives ||v||² = Σ|cᵢ|², preserving norms.
**Can every basis be converted to an orthonormal one?** Yes! The Gram-Schmidt process (next section) converts any basis to an orthonormal one spanning the same space. Every finite-dimensional inner product space has an orthonormal basis.

**What is the difference between orthogonal and orthonormal?** Orthogonal: pairwise inner products are zero (⟨eᵢ,eⱼ⟩ = 0 for i≠j). Orthonormal: additionally, each vector has unit length (||eᵢ|| = 1). Orthonormal = orthogonal + normalized.

**What are Fourier coefficients?** For orthonormal basis {eᵢ}, the Fourier coefficient cᵢ = ⟨v, eᵢ⟩ is the component of v along eᵢ. The expansion v = Σcᵢeᵢ is called the Fourier expansion. In function spaces, this gives Fourier series.

**What does Bessel's inequality say?** Bessel: Σ|⟨v,eᵢ⟩|² ≤ ||v||² for any orthonormal set. It becomes Parseval's equality when {eᵢ} is a complete orthonormal basis spanning the space.

**Why are orthogonal nonzero vectors linearly independent?** If Σcᵢeᵢ = 0, take the inner product with eⱼ: cⱼ⟨eⱼ,eⱼ⟩ = 0 (other terms vanish by orthogonality). Since eⱼ ≠ 0, we have ||eⱼ||² > 0, so cⱼ = 0. This works for all j.

**How many mutually orthogonal vectors can ℝⁿ contain?** At most n vectors can be mutually orthogonal in ℝⁿ (since orthogonal nonzero vectors are linearly independent, and dim(ℝⁿ) = n). An orthonormal basis achieves this maximum.

**What does a Fourier coefficient mean geometrically?** The Fourier coefficient ⟨v, eᵢ⟩ is the signed length of v's projection onto eᵢ. It tells you 'how much of v is in the eᵢ direction.' The vector v is reconstructed by summing these projections.

**Do orthonormal bases exist in infinite dimensions?** Yes, but with subtleties. In separable Hilbert spaces, orthonormal bases exist and are countable. The Fourier basis {eⁱⁿˣ/√2π} is an orthonormal basis for L²[-π,π]. Parseval's identity extends to infinite sums.

**How do I verify that a set is orthonormal?** Check two things: (1) Each pair has zero inner product: ⟨eᵢ,eⱼ⟩ = 0 for i≠j. (2) Each vector has unit norm: ||eᵢ|| = 1. Equivalently, form the Gram matrix G with Gᵢⱼ = ⟨eᵢ,eⱼ⟩ and verify G = I.
**Gram-Schmidt Orthogonalization:** The algorithm to construct orthonormal bases from any basis. Converts any linearly independent set to an orthonormal one.

**Orthogonal Projections:** Project vectors onto subspaces using orthonormal bases. Foundation for least squares and best approximation.

**The Spectral Theorem:** Self-adjoint operators have orthonormal eigenbases. The culmination of inner product space theory.

**Singular Value Decomposition:** Uses two orthonormal bases, one for the domain and one for the codomain. The most important matrix factorization.
Let $v_1 = (1, 1, 0)$ and $v_2 = (1, -1, 0)$. Verify orthogonality and normalize:

Solution: $\langle v_1, v_2 \rangle = 1 - 1 + 0 = 0$ ✓, and $\|v_1\| = \|v_2\| = \sqrt 2$.

Orthonormal: $e_1 = \frac{1}{\sqrt 2}(1, 1, 0)$, $e_2 = \frac{1}{\sqrt 2}(1, -1, 0)$.
Express $v = (2, 4, 3)$ in the orthonormal basis from above, plus $e_3 = (0, 0, 1)$:

Solution: $c_1 = \langle v, e_1 \rangle = \frac{6}{\sqrt 2} = 3\sqrt 2$, $c_2 = \langle v, e_2 \rangle = \frac{-2}{\sqrt 2} = -\sqrt 2$, $c_3 = \langle v, e_3 \rangle = 3$.

So $v = 3\sqrt 2\, e_1 - \sqrt 2\, e_2 + 3 e_3$. Verify Parseval: $18 + 2 + 9 = 29 = \|v\|^2$ ✓
Show that $\{1, x, x^2\}$ is NOT orthogonal on $C[-1, 1]$ with the standard inner product:

Solution: $\langle 1, x^2 \rangle = \int_{-1}^{1} x^2\, dx = \frac{2}{3} \neq 0$.
Not orthogonal! The Legendre polynomials are the orthogonal version.
Given an orthonormal basis $\{e_1, e_2, e_3\}$ and $v = 2e_1 - e_2 + 2e_3$, compute $\|v\|$:

Solution: By Parseval, $\|v\|^2 = 2^2 + (-1)^2 + 2^2 = 9$.

So $\|v\| = 3$.
Find $\langle u, v \rangle$ for $u = e_1 + 2e_2$, $v = 3e_1 - e_2$ (orthonormal basis):

Solution: $\langle u, v \rangle = (1)(3) + (2)(-1) = 1$.
After this module, you should be able to: verify that a set is orthogonal or orthonormal; compute Fourier coefficients via inner products; apply Parseval's identity and Bessel's inequality; recognize orthogonal and unitary matrices; extend orthonormal sets to orthonormal bases; and work with orthogonal complements and the decomposition $V = W \oplus W^\perp$.
Orthogonality concepts developed through several mathematical threads: Fourier's expansion of functions in trigonometric series in the early 1800s, the orthogonalization procedures associated with Gram and Schmidt, and Hilbert's theory of infinite-dimensional inner product spaces, which underpins quantum mechanics.
| Task | Method |
|---|---|
| Find coefficient cₖ | Compute ⟨v, eₖ⟩ |
| Find \|\|v\|\| | Use √(Σ\|cₖ\|²) (Parseval) |
| Find ⟨u, v⟩ | Compute Σaₖb̄ₖ |
| Verify orthonormality | Check ⟨eᵢ, eⱼ⟩ = δᵢⱼ for all i, j |
| Find projection | Compute Σ⟨v, eₖ⟩eₖ |