The spectral theorem is one of the crown jewels of linear algebra. It states that self-adjoint operators (symmetric matrices) can be orthogonally diagonalized—they have an orthonormal basis of eigenvectors with real eigenvalues. This has profound implications for physics, statistics, and engineering.
Let $T$ be a linear operator on an inner product space $V$. The adjoint $T^*$ is the unique operator satisfying:

$$\langle Tv, w \rangle = \langle v, T^*w \rangle \quad \text{for all } v, w \in V.$$

For matrices with the standard inner product: $A^* = \bar{A}^T$ (the conjugate transpose).
An operator is self-adjoint (or Hermitian) if:

$$T^* = T.$$

Equivalently: $\langle Tv, w \rangle = \langle v, Tw \rangle$ for all $v, w \in V$.
A real matrix is self-adjoint iff $A^T = A$ (symmetric).
For example, $\begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}$ is symmetric, hence self-adjoint.
A complex matrix is Hermitian (self-adjoint) iff $A = \bar{A}^T$; for example, $\begin{pmatrix} 2 & 1+i \\ 1-i & 3 \end{pmatrix}$.
Note: the diagonal entries of a Hermitian matrix must be real.
For any linear operator on a finite-dimensional inner product space, the adjoint exists and is unique.
For fixed $w$, the map $v \mapsto \langle Tv, w \rangle$ is a linear functional.
By the Riesz Representation Theorem, there exists a unique $u \in V$ such that $\langle Tv, w \rangle = \langle v, u \rangle$ for all $v$.
Define $T^*w = u$. One checks that $T^*$ is linear and uniquely determined.
For $T$ represented in an orthonormal basis by the matrix $A$, the adjoint $T^*$ is represented by $A^* = \bar{A}^T$.
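As a quick numerical sanity check of the defining identity $\langle Tv, w \rangle = \langle v, T^*w \rangle$ — a sketch assuming the standard complex inner product $\langle x, y \rangle = \sum_i x_i \bar{y}_i$ and NumPy; the matrix and vectors are random illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random complex matrix and vectors (sizes are an arbitrary choice).
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# Standard complex inner product <x, y> = sum_i x_i * conj(y_i).
inner = lambda x, y: np.vdot(y, x)

# The adjoint of a matrix is its conjugate transpose.
A_star = A.conj().T

lhs = inner(A @ v, w)        # <Av, w>
rhs = inner(v, A_star @ w)   # <v, A* w>
print(np.isclose(lhs, rhs))  # True
```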
An operator is skew-adjoint (or anti-Hermitian) if: $T^* = -T$.
Skew-adjoint operators have purely imaginary eigenvalues.
For example, $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ is skew-symmetric ($A^T = -A$). Its eigenvalues are $\pm i$.
An operator is normal if: $TT^* = T^*T$.
Self-adjoint, skew-adjoint, and unitary operators are all normal.
Normal operators share the key property with self-adjoint ones: they admit an orthonormal basis of eigenvectors (over $\mathbb{C}$).
All eigenvalues of a self-adjoint operator are real.
Let $Tv = \lambda v$ with $v \neq 0$. Then:

$$\lambda \langle v, v \rangle = \langle Tv, v \rangle = \langle v, Tv \rangle = \bar{\lambda} \langle v, v \rangle.$$

Since $\langle v, v \rangle \neq 0$, we have $\lambda = \bar{\lambda}$, so $\lambda \in \mathbb{R}$.
Eigenvectors corresponding to distinct eigenvalues of a self-adjoint operator are orthogonal.
Let $Tv = \lambda v$ and $Tw = \mu w$ with $\lambda \neq \mu$. Then:

$$\lambda \langle v, w \rangle = \langle Tv, w \rangle = \langle v, Tw \rangle = \mu \langle v, w \rangle.$$

So $(\lambda - \mu)\langle v, w \rangle = 0$. Since $\lambda \neq \mu$, we have $\langle v, w \rangle = 0$.
We can always choose orthonormal eigenvectors for a self-adjoint operator.
If $T$ is self-adjoint and $W$ is a $T$-invariant subspace, then $W^\perp$ is also $T$-invariant.
Proof: let $v \in W^\perp$. For any $w \in W$:

$$\langle Tv, w \rangle = \langle v, Tw \rangle = 0,$$

since $Tw \in W$ ($W$ is $T$-invariant) and $v \perp W$. Thus $Tv \in W^\perp$, so $W^\perp$ is $T$-invariant.
Example: verify that $A = \begin{pmatrix} 1 & 2 \\ 2 & 5 \end{pmatrix}$ is self-adjoint:
Check $A^T = A$: ✓ (symmetric)
Its eigenvalues will be real, and its eigenvectors can be chosen orthonormal.
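A minimal check with NumPy (the matrix entries are an arbitrary symmetric choice): `np.linalg.eigh` is the routine for symmetric/Hermitian matrices, and it returns real eigenvalues with orthonormal eigenvectors:

```python
import numpy as np

# An arbitrary symmetric matrix for illustration.
A = np.array([[1.0, 2.0],
              [2.0, 5.0]])
assert np.allclose(A, A.T)  # check self-adjointness (real symmetric)

eigvals, Q = np.linalg.eigh(A)  # routine for symmetric/Hermitian input

print(np.all(np.isreal(eigvals)))       # True: eigenvalues are real
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
```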
Self-adjoint operators preserve angles in a specific sense: $\langle Tv, w \rangle = \langle v, Tw \rangle$, so the "shadow" of $Tv$ on $w$ equals the shadow of $v$ on $Tw$.
A self-adjoint operator $T$ is positive semi-definite iff: $\langle Tv, v \rangle \geq 0$ for all $v$.
Equivalently: all eigenvalues are non-negative.
$T$ is positive definite if $\langle Tv, v \rangle > 0$ for all $v \neq 0$.
Equivalently: all eigenvalues are strictly positive.
Example: is $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ positive definite?
Eigenvalues: $\lambda = 3, 1$ (both positive) ✓
Or check the leading principal minors: $2 > 0$ and $\det A = 3 > 0$ ✓
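In code, positive definiteness is commonly tested by attempting a Cholesky factorization, which succeeds exactly when the symmetric input is (numerically) positive definite; the matrices below are illustrative choices:

```python
import numpy as np

def is_positive_definite(A: np.ndarray) -> bool:
    """Test a symmetric matrix for positive definiteness via Cholesky."""
    try:
        np.linalg.cholesky(A)
        return True
    except np.linalg.LinAlgError:
        return False

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 3 and 1
B = np.array([[4.0, 2.0], [2.0, 1.0]])  # eigenvalues 5 and 0 (singular)

print(is_positive_definite(A))  # True
print(is_positive_definite(B))  # False: one eigenvalue is 0
```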
Let $T$ be a self-adjoint operator on a finite-dimensional inner product space $V$. Then $V$ has an orthonormal basis consisting of eigenvectors of $T$, and every eigenvalue is real.
Equivalently, is orthogonally diagonalizable.
A real matrix $A$ is symmetric iff there exist an orthogonal $Q$ and a diagonal $D$ with:

$$A = QDQ^T,$$

where the columns of $Q$ are orthonormal eigenvectors and $D$ holds the (real) eigenvalues.
Example: diagonalize $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$.
Eigenvalues: $\lambda_1 = 3$, $\lambda_2 = 1$.
Eigenvectors: $v_1 = \frac{1}{\sqrt{2}}(1, 1)^T$, $v_2 = \frac{1}{\sqrt{2}}(1, -1)^T$.
Diagonalization: $A = QDQ^T$ with $Q = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, $D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}$.
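As a check, a small symmetric matrix can be diagonalized numerically and reassembled as $QDQ^T$ (the matrix here is an illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric, eigenvalues 1 and 3

eigvals, Q = np.linalg.eigh(A)  # eigenvalues in ascending order
D = np.diag(eigvals)

# The spectral theorem: A = Q D Q^T with orthogonal Q.
print(np.allclose(Q @ D @ Q.T, A))  # True
print(np.round(eigvals, 6))         # [1. 3.]
```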
Proof of Spectral Theorem: by induction on dimension. Base case: $\dim V = 1$ is trivial.
Induction: let $\lambda$ be an eigenvalue of $T$ (one exists over $\mathbb{C}$, and it is real since $T$ is self-adjoint).
Let $v_1$ be a unit eigenvector for $\lambda$. Set $W = \operatorname{span}(v_1)$.
By the Invariant Subspace Theorem, $W^\perp$ is $T$-invariant.
Apply the induction hypothesis to $T|_{W^\perp}$: get an orthonormal eigenbasis for $W^\perp$.
Together with $v_1$, this gives an orthonormal eigenbasis for $V$.
Example: orthogonally diagonalize $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$:
Step 1: Characteristic polynomial: $\det(A - \lambda I) = (1-\lambda)^2 - 4 = (\lambda - 3)(\lambda + 1)$.
Eigenvalues: $\lambda = 3, -1$.
Step 2: Find eigenvectors: $v_1 = \frac{1}{\sqrt{2}}(1, 1)^T$ for $\lambda = 3$, $v_2 = \frac{1}{\sqrt{2}}(1, -1)^T$ for $\lambda = -1$.
Step 3: Form $Q = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ and $D = \begin{pmatrix} 3 & 0 \\ 0 & -1 \end{pmatrix}$.
A linear operator $T$ on a complex inner product space $V$ is normal iff $V$ has an orthonormal basis consisting of eigenvectors of $T$.
If $A = QDQ^T$, then:

$$f(A) = Q f(D) Q^T,$$

where $f$ applies to the diagonal entries.
For $A = QDQ^T$ with $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$: $f(A) = Q \operatorname{diag}(f(\lambda_1), \dots, f(\lambda_n)) Q^T$.
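A sketch of $f(A) = Q f(D) Q^T$ in NumPy, checked for $f = \exp$ against a truncated power series (matrix values are illustrative):

```python
import numpy as np

def matrix_function(A, f):
    """Apply a scalar function f to a symmetric matrix via f(A) = Q f(D) Q^T."""
    eigvals, Q = np.linalg.eigh(A)
    return Q @ np.diag(f(eigvals)) @ Q.T

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# e^A computed spectrally.
expA = matrix_function(A, np.exp)

# Sanity check against a truncated power series (illustrative, not efficient).
series = np.zeros_like(A)
term = np.eye(2)
for k in range(1, 30):
    series += term          # accumulate A^(k-1) / (k-1)!
    term = term @ A / k
print(np.allclose(expA, series))  # True
```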
A self-adjoint operator can be written as:

$$T = \sum_i \lambda_i P_i,$$

where the $\lambda_i$ are distinct eigenvalues and $P_i$ is the orthogonal projection onto the $\lambda_i$-eigenspace.
For example, for $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvalues $3, 1$: $A = 3P_1 + P_2$, where $P_1 = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$ and $P_2 = \frac{1}{2}\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}$.
For any function $f$ defined on the spectrum of $T$:

$$f(T) = \sum_i f(\lambda_i) P_i.$$

This allows computing matrix functions like $e^A$, $\sqrt{A}$, and $\log A$.
Example: compute $e^A$ for $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$:
Eigenvalues: $3, 1$, so $e^A = e^3 P_1 + e\,P_2 = \frac{1}{2}\begin{pmatrix} e^3 + e & e^3 - e \\ e^3 - e & e^3 + e \end{pmatrix}$.
Compute $\sqrt{A}$ for positive definite $A$: $\sqrt{A} = Q\sqrt{D}Q^T$.
For non-diagonal $A$: diagonalize first, take square roots of the eigenvalues, then transform back.
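A sketch of the matrix square root via diagonalization; the `clip` guards against tiny negative eigenvalues caused by round-off, and the example matrix is an illustrative choice:

```python
import numpy as np

def sqrtm_psd(A):
    """Principal square root of a symmetric positive (semi)definite matrix."""
    eigvals, Q = np.linalg.eigh(A)
    return Q @ np.diag(np.sqrt(np.clip(eigvals, 0.0, None))) @ Q.T

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # positive definite: eigenvalues 3, 1
S = sqrtm_psd(A)

print(np.allclose(S @ S, A))  # True: S is a square root of A
print(np.allclose(S, S.T))    # True: the principal root is itself symmetric
```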
The resolvent of $T$ at $z$ (for $z$ not an eigenvalue) is:

$$(zI - T)^{-1} = \sum_i \frac{1}{z - \lambda_i} P_i.$$
The spectral radius of $T$ is:

$$\rho(T) = \max_i |\lambda_i|.$$

For normal operators: $\rho(T) = \|T\|$ (the operator norm).
Observables are self-adjoint. Real eigenvalues = measurement outcomes. Eigenstates are orthogonal.
Covariance matrix is symmetric. Eigenvectors give principal components for dimensionality reduction.
Stiffness/mass matrices are symmetric. Eigenvalues give natural frequencies, eigenvectors give mode shapes.
Classify using eigenvalue signs. Positive definite iff all eigenvalues positive.
In quantum mechanics, the energy observable is represented by the Hamiltonian (self-adjoint).
Given a centered data matrix $X$ (rows = observations), the covariance matrix $C = \frac{1}{n-1} X^T X$ is symmetric positive semi-definite.
Spectral decomposition $C = QDQ^T$: the columns of $Q$ are the principal components.
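A minimal PCA sketch under these conventions (the data and mixing matrix are synthetic, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 observations of 2 correlated features.
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
X = X - X.mean(axis=0)            # center the data

C = X.T @ X / (len(X) - 1)        # sample covariance (symmetric PSD)
eigvals, Q = np.linalg.eigh(C)    # ascending eigenvalues

# Principal components = eigenvectors, ordered by decreasing variance.
components = Q[:, ::-1]
explained_variance = eigvals[::-1]
print(explained_variance)         # largest variance first
```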
Two masses connected by springs satisfy $\ddot{x} = -Kx$, where $K$ is symmetric.
Natural frequencies: $\omega_i = \sqrt{\lambda_i}$ (the $\lambda_i$ are eigenvalues of $K$).
Mode shapes: eigenvectors describe how masses move together.
Self-adjoint operators guarantee: real eigenvalues, orthogonal eigenvectors for distinct eigenvalues, and orthogonal diagonalizability.
An operator is normal iff: $TT^* = T^*T$.
$T$ normal ⟹ $\|Tv\| = \|T^*v\|$ for all $v$.
Unitary operators (like rotations) satisfy $U^*U = UU^* = I$.
They are normal with $|\lambda| = 1$ (eigenvalues on the unit circle).
Eigenvalues of a rotation by angle $\theta$: $e^{\pm i\theta}$.
| Type | Definition | Eigenvalues |
|---|---|---|
| Self-adjoint | $T^* = T$ | Real |
| Skew-adjoint | $T^* = -T$ | Purely imaginary |
| Unitary | $T^*T = TT^* = I$ | On unit circle |
| Projection | $P^2 = P = P^*$ | 0 or 1 |
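The eigenvalue claims in the table can be verified numerically for randomly generated operators of each type:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

hermitian = B + B.conj().T    # satisfies T* = T
skew = B - B.conj().T         # satisfies T* = -T
unitary = np.linalg.qr(B)[0]  # Q factor of a QR decomposition is unitary

print(np.allclose(np.linalg.eigvals(hermitian).imag, 0))  # True: real
print(np.allclose(np.linalg.eigvals(skew).real, 0))       # True: purely imaginary
print(np.allclose(np.abs(np.linalg.eigvals(unitary)), 1)) # True: unit circle
```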
Two self-adjoint operators $S, T$ can be simultaneously diagonalized (share an orthonormal eigenbasis) iff they commute: $ST = TS$.
In quantum mechanics, position and momentum don't commute ($[\hat{x}, \hat{p}] = i\hbar$), so they can't be simultaneously measured precisely (Heisenberg uncertainty).
Not every matrix is orthogonally diagonalizable — only symmetric/Hermitian matrices are! Non-symmetric matrices may not even be diagonalizable.
For A = QDQᵀ, Q must be orthogonal. This requires normalizing each eigenvector to unit length.
Symmetric: A = Aᵀ (real). Hermitian: A = Āᵀ (complex, requires conjugate).
If eigenvalue has multiplicity >1, apply Gram-Schmidt to get orthonormal basis for that eigenspace.
Normal means TT* = T*T. Self-adjoint is the special case T = T*. Unitary and skew-adjoint are also normal.
When diagonalizing symmetric matrices: check symmetry first, normalize every eigenvector, and apply Gram-Schmidt within repeated eigenspaces.
Example: orthogonally diagonalize $A = \begin{pmatrix} 5 & 2 \\ 2 & 2 \end{pmatrix}$:
Step 1: Check symmetry: $A^T = A$ ✓
Step 2: Eigenvalues: $\det(A - \lambda I) = \lambda^2 - 7\lambda + 6 = (\lambda - 6)(\lambda - 1)$, so $\lambda = 6, 1$.
Step 3: Eigenvectors: $v_1 = \frac{1}{\sqrt{5}}(2, 1)^T$, $v_2 = \frac{1}{\sqrt{5}}(1, -2)^T$.
Step 4: Result: $A = QDQ^T$ with $Q = \frac{1}{\sqrt{5}}\begin{pmatrix} 2 & 1 \\ 1 & -2 \end{pmatrix}$, $D = \begin{pmatrix} 6 & 0 \\ 0 & 1 \end{pmatrix}$.
Example: diagonalize $A = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 3 & 1 \\ 0 & 1 & 3 \end{pmatrix}$:
Eigenvalues: $\lambda = 2$ (multiplicity 1) and $\lambda = 4$ (one copy from the $(1,1)$ entry, one from the lower-right block).
Note: $\lambda = 4$ has multiplicity 2, but its eigenspace is 2-dimensional, so $A$ is still diagonalizable.
Eigenvectors for $\lambda = 4$: $(1, 0, 0)^T$ and $\frac{1}{\sqrt{2}}(0, 1, 1)^T$.
Example: classify the quadratic form $q(x, y) = 2x^2 + 2xy + 2y^2$:
Matrix form: $q(\mathbf{x}) = \mathbf{x}^T A \mathbf{x}$ with $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$.
Eigenvalues: $\lambda = 3, 1$ (both positive)
Conclusion: $q$ is positive definite.
In principal axes: $q = 3y_1^2 + y_2^2$, where $(y_1, y_2)$ are rotated coordinates.
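A small helper that classifies a quadratic form by the signs of its eigenvalues (`eigvalsh` assumes a symmetric input; the test matrices are illustrative):

```python
import numpy as np

def classify_form(A):
    """Classify the quadratic form x^T A x by the signs of A's eigenvalues."""
    eigvals = np.linalg.eigvalsh(A)  # real eigenvalues, ascending
    if np.all(eigvals > 0):
        return "positive definite"
    if np.all(eigvals < 0):
        return "negative definite"
    if np.all(eigvals >= 0):
        return "positive semi-definite"
    if np.all(eigvals <= 0):
        return "negative semi-definite"
    return "indefinite"

print(classify_form(np.array([[2.0, 1.0], [1.0, 2.0]])))  # positive definite
print(classify_form(np.array([[1.0, 2.0], [2.0, 1.0]])))  # indefinite
```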
Review: compute $\sqrt{A}$ for positive definite $A$ via $\sqrt{A} = Q\sqrt{D}Q^T$.
For non-diagonal $A$: diagonalize, take square roots of the eigenvalues, then transform back.
SVD generalizes spectral theorem to non-square matrices. A = UΣVᵀ uses orthonormal bases.
Spectral theorem classifies quadratic forms. Sign of eigenvalues determines definiteness.
Every matrix A = UP where U is unitary and P is positive semi-definite.
Spectral theorem extends to compact self-adjoint operators on Hilbert spaces.
Any invertible operator $T$ can be written as:

$$T = UP,$$

where $U$ is unitary and $P$ is positive definite.
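A sketch of the polar decomposition computed from the SVD: if $A = W\Sigma V^T$, then $U = WV^T$ and $P = V\Sigma V^T$ (the example matrix is an arbitrary invertible choice):

```python
import numpy as np

def polar(A):
    """Polar decomposition A = U P via the SVD A = W S V^T."""
    W, s, Vt = np.linalg.svd(A)
    U = W @ Vt                  # orthogonal/unitary factor
    P = Vt.T @ np.diag(s) @ Vt  # positive semi-definite factor
    return U, P

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # invertible
U, P = polar(A)

print(np.allclose(U @ P, A))              # True: A = UP
print(np.allclose(U.T @ U, np.eye(2)))    # True: U is orthogonal
print(np.all(np.linalg.eigvalsh(P) >= 0)) # True: P is positive semi-definite
```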
The spectral theorem is one of the most important results in mathematics because it reduces a self-adjoint operator to independent real scalings along orthogonal axes.
The spectral theorem has roots in the work of Cauchy, Hermite, and Hilbert. The term "spectrum" was introduced by David Hilbert around 1900, inspired by the discrete frequencies (spectral lines) observed in atomic spectra.
| Symbol | Meaning |
|---|---|
| T* | Adjoint of T |
| Aᵀ | Transpose (real) |
| A* | Conjugate transpose (complex) |
| Pᵢ | Projection onto λᵢ-eigenspace |
| ρ(T) | Spectral radius |
Orthogonally diagonalize $A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$:
Solution outline: eigenvalues $\lambda = 4, 2$; normalized eigenvectors $\frac{1}{\sqrt{2}}(1, 1)^T$ and $\frac{1}{\sqrt{2}}(1, -1)^T$; then $A = QDQ^T$.
Show that $A = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}$ is NOT orthogonally diagonalizable:
Solution: $A \neq A^T$ (not symmetric), so the spectral theorem doesn't apply.
$A$ is diagonalizable (distinct eigenvalues $1, 2$), but not orthogonally diagonalizable.
Determine if $B = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 4 & 2 \\ 0 & 2 & 1 \end{pmatrix}$ is positive definite:
Solution: the eigenvalues are $1$ together with those of the 2×2 block: $\lambda = 5, 0$.
Since one eigenvalue is 0, $B$ is positive semi-definite, not positive definite.
For the rotation $R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ (90° rotation):
(a) Show R is normal but not self-adjoint
(b) Find eigenvalues (complex)
(c) Verify eigenvalues are on unit circle
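Parts (a)-(c) can be checked numerically for the 90° rotation matrix:

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0, 0.0]])  # 90-degree rotation

# (a) normal but not self-adjoint
print(np.allclose(R @ R.T, R.T @ R))  # True: normal
print(np.allclose(R, R.T))            # False: not self-adjoint

# (b), (c) eigenvalues are +/- i, which lie on the unit circle
eigvals = np.linalg.eigvals(R)
print(np.allclose(sorted(eigvals.imag), [-1.0, 1.0]))  # True
print(np.allclose(np.abs(eigvals), 1.0))               # True
```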
You've mastered the spectral theorem when you can: state it precisely, orthogonally diagonalize a symmetric matrix by hand, and use $A = QDQ^T$ to compute matrix powers and functions.
Every square matrix is unitarily similar to an upper triangular matrix:

$$A = UTU^*,$$

where $U$ is unitary and $T$ is upper triangular with the eigenvalues on its diagonal.
Compare the two decompositions: Schur gives a triangular $T$ for every square matrix; the spectral theorem gives a diagonal $D$, but only for normal (in particular, self-adjoint) matrices.
For self-adjoint $T$ with spectral decomposition $T = \sum_i \lambda_i P_i$: any polynomial satisfies $p(T) = \sum_i p(\lambda_i) P_i$, and taking $p$ to be the characteristic polynomial gives $p(T) = 0$.
This gives an alternative proof of Cayley-Hamilton for self-adjoint operators.
Example: for $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ with eigenvalues $3, 1$:
Characteristic polynomial: $p(\lambda) = \lambda^2 - 4\lambda + 3 = (\lambda - 3)(\lambda - 1)$.
Via spectral decomposition: $p(A) = p(3)P_1 + p(1)P_2 = 0$ ✓
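This is easy to confirm numerically; for a 2×2 matrix the characteristic polynomial coefficients come from the trace and determinant (the matrix is an illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 3 and 1

# Characteristic polynomial p(t) = t^2 - tr(A) t + det(A) = t^2 - 4t + 3.
tr, det = np.trace(A), np.linalg.det(A)
pA = A @ A - tr * A + det * np.eye(2)

print(np.allclose(pA, 0))  # True: Cayley-Hamilton, p(A) = 0
```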
For self-adjoint $T$ and nonzero $v$, the Rayleigh quotient is:

$$R(v) = \frac{\langle Tv, v \rangle}{\langle v, v \rangle}.$$

For self-adjoint $T$ with eigenvalues $\lambda_{\min} \leq \cdots \leq \lambda_{\max}$:

$$\lambda_{\min} \leq R(v) \leq \lambda_{\max}.$$

The extrema are achieved at the corresponding eigenvectors.
Example: for $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ (eigenvalues $1, 3$):
Rayleigh quotient at $(1, 0)^T$: $R = 2$.
Rayleigh quotient at $(1, 1)^T$: $R = 3$ (max eigenvalue).
Rayleigh quotient at $(1, -1)^T$: $R = 1$ (min eigenvalue).
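The Rayleigh quotient bounds are easy to observe numerically (matrix and test vectors are illustrative):

```python
import numpy as np

def rayleigh(A, v):
    """Rayleigh quotient <Av, v> / <v, v> for symmetric A."""
    return (v @ A @ v) / (v @ v)

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 1 and 3

print(rayleigh(A, np.array([1.0, 0.0])))   # 2.0 (between the extremes)
print(rayleigh(A, np.array([1.0, 1.0])))   # 3.0 (max eigenvalue)
print(rayleigh(A, np.array([1.0, -1.0])))  # 1.0 (min eigenvalue)
```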
The k-th eigenvalue can be characterized as (Courant-Fischer):

$$\lambda_k = \min_{\dim S = k} \; \max_{0 \neq v \in S} R(v).$$
The min-max principle is used in: eigenvalue perturbation bounds, numerical eigenvalue algorithms, and variational methods in quantum mechanics.
The spectral theorem extends to infinite-dimensional Hilbert spaces:
A compact self-adjoint operator $T$ on a Hilbert space has:

$$T = \sum_i \lambda_i P_i,$$

where $\lambda_i \to 0$ and the $P_i$ are finite-rank projections onto eigenspaces.
The integral operator $(Tf)(x) = \int K(x, y) f(y)\,dy$ with symmetric kernel $K(x, y) = K(y, x)$ is compact self-adjoint.
Its eigenvalues and eigenfunctions solve:

$$\int K(x, y)\, \phi(y)\,dy = \lambda \phi(x).$$
The infinite-dimensional spectral theorem is fundamental to: quantum mechanics, Sturm-Liouville theory for differential equations, and the theory of integral equations.
| Algorithm | Complexity | Best For |
|---|---|---|
| Power iteration | $O(n^2)$ per iteration | Largest eigenvalue only |
| QR algorithm | $O(n^3)$ | All eigenvalues |
| Jacobi method | $O(n^3)$ | High accuracy |
| Divide-and-conquer | $O(n^3)$, fast in practice | Tridiagonal |
| Lanczos | $O(k \cdot \mathrm{nnz})$ | Sparse, few eigenvalues |
To find the largest (in magnitude) eigenvalue of symmetric $A$: iterate $x_{k+1} = \frac{Ax_k}{\|Ax_k\|}$, then estimate $\lambda \approx x_k^T A x_k$.
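A minimal power-iteration sketch (iteration count and matrix are illustrative; convergence assumes the top eigenvalue is strictly dominant in magnitude):

```python
import numpy as np

def power_iteration(A, iters=100, seed=0):
    """Estimate the largest-magnitude eigenvalue/vector of symmetric A."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)     # renormalize each step
    return x @ A @ x, x            # Rayleigh quotient, eigenvector estimate

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 1 and 3
lam, v = power_iteration(A)
print(round(lam, 6))  # 3.0
```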
For symmetric matrices, eigenvalue algorithms are: faster and more stable than in the general case, since orthogonal transformations preserve symmetry and keep errors well controlled.
Before diagonalization, reduce $A$ to tridiagonal form (e.g. via Householder reflections): $A = QTQ^T$ with $T$ tridiagonal.
This costs $O(n^3)$ but makes the subsequent iterations much faster.
For a graph with adjacency matrix $A$ and degree matrix $D$, the graph Laplacian is $L = D - A$.
$L$ is symmetric positive semi-definite with eigenvalues $0 = \lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n$.
For the path graph (3 vertices in a line):

$$L = \begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}, \qquad \text{eigenvalues } 0, 1, 3.$$
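Building the Laplacian of the 3-vertex path from its adjacency matrix and computing its spectrum:

```python
import numpy as np

# Path graph on 3 vertices: 1 -- 2 -- 3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # graph Laplacian

eigvals = np.linalg.eigvalsh(L)
print(np.round(eigvals, 6) + 0.0)  # [0. 1. 3.]
```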
The graph Laplacian is used in: spectral clustering, graph partitioning, and measuring connectivity (via the Fiedler value $\lambda_2$).
The Spectral Theorem is one of the most beautiful and useful results in linear algebra. It says that symmetric/Hermitian operators are completely determined by their eigenvalues and can be decomposed into simple, orthogonal pieces.
A = QDQᵀ reveals the "principal axes" of a linear transformation.
Powers, functions, and systems become easy in the eigenbasis.
From quantum mechanics to machine learning, the spectral theorem is everywhere.
The spectral theorem leads naturally to: the singular value decomposition, spectral theory on Hilbert spaces, and operator theory.
The 'spectrum' of an operator is its set of eigenvalues. The spectral theorem describes how the operator decomposes into eigenspaces—like white light splitting into its spectral colors.
If Av = λv with A symmetric, then λ||v||² = ⟨Av, v⟩ = ⟨v, Av⟩ = ⟨v, λv⟩ = λ̄||v||². So λ = λ̄, meaning λ is real.
For real matrices with standard inner product, they're the same: T* = Tᵀ. For complex matrices, adjoint means conjugate transpose: T* = T̄ᵀ = Tᴴ.
It means A = QDQᵀ where Q⁻¹ = Qᵀ (easy to compute!). Powers become Aⁿ = QDⁿQᵀ. Functions like eᴬ are easy to compute.
They may not be orthogonally diagonalizable, or even diagonalizable at all. Normal matrices (TT* = T*T) are the largest class with orthonormal eigenbases.
Observables are self-adjoint operators. The spectral theorem guarantees real measurement outcomes (eigenvalues) and orthogonal states (eigenvectors).
For xᵀAx with symmetric A, diagonalization gives Σλᵢyᵢ² in new coordinates. Sign of eigenvalues determines if form is positive/negative definite.
For self-adjoint operators: yes! That's the spectral theorem. For non-normal operators: generally no.
Pᵢ = (1/||vᵢ||²)vᵢvᵢᵀ projects onto eigenspace for λᵢ. They satisfy Pᵢ² = Pᵢ, PᵢPⱼ = 0 for i≠j, and ΣPᵢ = I.
For orthogonal Q, Qᵀ = Q⁻¹, so they're the same! The Qᵀ notation emphasizes that Q is orthogonal and no matrix inversion is needed.