Inner product spaces add geometric structure to vector spaces, enabling concepts of length, angle, and orthogonality. This course covers inner products, orthonormal bases, projections, least squares approximation, and the fundamental spectral theorem for symmetric matrices.
The concept of inner products evolved from the dot product in Euclidean geometry. Augustin-Louis Cauchy (1789–1857) proved the Cauchy-Schwarz inequality in 1821. Jørgen Gram (1850–1916) and Erhard Schmidt (1876–1959) developed the orthogonalization process independently. The spectral theorem, one of the most important results in linear algebra, was proven by David Hilbert (1862–1943) and others in the early 20th century. Inner product spaces, especially Hilbert spaces, are fundamental in quantum mechanics, signal processing, and functional analysis.
An inner product is a function that assigns a scalar to each pair of vectors, generalizing the dot product and enabling geometric concepts like length and angle.
An inner product on a vector space V over F (F = ℝ or ℂ) is a function ⟨·,·⟩ : V × V → F satisfying: (1) linearity in the first argument, ⟨au + bw, v⟩ = a⟨u,v⟩ + b⟨w,v⟩; (2) conjugate symmetry, ⟨u,v⟩ equals the complex conjugate of ⟨v,u⟩ (plain symmetry when F = ℝ); (3) positive-definiteness, ⟨v,v⟩ ≥ 0, with ⟨v,v⟩ = 0 if and only if v = 0.
The norm induced by an inner product is ||v|| = √⟨v,v⟩.
For any vectors u and v in an inner product space, the Cauchy-Schwarz inequality holds:
|⟨u,v⟩| ≤ ||u||·||v||.
Equality holds if and only if u and v are linearly dependent.
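A quick numerical sanity check of the induced norm and the Cauchy-Schwarz inequality, sketched in NumPy with the standard dot product; the vectors x and y are arbitrary illustrative choices.

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])   # illustrative vectors in R^3
y = np.array([3.0, 0.0, 4.0])

inner = np.dot(x, y)            # standard inner product <x, y>
norm_x = np.sqrt(np.dot(x, x))  # ||x|| = sqrt(<x, x>)
norm_y = np.sqrt(np.dot(y, y))

# Cauchy-Schwarz: |<x, y>| <= ||x|| * ||y||
assert abs(inner) <= norm_x * norm_y
print(inner, norm_x * norm_y)   # 11.0 15.0
```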
Orthogonality generalizes perpendicularity. Orthonormal bases provide the most convenient coordinate systems for computation.
Vectors u and v are orthogonal if ⟨u,v⟩ = 0.
A set {e₁, …, eₙ} is orthonormal if ⟨eᵢ,eⱼ⟩ = δᵢⱼ (Kronecker delta).
An orthogonal set of nonzero vectors is linearly independent: taking the inner product of any vanishing linear combination with each vector in the set forces every coefficient to be zero.
If {e₁, …, eₙ} is an orthonormal basis, then for any v:
v = ⟨v,e₁⟩e₁ + ⟨v,e₂⟩e₂ + … + ⟨v,eₙ⟩eₙ.
The coefficients cᵢ = ⟨v,eᵢ⟩ are called Fourier coefficients.
For an orthonormal basis, Parseval's identity holds: ||v||² = Σ|cᵢ|², where cᵢ = ⟨v,eᵢ⟩.
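As a small illustration, the following NumPy sketch computes Fourier coefficients in an orthonormal basis of ℝ² (a 45° rotation of the standard basis, an illustrative choice) and checks the expansion and Parseval's identity.

```python
import numpy as np

e1 = np.array([1.0, 1.0]) / np.sqrt(2)   # orthonormal basis of R^2
e2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([3.0, 4.0])

# Fourier coefficients c_i = <v, e_i>: no linear system to solve.
c1, c2 = np.dot(v, e1), np.dot(v, e2)

# Expansion v = c1*e1 + c2*e2 and Parseval: ||v||^2 = |c1|^2 + |c2|^2.
assert np.allclose(c1 * e1 + c2 * e2, v)
assert np.isclose(c1**2 + c2**2, np.dot(v, v))  # both equal 25.0
```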
The Gram-Schmidt process converts any basis into an orthonormal basis, proving that every finite-dimensional inner product space has an orthonormal basis.
Given linearly independent vectors v₁, …, vₙ:
Set u₁ = v₁; for k = 2, …, n, subtract the projections onto the earlier vectors, uₖ = vₖ − Σⱼ₌₁ᵏ⁻¹ (⟨vₖ,uⱼ⟩/⟨uⱼ,uⱼ⟩) uⱼ; then normalize eₖ = uₖ/||uₖ||.
Result: {e₁, …, eₙ} is orthonormal with span(e₁, …, eₖ) = span(v₁, …, vₖ) for every k.
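A minimal sketch of the classical process in NumPy, assuming the input columns are linearly independent; in floating point, the modified variant or a library QR routine is numerically safer.

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the (assumed linearly independent) columns of V."""
    Q = np.zeros_like(V, dtype=float)
    for k in range(V.shape[1]):
        u = V[:, k].astype(float)
        for j in range(k):
            # subtract the projection of v_k onto the j-th orthonormal vector
            u -= np.dot(V[:, k], Q[:, j]) * Q[:, j]
        Q[:, k] = u / np.linalg.norm(u)  # normalize
    return Q

V = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])       # columns v1, v2
Q = gram_schmidt(V)
print(np.round(Q.T @ Q, 10))     # identity matrix: columns are orthonormal
```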
If A has linearly independent columns, then A = QR, where Q has orthonormal columns (obtained from Gram-Schmidt) and R is upper triangular.
For A in ℝᵐˣⁿ with rank n: Q is m×n, R is n×n and invertible, and R = QᵀA.
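In practice the factorization usually comes from a library routine; here is a sketch with np.linalg.qr, which uses Householder reflections rather than Gram-Schmidt but produces the same reduced factorization up to signs.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])      # linearly independent columns

# Reduced QR: Q is 3x2 with orthonormal columns, R is 2x2 upper triangular.
Q, R = np.linalg.qr(A)

assert np.allclose(Q @ R, A)            # A = QR
assert np.allclose(Q.T @ Q, np.eye(2))  # orthonormal columns
print(np.round(R, 4))                   # upper triangular
```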
The orthogonal projection of a vector onto a subspace is the closest point in that subspace. This is fundamental for least squares approximation.
The orthogonal projection of v onto a subspace W is the unique vector proj_W(v) in W such that v − proj_W(v) is orthogonal to W.
If {e₁, …, eₖ} is an orthonormal basis for W, then:
proj_W(v) = ⟨v,e₁⟩e₁ + ⟨v,e₂⟩e₂ + … + ⟨v,eₖ⟩eₖ.
proj_W(v) minimizes ||v − w|| over all w in W.
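A short sketch of the projection formula, using a coordinate plane W in ℝ³ with an obvious orthonormal basis (an illustrative choice):

```python
import numpy as np

e1 = np.array([1.0, 0.0, 0.0])   # orthonormal basis for the plane W
e2 = np.array([0.0, 1.0, 0.0])

v = np.array([2.0, 3.0, 5.0])

# proj_W(v) = <v,e1> e1 + <v,e2> e2
p = np.dot(v, e1) * e1 + np.dot(v, e2) * e2
print(p)  # [2. 3. 0.] -- the closest point in W to v

# v - p is orthogonal to W
assert np.isclose(np.dot(v - p, e1), 0.0)
assert np.isclose(np.dot(v - p, e2), 0.0)
```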
For Ax = b with more equations than unknowns (overdetermined), the least squares solution x̂ minimizes ||Ax − b||.
The least squares solution satisfies AᵀAx̂ = Aᵀb (the normal equation).
For data points (x₁,y₁), …, (xₙ,yₙ), fitting the line y = mx + c minimizes Σᵢ(yᵢ − mxᵢ − c)². This is least squares with the n×2 matrix A whose i-th row is (xᵢ, 1) and unknown vector (m, c).
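A sketch of the line fit via the normal equation, with made-up data points; np.linalg.lstsq solves the same problem more stably (via the SVD) and is preferred in practice.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical data
y = np.array([1.1, 1.9, 3.2, 3.8])

# Design matrix with rows (x_i, 1), so A @ [m, c] approximates y.
A = np.column_stack([x, np.ones_like(x)])

# Solve the normal equation A^T A [m, c]^T = A^T y.
m, c = np.linalg.solve(A.T @ A, A.T @ y)
print(m, c)  # slope and intercept minimizing the sum of squared residuals

# Same answer from the library least squares routine.
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(coeffs, [m, c])
```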
The spectral theorem is one of the most important results in linear algebra, characterizing self-adjoint operators and enabling orthogonal diagonalization of symmetric matrices.
An operator T on an inner product space is self-adjoint if T = T*, where T* is the adjoint satisfying ⟨Tu, v⟩ = ⟨u, T*v⟩ for all u, v.
For real matrices with the standard inner product, self-adjoint means symmetric (A = Aᵀ).
All eigenvalues of a self-adjoint operator are real.
Eigenvectors corresponding to distinct eigenvalues of a self-adjoint operator are orthogonal.
Every self-adjoint operator on a finite-dimensional inner product space has an orthonormal basis of eigenvectors. Equivalently, every real symmetric matrix is orthogonally diagonalizable: A = QDQᵀ, where Q is orthogonal and D is diagonal with real entries.
If A = QDQᵀ with eigenvalues λ₁, …, λₙ and orthonormal eigenvectors q₁, …, qₙ, then:
A = λ₁q₁q₁ᵀ + λ₂q₂q₂ᵀ + … + λₙqₙqₙᵀ.
This is the spectral decomposition.
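A NumPy sketch of the decomposition for an illustrative 2×2 symmetric matrix, using np.linalg.eigh, the eigensolver for symmetric/Hermitian matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # illustrative symmetric matrix

# eigh returns real eigenvalues (ascending) and orthonormal eigenvectors
# as the columns of Q.
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

assert np.allclose(Q @ D @ Q.T, A)   # A = Q D Q^T

# Spectral decomposition: A = sum_i lambda_i * q_i q_i^T
S = sum(lam * np.outer(q, q) for lam, q in zip(eigvals, Q.T))
assert np.allclose(S, A)
print(eigvals)  # [1. 3.]
```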
The spectral theorem is fundamental in quantum mechanics (observables are self-adjoint), principal component analysis, and many areas of applied mathematics.
Inner products measure 'angle' and 'length'. ||v|| = √⟨v,v⟩ gives length. cos θ = ⟨x,y⟩/(||x||·||y||) defines angle. Orthogonality (⟨x,y⟩ = 0) means perpendicular.
Orthonormal bases are preferred for three main reasons: (1) Coefficients are trivial to compute as inner products; no matrix inversion is needed. (2) The Gram matrix is the identity, simplifying all computations. (3) Parseval's identity gives ||v||² = Σ|cᵢ|², preserving norms.
Start with the first vector and normalize it. For each subsequent vector, subtract its projections onto all previous orthonormal vectors, then normalize. This makes each new vector orthogonal to all previous ones.
When Ax = b has no solution, find x that minimizes ||Ax - b||. This is equivalent to finding the projection of b onto col(A). The solution satisfies the normal equation AᵀAx = Aᵀb.
Every self-adjoint (symmetric/Hermitian) operator has an orthonormal basis of eigenvectors with real eigenvalues. This means A = QDQᵀ where Q is orthogonal and D is diagonal with real entries.