Eigenvalues and eigenvectors reveal the fundamental behavior of linear transformations. They identify directions that remain invariant (up to scaling) under the transformation, providing deep insight into the structure of linear operators.
This chapter introduces one of the most important concepts in linear algebra, with applications ranging from differential equations to quantum mechanics, data science, and computer graphics.
Our goal in studying linear transformations is to find simple matrix representations. The simplest form is a diagonal matrix. This naturally leads us to study invariant subspaces, particularly one-dimensional ones where $T(v) = \lambda v$ for some scalar $\lambda$.
Let $T$ be a linear operator on a vector space $V$. A scalar $\lambda$ is an eigenvalue (characteristic value) of $T$ if there exists a non-zero vector $v \in V$ such that:

$$T(v) = \lambda v$$

The non-zero vector $v$ is called an eigenvector (characteristic vector) belonging to the eigenvalue $\lambda$.
Eigenvectors must be non-zero by definition: otherwise, the zero vector would satisfy $T(0) = \lambda \cdot 0$ for every $\lambda$, making the concept meaningless. However, $\lambda = 0$ is a valid eigenvalue. It simply means $T(v) = 0$, so the eigenvectors for $\lambda = 0$ are exactly the non-zero vectors in $\ker(T)$.
Let $A$ be an $n \times n$ matrix. A scalar $\lambda$ is an eigenvalue of $A$ if there exists a non-zero vector $x$ such that:

$$Ax = \lambda x$$

The non-zero vector $x$ is called an eigenvector of $A$ belonging to $\lambda$.
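This defining equation is easy to check numerically. Below is a minimal numpy sketch; the matrix and candidate eigenvector are illustrative choices, not drawn from the examples that follow:

```python
import numpy as np

# Illustrative 2x2 matrix: check the defining equation A x = lambda x.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])   # candidate eigenvector for lambda = 3
lam = 3.0

print(A @ x)                         # [3. 3.]
print(lam * x)                       # [3. 3.]
print(np.allclose(A @ x, lam * x))   # True: x is an eigenvector for lambda = 3
```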
If $A$ is the matrix representation of $T$ under basis $\mathcal{B}$, and $x = [v]_{\mathcal{B}}$, then:

$$T(v) = \lambda v \quad \Longleftrightarrow \quad Ax = \lambda x$$

The eigenvalue $\lambda$ is the same for both, but the eigenvector $x$ is the coordinate representation of $v$ in the chosen basis.
For an eigenvalue $\lambda$ of the operator $T$, the eigenspace is:

$$E_\lambda = \{v \in V : T(v) = \lambda v\} = \ker(T - \lambda I)$$

This set consists of the zero vector together with all eigenvectors for $\lambda$.
$E_\lambda$ is a subspace of $V$ and is invariant under $T$.
Subspace: First, $0 \in E_\lambda$ since $T(0) = 0 = \lambda \cdot 0$.
For $u, v \in E_\lambda$ and scalars $a, b$:

$$T(au + bv) = aT(u) + bT(v) = a\lambda u + b\lambda v = \lambda(au + bv)$$

So $au + bv \in E_\lambda$, proving closure.
Invariance: For any $v \in E_\lambda$, we have $T(v) = \lambda v \in E_\lambda$.
The dimension of $E_\lambda$ is at least 1 (when $\lambda$ is an eigenvalue) but can be larger. If $\dim E_\lambda > 1$, there are multiple linearly independent eigenvectors for the same eigenvalue.
The spectrum of $T$ (or of $A$) is the set of all eigenvalues:

$$\sigma(A) = \{\lambda : \lambda \text{ is an eigenvalue of } A\}$$
An eigenvector lies on an invariant line: the transformation maps $v$ to a scalar multiple of itself. Geometrically, the vector is stretched if $|\lambda| > 1$, shrunk if $|\lambda| < 1$, and flipped if $\lambda < 0$.
For a diagonal matrix $D = \mathrm{diag}(d_1, \ldots, d_n)$: $De_i = d_i e_i$ for each standard basis vector $e_i$.
Key insight: For any diagonal matrix, the eigenvalues are the diagonal entries, and the standard basis vectors are eigenvectors.
For an upper triangular matrix $U$:

$$\det(U - \lambda I) = (u_{11} - \lambda)(u_{22} - \lambda) \cdots (u_{nn} - \lambda)$$

Eigenvalues are $u_{11}, u_{22}, \ldots, u_{nn}$ (the diagonal entries).
However, the eigenvectors are NOT simply the standard basis vectors (unlike diagonal matrices).
For the 90° rotation in $\mathbb{R}^2$:

$$R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad \det(R - \lambda I) = \lambda^2 + 1$$

No real eigenvalues! Geometrically, no non-zero vector stays on its own line after a 90° rotation.
Over $\mathbb{C}$: $\lambda = \pm i$ with eigenvectors $(1, \mp i)^T$.
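A quick numpy check of this example: `np.linalg.eig` works over the complex numbers, so it recovers $\pm i$ even though the input matrix is real.

```python
import numpy as np

# 90-degree rotation: no real eigenvalues, but complex ones +/- i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
eigvals, eigvecs = np.linalg.eig(R)
print(eigvals)   # [0.+1.j 0.-1.j] (order may vary)
```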
For the projection onto the x-axis, $P = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$: the eigenvalues are $1$ (eigenspace: the x-axis, which is fixed) and $0$ (eigenspace: the y-axis, which is collapsed to zero).
Given a matrix $A$ containing an unknown parameter, with a non-zero $x$ such that $Ax = 2x$, find the parameter.
Solution: Since 2 is an eigenvalue, $\det(A - 2I) = 0$. Expanding this determinant gives an equation that determines the unknown parameter.
For the identity $I_n$: the only eigenvalue is $1$, with multiplicity $n$. Every non-zero vector is an eigenvector.
For the scalar matrix $cI_n$: the only eigenvalue is $c$, with multiplicity $n$.
Reflection across a line through the origin: the eigenvalues are $1$ (eigenvectors along the mirror line) and $-1$ (eigenvectors perpendicular to it).
The following are equivalent for a scalar $\lambda$:
(1) $\lambda$ is an eigenvalue of $T$;
(2) $T - \lambda I$ is not injective;
(3) $T - \lambda I$ is not surjective;
(4) $T - \lambda I$ is not invertible;
(5) $\det(A - \lambda I) = 0$, where $A$ is any matrix representation of $T$.
(1) ⇒ (2): If $\lambda$ is an eigenvalue, there exists $v \neq 0$ with $T(v) = \lambda v$. Thus $(T - \lambda I)v = 0$, so $\ker(T - \lambda I) \neq \{0\}$, meaning $T - \lambda I$ is not injective.
(2) ⇔ (3) ⇔ (4): For operators on finite-dimensional spaces, injective ⇔ surjective ⇔ invertible.
(4) ⇔ (5): A matrix is invertible iff its determinant is non-zero.
The characteristic polynomial of the matrix $A$ is:

$$p_A(\lambda) = \det(A - \lambda I)$$

Its roots are exactly the eigenvalues of $A$.
Some texts use $\det(\lambda I - A)$ instead of $\det(A - \lambda I)$. They differ by a factor of $(-1)^n$ but have the same roots.
For a $2 \times 2$ matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$:

$$p_A(\lambda) = \lambda^2 - (a + d)\lambda + (ad - bc) = \lambda^2 - \mathrm{tr}(A)\,\lambda + \det(A)$$

If $\lambda_1, \ldots, \lambda_n$ are the eigenvalues (with multiplicity):

$$\mathrm{tr}(A) = \lambda_1 + \cdots + \lambda_n, \qquad \det(A) = \lambda_1 \lambda_2 \cdots \lambda_n$$
Find the eigenvalues of $A = \begin{pmatrix} 4 & -2 \\ 1 & 1 \end{pmatrix}$:

$$\det(A - \lambda I) = \lambda^2 - 5\lambda + 6 = (\lambda - 2)(\lambda - 3)$$

Eigenvalues: $\lambda_1 = 2$, $\lambda_2 = 3$.
Verification: $\mathrm{tr}(A) = 4 + 1 = 5 = 2 + 3$ ✓, $\det(A) = 4 - (-2) = 6 = 2 \times 3$ ✓
For $\lambda = 2$, solve $(A - 2I)x = 0$:
Row reducing $\begin{pmatrix} 2 & -2 \\ 1 & -1 \end{pmatrix} \to \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix}$, so the eigenvector is $(1, 1)^T$.
For $\lambda = 3$: $A - 3I = \begin{pmatrix} 1 & -2 \\ 1 & -2 \end{pmatrix}$, giving eigenvector $(2, 1)^T$.
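As a sanity check, the computation above can be reproduced with numpy; this sketch verifies the eigenvalues, the trace/determinant identities, and the eigenvector equation.

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                            # eigenvalues 3 and 2 (order may vary)
print(np.trace(A), eigvals.sum())         # 5.0  5.0
print(np.linalg.det(A), eigvals.prod())   # ~6.0 ~6.0

# Columns of eigvecs are unit eigenvectors, parallel to (2, 1) and (1, 1).
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))    # True, True
```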
Eigenvectors corresponding to distinct eigenvalues are linearly independent.
We prove this by induction on the number of distinct eigenvalues $k$.
Base case (k=1): A single eigenvector is linearly independent.
Inductive step: Assume the result holds for $k - 1$. Suppose

$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0$$

where $v_i$ is an eigenvector for $\lambda_i$, with distinct $\lambda_1, \ldots, \lambda_k$.
Apply $T$:

$$c_1 \lambda_1 v_1 + \cdots + c_k \lambda_k v_k = 0$$

Multiply the original relation by $\lambda_k$:

$$c_1 \lambda_k v_1 + \cdots + c_k \lambda_k v_k = 0$$

Subtract:

$$c_1(\lambda_1 - \lambda_k) v_1 + \cdots + c_{k-1}(\lambda_{k-1} - \lambda_k) v_{k-1} = 0$$

By the induction hypothesis and $\lambda_i \neq \lambda_k$ for $i < k$, we get $c_1 = \cdots = c_{k-1} = 0$, hence $c_k v_k = 0$ and so $c_k = 0$.
An $n \times n$ matrix has at most $n$ distinct eigenvalues.
If $B = P^{-1}AP$ (that is, $A$ and $B$ are similar), then $A$ and $B$ have the same characteristic polynomial, hence the same eigenvalues (with multiplicities): $\det(B - \lambda I) = \det(P^{-1}(A - \lambda I)P) = \det(A - \lambda I)$.
If $B = P^{-1}AP$ and $x$ is an eigenvector of $B$ for $\lambda$, then $Px$ is an eigenvector of $A$ for the same $\lambda$, since $A(Px) = P(Bx) = \lambda(Px)$.
If $\lambda \neq 0$ is an eigenvalue of $AB$, then $\lambda$ is also an eigenvalue of $BA$.
Let $ABx = \lambda x$ with $x \neq 0$. Then $BA(Bx) = B(ABx) = \lambda(Bx)$.
We need $Bx \neq 0$: if $Bx = 0$, then $\lambda x = ABx = 0$. Since $\lambda \neq 0$ and $x \neq 0$, we have $\lambda x \neq 0$, a contradiction.
If $\lambda$ is an eigenvalue of $A$ with eigenvector $v$, then:
(1) $c\lambda$ is an eigenvalue of $cA$;
(2) $\lambda + c$ is an eigenvalue of $A + cI$;
(3) $\lambda^k$ is an eigenvalue of $A^k$.
In all cases, $v$ remains an eigenvector.
For (2): $(A + cI)v = Av + cv = \lambda v + cv = (\lambda + c)v$
For (3): $A^k v = A^{k-1}(Av) = \lambda A^{k-1} v = \cdots = \lambda^k v$
If $A$ is invertible with eigenvalue $\lambda$ (necessarily $\lambda \neq 0$), then $\lambda^{-1}$ is an eigenvalue of $A^{-1}$.
The eigenvector is unchanged.
From $Av = \lambda v$: multiplying both sides by $A^{-1}$ gives $v = \lambda A^{-1} v$.
Since $\lambda \neq 0$ when $A$ is invertible: $A^{-1}v = \lambda^{-1} v$.
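These transformation rules are easy to confirm numerically. The sketch below reuses the matrix from the earlier worked example (eigenvalues 2 and 3) and compares the eigenvalues of $A^2$, $A + 5I$, and $A^{-1}$ against $\lambda^2$, $\lambda + 5$, and $1/\lambda$; the shift $c = 5$ is an arbitrary choice.

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])          # eigenvalues 2 and 3
lams = np.linalg.eigvals(A)

# Squares, shifts, and reciprocals of the eigenvalues match the rules above.
print(np.sort(np.linalg.eigvals(A @ A)),              np.sort(lams**2))    # [4. 9.] [4. 9.]
print(np.sort(np.linalg.eigvals(A + 5*np.eye(2))),    np.sort(lams + 5))   # [7. 8.] [7. 8.]
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))),   np.sort(1/lams))     # [0.33~ 0.5] twice
```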
$A$ and $A^T$ have the same eigenvalues (but generally different eigenvectors), since $\det(A^T - \lambda I) = \det((A - \lambda I)^T) = \det(A - \lambda I)$.
Based on eigenvalue constraints from matrix equations:
If $A^2 = A$ (idempotent) and $Av = \lambda v$ with $v \neq 0$:

$$\lambda^2 v = A^2 v = Av = \lambda v$$

So $\lambda^2 = \lambda$, giving $\lambda \in \{0, 1\}$.
Given a 3×3 matrix with eigenvalues $1, -2, -1$, find its trace and determinant: $\mathrm{tr}(A) = 1 + (-2) + (-1) = -2$ and $\det(A) = (1)(-2)(-1) = 2$.
Eigenvectors must be non-zero by definition. The zero vector trivially satisfies $A \cdot 0 = \lambda \cdot 0$ for any $\lambda$.
$\lambda = 0$ is a valid eigenvalue exactly when $\det(A) = 0$. This means $A$ is singular (non-invertible).
Algebraic multiplicity = root multiplicity in the characteristic polynomial. Geometric multiplicity = $\dim(E_\lambda)$. Always: geometric ≤ algebraic.
FALSE! Similar matrices have the same eigenvalues, but their eigenvectors differ by the change-of-basis matrix.
FALSE over $\mathbb{R}$! Rotation matrices often have no real eigenvalues. Over $\mathbb{C}$, every matrix has eigenvalues.
Solutions to $x' = Ax$ involve $e^{\lambda t}$, where $\lambda$ ranges over the eigenvalues.
A system is stable iff all $|\lambda| < 1$ (discrete time) or all $\mathrm{Re}(\lambda) < 0$ (continuous time).
Eigenvalues of covariance matrix reveal variance along principal directions.
Observable quantities are eigenvalues of Hermitian operators.
Page importance is the eigenvector for eigenvalue 1 of the web link matrix.
Natural frequencies of vibrating systems are related to eigenvalues of mass-stiffness matrices.
The algebraic multiplicity of eigenvalue $\lambda$ is its multiplicity as a root of the characteristic polynomial $\det(A - \lambda I)$.
The geometric multiplicity of eigenvalue $\lambda$ is $\dim E_\lambda = \dim \ker(A - \lambda I)$, the maximal number of linearly independent eigenvectors for $\lambda$.
For any eigenvalue $\lambda$:

$$1 \le \text{geometric multiplicity} \le \text{algebraic multiplicity}$$
For the Jordan block $A = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$:
Characteristic polynomial: $(\lambda - 2)^2$
Eigenvalue $\lambda = 2$ with algebraic multiplicity = 2
Eigenspace: $E_2 = \ker\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \mathrm{span}\{(1, 0)^T\}$
Geometric multiplicity = 1 < 2 = algebraic multiplicity
When geometric multiplicity < algebraic multiplicity, the eigenvalue is called defective. Such matrices cannot be diagonalized.
For $B = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = 2I$:
Same characteristic polynomial: $(\lambda - 2)^2$
But $B - 2I = 0$, so $\ker(B - 2I) = \mathbb{R}^2$ and the geometric multiplicity = 2
This matrix IS diagonalizable (it's already diagonal!).
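Geometric multiplicity can be computed as $n - \mathrm{rank}(A - \lambda I)$. Here is a small sketch contrasting the two matrices above; the helper `geometric_multiplicity` is our own, not a library function.

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """dim ker(A - lam*I) = n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

J = np.array([[2.0, 1.0], [0.0, 2.0]])   # Jordan block: defective
B = np.array([[2.0, 0.0], [0.0, 2.0]])   # scalar matrix: diagonalizable

print(geometric_multiplicity(J, 2.0))    # 1  (< algebraic multiplicity 2)
print(geometric_multiplicity(B, 2.0))    # 2  (= algebraic multiplicity 2)
```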
Find all eigenvalues and eigenvectors of $A = \begin{pmatrix} 1 & 2 \\ 3 & 2 \end{pmatrix}$.
Step 1: Characteristic polynomial

$$\det(A - \lambda I) = \lambda^2 - 3\lambda - 4 = (\lambda - 4)(\lambda + 1)$$

Step 2: Eigenvalues: $\lambda_1 = 4$, $\lambda_2 = -1$
Step 3: Eigenvectors for $\lambda = 4$: solve $(A - 4I)x = 0$, i.e. $\begin{pmatrix} -3 & 2 \\ 3 & -2 \end{pmatrix}x = 0$
Eigenvector: $(2, 3)^T$
Step 4: Eigenvectors for $\lambda = -1$: solve $(A + I)x = 0$, i.e. $\begin{pmatrix} 2 & 2 \\ 3 & 3 \end{pmatrix}x = 0$
Eigenvector: $(1, -1)^T$
Verification: tr(A) = 3 = 4 + (-1) ✓, det(A) = -4 = 4 × (-1) ✓
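The characteristic-polynomial route can also be automated: `np.poly` applied to a square matrix returns the coefficients of its characteristic polynomial, and `np.roots` finds the eigenvalues.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 2.0]])
coeffs = np.poly(A)       # characteristic polynomial coefficients, highest degree first
print(coeffs)             # [ 1. -3. -4.]  i.e. x^2 - 3x - 4
print(np.roots(coeffs))   # [ 4. -1.]
```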
For $A = uv^T$, where $u, v \in \mathbb{R}^3$ are non-zero column vectors with $v^T u = 0$:
Key observation: $\mathrm{rank}(A) = 1$, so $\dim(\ker(A)) = 2$, meaning the eigenvalue $0$ has geometric multiplicity 2, hence algebraic multiplicity at least 2.
Since $\mathrm{tr}(A) = v^T u = 0$, the three eigenvalues sum to 0; with two of them being 0, the third is also 0.
More directly, $A^2 = u(v^T u)v^T = 0$, so $A^2 = 0$.
This means $A$ is nilpotent, so all eigenvalues are 0.
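A concrete instance of this construction, with an arbitrarily chosen orthogonal pair $u \perp v$:

```python
import numpy as np

# Rank-one A = u v^T with v^T u = 0, so A @ A = (v^T u) * A = 0.
u = np.array([[1.0], [2.0], [0.0]])
v = np.array([[2.0], [-1.0], [3.0]])
print(v.T @ u)                   # [[0.]]: u and v are orthogonal

A = u @ v.T                      # 3x3, rank 1, trace 0
print(np.trace(A))               # 0.0
print(np.allclose(A @ A, 0))     # True: A is nilpotent
print(np.linalg.eigvals(A))      # all (numerically) 0
```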
For a block upper triangular matrix $A = \begin{pmatrix} B & D \\ 0 & C \end{pmatrix}$: $\det(A - \lambda I) = \det(B - \lambda I)\det(C - \lambda I)$, so the eigenvalues of $A$ are the eigenvalues of $B$ together with those of $C$; here the union is $\{1, 3, 5\}$.
Find the eigenvalues of $A$, of $A + I$, and of $A^{-1}$, given that $A$ is 3×3 with $A^2 = A + 2I$ and $\det(A) = 2$.
Step 1: From $A^2 = A + 2I$, the eigenvalues satisfy $\lambda^2 = \lambda + 2$, i.e. $(\lambda - 2)(\lambda + 1) = 0$
So $\lambda \in \{2, -1\}$
Step 2: Since $\det(A) = 2$ and $A$ is 3×3, the eigenvalue product is 2
Possible: $(-1)(-1)(2) = 2$ ✓ (while $2 \cdot 2 \cdot \lambda = 2$ would force $\lambda = \tfrac{1}{2} \notin \{2, -1\}$)
So the eigenvalues are $-1, -1, 2$
Step 3: Eigenvalues of $A + I$: $0, 0, 3$
Step 4: Eigenvalues of $A^{-1}$: $-1, -1, \tfrac{1}{2}$
Answer: $A$ has eigenvalues $-1, -1, 2$; $A + I$ has eigenvalues $0, 0, 3$; $A^{-1}$ has eigenvalues $-1, -1, \tfrac{1}{2}$.
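One way to sanity-check this reasoning is to build a concrete matrix with these eigenvalues, here a similarity transform of $\mathrm{diag}(-1, -1, 2)$ by an arbitrarily chosen invertible $P$, and confirm it satisfies the given constraints:

```python
import numpy as np

# A matrix similar to diag(-1, -1, 2); P is an arbitrary invertible choice.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = P @ np.diag([-1.0, -1.0, 2.0]) @ np.linalg.inv(P)

print(np.allclose(A @ A, A + 2*np.eye(3)))   # True: A^2 = A + 2I
print(np.linalg.det(A))                      # ~2.0
print(np.sort(np.linalg.eigvals(A)))         # [-1. -1.  2.]
```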
Every eigenvalue of $A$ lies in at least one Gershgorin disk:

$$D_i = \{z \in \mathbb{C} : |z - a_{ii}| \le R_i\}, \qquad R_i = \sum_{j \neq i} |a_{ij}|$$

Each disk is centered at a diagonal entry $a_{ii}$, with radius $R_i$ equal to the sum of the absolute values of the off-diagonal entries in that row.
For $A = \begin{pmatrix} 2 & 1 & 0 \\ 1 & 3 & 1 \\ 0 & 1 & 2 \end{pmatrix}$: the disks are centered at $2, 3, 2$ with radii $1, 2, 1$, giving $[1, 3]$, $[1, 5]$, $[1, 3]$.
All eigenvalues lie in the union $[1, 3] \cup [1, 5] = [1, 5]$.
If $|a_{ii}| > \sum_{j \neq i} |a_{ij}|$ for all $i$ (strict diagonal dominance), then every Gershgorin disk excludes 0, so the matrix is invertible.
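A short sketch computing the disks for the example matrix above; the helper `gershgorin_disks` is our own, not a library function.

```python
import numpy as np

def gershgorin_disks(A):
    """Return (center, radius) for each row's Gershgorin disk."""
    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)
    return list(zip(centers, radii))

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(gershgorin_disks(A))            # centers 2, 3, 2 with radii 1, 2, 1
print(np.sort(np.linalg.eigvals(A)))  # eigenvalues 1, 2, 4: all inside [1, 5]
```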
Problem 1
If A is a 4×4 matrix with det(A) = -12 and eigenvalues 1, 2, λ, μ, find λμ.
Answer: λμ = -12/(1×2) = -6
Problem 2
Prove: If A² = 0, then all eigenvalues of A are 0.
Hint: If Av = λv, then A²v = λ²v = 0, so λ² = 0.
Problem 3
Find the eigenvalues of the cyclic permutation matrix $A = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}$.
Hint: Note $A^3 = I$. Eigenvalues satisfy $\lambda^3 = 1$, so over $\mathbb{C}$ they are the three cube roots of unity.
Problem 4
If A has eigenvalue 3 with eigenvector v, what are the eigenvalues and eigenvectors of 2A - 5I?
Answer: Eigenvalue 2(3) - 5 = 1, same eigenvector v.
Problem 5 (Challenge)
Prove: A and Aᵀ have the same eigenvalues but generally different eigenvectors.
This module introduced eigenvalues and eigenvectors, the characteristic values and invariant directions of linear transformations. The eigenvalue equation $Av = \lambda v$ leads to the characteristic polynomial $\det(A - \lambda I) = 0$ for finding eigenvalues.
Over $\mathbb{C}$, every $n \times n$ matrix has exactly $n$ eigenvalues (counting multiplicity), by the Fundamental Theorem of Algebra.
A real matrix may have fewer than $n$ real eigenvalues, or none at all; its complex eigenvalues occur in conjugate pairs.
For a real matrix, if $\lambda$ is an eigenvalue with eigenvector $v$, then $\bar{\lambda}$ is also an eigenvalue, with eigenvector $\bar{v}$.
For rotation by angle $\theta$:

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$

Characteristic polynomial: $\lambda^2 - 2\cos\theta\,\lambda + 1 = 0$
Solutions: $\lambda = \cos\theta \pm i\sin\theta = e^{\pm i\theta}$
A real symmetric matrix has all real eigenvalues, and eigenvectors for distinct eigenvalues are orthogonal.
Real symmetric matrices are extremely important in applications: covariance matrices in statistics, quadratic forms, and mass-stiffness matrices in vibration analysis all give rise to symmetric matrices.
For matrices larger than 4×4, the characteristic-polynomial approach is impractical: by the Abel-Ruffini theorem there is no general closed-form formula for the roots of polynomials of degree 5 or higher, and polynomial root-finding is numerically unstable. Iterative methods such as the QR algorithm are used instead.
To find the largest eigenvalue (in absolute value) of $A$, iterate:

$$x_{k+1} = \frac{A x_k}{\|A x_k\|}$$

This converges if $|\lambda_1| > |\lambda_2| \ge \cdots$ (the dominant eigenvalue is unique), with $x_k$ tending to a dominant eigenvector.
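A minimal power-iteration sketch, assuming a fixed iteration count rather than a convergence test; the function name and parameters are illustrative choices.

```python
import numpy as np

def power_iteration(A, num_iters=500, seed=0):
    """Estimate the dominant eigenpair by repeatedly applying A and normalizing."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(num_iters):
        x = A @ x
        x = x / np.linalg.norm(x)      # keep the iterate at unit length
    lam = x @ A @ x                    # Rayleigh quotient estimate of lambda_1
    return lam, x

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, x = power_iteration(A)
print(lam)                               # ~4.0, the dominant eigenvalue
print(max(abs(np.linalg.eigvals(A))))    # 4.0, agrees with the direct computation
```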
| Property | Eigenvalue connection | Notes |
|---|---|---|
| Invertibility | A is invertible iff no eigenvalue is 0 | |
| Trace | Sum of diagonal = sum of eigenvalues | |
| Rank | rank(A) = n - (# of zero eigenvalues) | For diagonalizable matrices |
| Spectral norm | Largest singular value | |
| Condition number | $\sigma_{\max}/\sigma_{\min}$ | Measures numerical stability |
| Matrix powers | $A^k = PD^kP^{-1}$ | For diagonalizable A = PDP⁻¹ |
| Matrix Type | Eigenvalues | Notes |
|---|---|---|
| Identity I | All 1 | Every non-zero vector is an eigenvector |
| Zero matrix | All 0 | Every non-zero vector is an eigenvector |
| Diagonal | Diagonal entries | Standard basis are eigenvectors |
| Triangular | Diagonal entries | Eigenvectors may differ from standard basis |
| Projection (P² = P) | 0 or 1 only | E₁ = image, E₀ = kernel |
| Reflection (P² = I) | ±1 only | E₁ = mirror, E₋₁ = perpendicular |
| Rotation (2D) | Real only for θ = 0, π | Complex eigenvalues $e^{\pm i\theta}$ |
| Nilpotent (Aᵏ = 0) | All 0 | Not diagonalizable if A ≠ 0 |
Etymology: "Eigen" is German for "own" or "characteristic." Eigenvalues are the "characteristic values" of a transformation.
Euler (1750s): Studied principal axes of rotation of rigid bodies, which are eigenvectors of the inertia tensor.
Cauchy (1826): First systematic study of eigenvalues in the context of quadratic forms and symmetric matrices. Proved that real symmetric matrices have real eigenvalues.
Sylvester (1852): Coined the term "matrix" and studied eigenvalues in the context of invariant theory.
Cayley (1858): Introduced matrices as algebraic objects and proved the Cayley-Hamilton theorem.
Hilbert (1904): Extended eigenvalue theory to infinite-dimensional spaces, founding spectral theory.
Modern Era: Eigenvalue computation became crucial with computers. The QR algorithm (1960s) remains the standard method.
If $p(A) = 0$ for some polynomial $p$, then every eigenvalue $\lambda$ of $A$ satisfies $p(\lambda) = 0$.
If $Av = \lambda v$ with $v \neq 0$, then $p(A)v = p(\lambda)v$.
Since $p(A) = 0$, we have $p(\lambda)v = 0$.
Since $v \neq 0$, we must have $p(\lambda) = 0$.
If $A^3 = A$, what are the possible eigenvalues?
Solution: Eigenvalues satisfy $\lambda^3 = \lambda$
Factor: $\lambda(\lambda - 1)(\lambda + 1) = 0$
Possible eigenvalues: $\lambda \in \{0, 1, -1\}$
Given a 3×3 matrix $A$ with $A^2 = A$, $\det(A) = 0$, and $\mathrm{tr}(A) = 2$.
Step 1: From $A^2 = A$, every eigenvalue lies in {0, 1}
Step 2: From $\det(A) = 0$, at least one eigenvalue is 0
Step 3: From $\mathrm{tr}(A) = 2$, the eigenvalues sum to 2
Conclusion: The eigenvalues must be 0, 1, 1
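To check, here is one concrete matrix satisfying all three constraints: the orthogonal projection onto the plane with normal $(1, 1, 1)$ (our choice; any rank-2 orthogonal projection in $\mathbb{R}^3$ would do).

```python
import numpy as np

# Orthogonal projection onto the plane with normal n = (1, 1, 1):
# idempotent, singular, trace 2 -- matching the constraints above.
n = np.array([[1.0], [1.0], [1.0]])
A = np.eye(3) - (n @ n.T) / (n.T @ n)

print(np.allclose(A @ A, A))                         # True: A^2 = A
print(round(np.linalg.det(A), 10))                   # ~0.0
print(round(np.trace(A), 10))                        # 2.0
print(np.round(np.sort(np.linalg.eigvals(A)), 6))    # [0. 1. 1.]
```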
The characteristic polynomial satisfies $p_A(A) = 0$ (the Cayley-Hamilton theorem). But there may be lower-degree polynomials that also annihilate $A$; the minimal polynomial is the lowest-degree such polynomial.
For any matrix $A$ and induced norm $\|\cdot\|$:

$$|\lambda| \le \|A\| \quad \text{for every eigenvalue } \lambda$$

Every eigenvalue has absolute value at most the matrix norm.
If $Av = \lambda v$ with $v \neq 0$:

$$|\lambda|\,\|v\| = \|\lambda v\| = \|Av\| \le \|A\|\,\|v\|$$

Dividing by $\|v\| > 0$: $|\lambda| \le \|A\|$.
Using the infinity norm: $\|A\|_\infty = \max_i \sum_j |a_{ij}|$ (maximum absolute row sum).
Using the 1-norm: $\|A\|_1 = \max_j \sum_i |a_{ij}|$ (maximum absolute column sum).
For $A = \begin{pmatrix} 2 & 1 & 0 \\ 1 & 3 & 1 \\ 0 & 1 & 2 \end{pmatrix}$:
Row sums: 3, 5, 3. So $\|A\|_\infty = 5$, giving $|\lambda| \le 5$.
Column sums: 3, 5, 3. Same bound.
Gershgorin disks: $[1, 3]$, $[1, 5]$, $[1, 3]$. Union: $[1, 5]$, a sharper localization than $|\lambda| \le 5$.
The spectral radius of $A$ is:

$$\rho(A) = \max\{|\lambda| : \lambda \text{ is an eigenvalue of } A\}$$
A symmetric matrix is positive definite if all its eigenvalues are positive, equivalently if $x^T A x > 0$ for all $x \neq 0$.
For symmetric matrices: $\rho(A) = \|A\|_2$, so the spectral radius equals the spectral norm.
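For symmetric matrices, `np.linalg.eigvalsh` returns the (real) eigenvalues, so positive definiteness reduces to checking their signs; the helper below and its tolerance are illustrative choices.

```python
import numpy as np

def is_positive_definite(A, tol=1e-12):
    """Symmetric A is positive definite iff all eigenvalues exceed 0."""
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

print(is_positive_definite(np.array([[2.0, 1.0], [1.0, 2.0]])))   # True  (eigenvalues 1, 3)
print(is_positive_definite(np.array([[1.0, 2.0], [2.0, 1.0]])))   # False (eigenvalues -1, 3)
```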
Challenge 1
Prove: If A is nilpotent (Aᵏ = 0 for some k), then tr(A) = det(A) = 0.
Challenge 2
If A and B are n×n matrices with AB = BA, and v is an eigenvector of A, prove that Bv is also an eigenvector of A (for the same eigenvalue) or Bv = 0.
Challenge 3
Show that a real matrix with all real eigenvalues need not be symmetric: give a counterexample.
Challenge 4
For 3×3 matrix A with eigenvalues 1, 2, 3, find det(A³ - 6A² + 11A - 6I).
Hint: Factor the polynomial using the eigenvalues.
Eigenvalues are the foundation for understanding matrix structure and behavior. With eigenvalue fundamentals mastered, you're ready for diagonalization and the spectral theory of symmetric matrices.
To recap, for any square matrix $A$:
Characteristic polynomial: $p_A(\lambda) = \det(A - \lambda I)$
Eigenvalues: the roots of $p_A(\lambda)$
Sum: $\sum_i \lambda_i = \mathrm{tr}(A)$
Product: $\prod_i \lambda_i = \det(A)$
Congratulations! You've completed the comprehensive introduction to eigenvalues and eigenvectors. This foundational knowledge prepares you for the rich theory of matrix analysis ahead.
An eigenvector's direction is preserved by the linear map. It may be stretched (|λ| > 1), shrunk (|λ| < 1), or flipped (λ < 0), but its direction stays the same. Eigenvectors define 'invariant directions' of the transformation.
Yes! Over ℂ, every n×n matrix has exactly n eigenvalues (counting multiplicity) by the Fundamental Theorem of Algebra. Over ℝ, some matrices have no real eigenvalues, like 90° rotation matrices.
No! By definition, eigenvectors must be non-zero. The zero vector satisfies Av = λv for all λ, so it would make the definition meaningless. However, 0 is a valid eigenvalue!
Then there are multiple linearly independent eigenvectors for that eigenvalue. The dimension of the eigenspace is called the geometric multiplicity. It's always ≤ the algebraic multiplicity (root multiplicity in characteristic polynomial).
Product of all eigenvalues (with multiplicity) = det(A). Sum of all eigenvalues (with multiplicity) = trace(A) = sum of diagonal elements. These follow from the characteristic polynomial.
Eigenvalues reveal fundamental properties: invertibility (no zero eigenvalue), stability (all |λ| < 1), growth rates, and enable diagonalization for easier computation of matrix powers and exponentials.
Algebraic multiplicity = number of times λ appears as root of det(A - λI) = 0. Geometric multiplicity = dim(ker(A - λI)) = number of linearly independent eigenvectors. Always: geometric ≤ algebraic.
They have the same nonzero eigenvalues! If λ ≠ 0 is an eigenvalue of AB, it's also an eigenvalue of BA. The zero eigenvalue may differ in multiplicity.
If λ is an eigenvalue of A with eigenvector v, then λⁿ is an eigenvalue of Aⁿ with the same eigenvector v. This follows from Aⁿv = Aⁿ⁻¹(Av) = Aⁿ⁻¹(λv) = λAⁿ⁻¹v = ... = λⁿv.
The eigenvalues of a triangular (upper or lower) matrix are exactly its diagonal entries. This is because det(A - λI) factors as a product of (aᵢᵢ - λ) terms.