MathIsimple
LA-6.2

Characteristic Polynomial

The characteristic polynomial encodes all eigenvalues and their multiplicities in a single polynomial expression.

This fundamental object connects eigenvalue theory to polynomial algebra, providing computational methods for finding eigenvalues and deep theoretical insights about matrix structure.

Learning Objectives
  • Define the characteristic polynomial for matrices and linear operators
  • Compute eigenvalues by solving the characteristic equation det(A - λI) = 0
  • Understand algebraic multiplicity as root multiplicity in char poly
  • Understand geometric multiplicity as dimension of eigenspace
  • Prove the inequality: geometric ≤ algebraic multiplicity
  • Apply Vieta's formulas: sum = trace, product = determinant
  • Understand implications of the Fundamental Theorem of Algebra
  • Compute characteristic polynomials for special matrices (triangular, block)
Prerequisites
  • Eigenvalue and eigenvector definitions (LA-6.1)
  • Determinants and their properties (LA-5.1-5.5)
  • Polynomial factorization and roots
  • Complex numbers and conjugate pairs

1. Characteristic Polynomial

In the previous section, we saw that eigenvalues are scalars \lambda for which A - \lambda I is singular. This condition \det(A - \lambda I) = 0 defines a polynomial equation whose roots are precisely the eigenvalues.

Definition 6.3: Characteristic Polynomial (Matrix)

For A \in M_n(F), the characteristic polynomial is:

\chi_A(\lambda) = \det(\lambda I - A)

This is a monic polynomial of degree n in \lambda.

Remark 6.5: Convention

Some texts define \chi_A(\lambda) = \det(A - \lambda I) instead. The two conventions differ by a factor of (-1)^n but have exactly the same roots. We use \det(\lambda I - A) so that the polynomial is monic (leading coefficient 1).

Definition 6.4: Characteristic Polynomial (Linear Operator)

For a linear operator T \in L(V) on a finite-dimensional space V, the characteristic polynomial is defined as the characteristic polynomial of any matrix representation:

\chi_T(\lambda) = \det(\lambda I - A)

where A is the matrix of T in any basis. This is well-defined because similar matrices have the same characteristic polynomial.

Theorem 6.6: Eigenvalue Characterization

\lambda_0 is an eigenvalue of A if and only if \chi_A(\lambda_0) = 0.

Proof:

\lambda_0 is an eigenvalue ⟺ \exists v \neq 0: Av = \lambda_0 v

⟺ (A - \lambda_0 I)v = 0 has a non-trivial solution

⟺ A - \lambda_0 I is singular

⟺ \det(A - \lambda_0 I) = 0

⟺ \chi_A(\lambda_0) = 0

Example 6.3: 2×2 Example

For A = \begin{pmatrix} 4 & 2 \\ 1 & 3 \end{pmatrix}:

\chi_A(\lambda) = \det\begin{pmatrix} \lambda - 4 & -2 \\ -1 & \lambda - 3 \end{pmatrix} = (\lambda - 4)(\lambda - 3) - 2 = \lambda^2 - 7\lambda + 10 = (\lambda - 2)(\lambda - 5)

Eigenvalues: \lambda_1 = 2, \lambda_2 = 5.

Check: tr(A) = 7 = 2 + 5 ✓, det(A) = 10 = 2 × 5 ✓
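The 2×2 computation above is easy to script; a minimal pure-Python sketch (the helper name `char_poly_2x2` is ours, not standard):

```python
def char_poly_2x2(A):
    # chi_A(lambda) = lambda^2 - tr(A) lambda + det(A), monic convention
    a, b = A[0]
    c, d = A[1]
    return [1, -(a + d), a * d - b * c]   # coefficients [1, -tr, det]

coeffs = char_poly_2x2([[4, 2], [1, 3]])
print(coeffs)   # [1, -7, 10]: lambda^2 - 7 lambda + 10 = (lambda - 2)(lambda - 5)
```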

Example 6.4: 3×3 Example

For the upper triangular matrix A = \begin{pmatrix} 2 & 1 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{pmatrix}:

\chi_A(\lambda) = (\lambda - 2)^3

Single eigenvalue \lambda = 2 with algebraic multiplicity 3.

2. Algebraic and Geometric Multiplicity

When an eigenvalue appears as a repeated root of the characteristic polynomial, we need to distinguish between how many times it appears as a root versus how many independent eigenvectors it has.

Definition 6.5: Algebraic Multiplicity

The algebraic multiplicity of eigenvalue \lambda_0, denoted a(\lambda_0) or m_a(\lambda_0), is the largest integer k such that (\lambda - \lambda_0)^k divides \chi_A(\lambda).

Equivalently, it is the exponent of (\lambda - \lambda_0) in the factored form of \chi_A.

Definition 6.6: Geometric Multiplicity

The geometric multiplicity of eigenvalue \lambda_0, denoted g(\lambda_0) or m_g(\lambda_0), is:

g(\lambda_0) = \dim(E_{\lambda_0}) = \dim\ker(A - \lambda_0 I) = n - \text{rank}(A - \lambda_0 I)

This is the number of linearly independent eigenvectors for \lambda_0.

Theorem 6.7: Multiplicity Inequality

For every eigenvalue \lambda_0:

1 \leq g(\lambda_0) \leq a(\lambda_0)
Proof:

Lower bound (g \geq 1): If \lambda_0 is an eigenvalue, there exists at least one eigenvector, so g \geq 1.

Upper bound (g \leq a): Let g = \dim E_{\lambda_0}. Choose a basis v_1, \ldots, v_g of E_{\lambda_0} and extend it to a basis of F^n.

In this basis, the matrix of A has the block form:

\begin{pmatrix} \lambda_0 I_g & B \\ 0 & C \end{pmatrix}

Then \chi_A(\lambda) = (\lambda - \lambda_0)^g \cdot \chi_C(\lambda), so a(\lambda_0) \geq g.

Example 6.5: Equal Multiplicities

For A = \begin{pmatrix} 3 & 0 \\ 0 & 3 \end{pmatrix}:

\chi_A(\lambda) = (\lambda - 3)^2, so a(3) = 2

E_3 = \ker(A - 3I) = \ker(0) = \mathbb{R}^2, so g(3) = 2

Here g = a, and the matrix is diagonalizable (it is already diagonal!).

Example 6.6: Unequal Multiplicities (Defective Matrix)

For A = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}:

\chi_A(\lambda) = (\lambda - 3)^2, so a(3) = 2

E_3 = \ker\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \text{span}\{(1, 0)^T\}, so g(3) = 1

Here g < a. This matrix is defective and cannot be diagonalized.
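The gap between the two multiplicities can be checked numerically via g(\lambda_0) = n - rank(A - \lambda_0 I). A pure-Python sketch using exact rational row reduction (function names are ours):

```python
from fractions import Fraction

def rank(M):
    # Gaussian elimination over the rationals; counts pivot rows.
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [M[i][j] - f * M[r][j] for j in range(cols)]
        r += 1
    return r

def geometric_multiplicity(A, lam):
    # g(lam) = n - rank(A - lam*I) = dim ker(A - lam*I)
    n = len(A)
    shifted = [[A[i][j] - (lam if i == j else 0) for j in range(n)] for i in range(n)]
    return n - rank(shifted)

print(geometric_multiplicity([[3, 1], [0, 3]], 3))  # 1: defective, g < a
print(geometric_multiplicity([[3, 0], [0, 3]], 3))  # 2: g = a, diagonalizable
```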

Remark 6.6: Significance for Diagonalization

A matrix is diagonalizable if and only if g(\lambda) = a(\lambda) for every eigenvalue. When g < a, there are not enough eigenvectors to form a basis.

Corollary 6.2: Sum of Algebraic Multiplicities

The sum of all algebraic multiplicities equals n (the size of the matrix), counting eigenvalues over \mathbb{C}, where \chi_A splits completely:

\sum_{i} a(\lambda_i) = n
Corollary 6.3: Sum of Geometric Multiplicities

The sum of all geometric multiplicities is at most n:

\sum_{i} g(\lambda_i) \leq n

Equality holds if and only if the matrix is diagonalizable.

3. Coefficients and Vieta's Formulas

The coefficients of the characteristic polynomial have beautiful interpretations in terms of the matrix entries and eigenvalues.

Theorem 6.8: Characteristic Polynomial Coefficients

For A = (a_{ij}) \in M_n(F), write:

\chi_A(\lambda) = \lambda^n + c_1\lambda^{n-1} + c_2\lambda^{n-2} + \cdots + c_{n-1}\lambda + c_n

Then:

  • c_1 = -\text{tr}(A) = -(a_{11} + a_{22} + \cdots + a_{nn})
  • c_n = (-1)^n \det(A)
  • c_k = (-1)^k × (sum of all k×k principal minors)
Theorem 6.9: Vieta's Formulas for Matrices

If \lambda_1, \ldots, \lambda_n are the eigenvalues (with multiplicity):

\sum_{i=1}^{n} \lambda_i = \text{tr}(A), \quad \prod_{i=1}^{n} \lambda_i = \det(A)

More generally:

  • \sum_{i} \lambda_i = \text{tr}(A) (sum of eigenvalues = trace)
  • \sum_{i < j} \lambda_i \lambda_j = sum of 2×2 principal minors
  • \prod_{i} \lambda_i = \det(A) (product of eigenvalues = determinant)
Proof:

Since \chi_A(\lambda) = (\lambda - \lambda_1)(\lambda - \lambda_2) \cdots (\lambda - \lambda_n), expanding and comparing coefficients:

Coefficient of \lambda^{n-1}: -(\lambda_1 + \cdots + \lambda_n) = -\text{tr}(A)

Constant term: (-1)^n \lambda_1 \cdots \lambda_n = (-1)^n \det(A)

Example 6.7: Using Vieta's Formulas

Given \chi_A(\lambda) = \lambda^3 - 6\lambda^2 + 11\lambda - 6, find the eigenvalues, trace, and determinant.

Solution: Factor: (\lambda - 1)(\lambda - 2)(\lambda - 3)

Eigenvalues: 1, 2, 3

tr(A) = 1 + 2 + 3 = 6 ✓ (check: -c_1 = 6)

det(A) = 1 × 2 × 3 = 6 ✓ (check: (-1)^3 c_3 = 6)

Example 6.8: Finding Missing Eigenvalue

A 3×3 matrix has eigenvalues 2, 5, and \lambda_3. If tr(A) = 10, find \lambda_3.

Solution: 2 + 5 + \lambda_3 = 10 \Rightarrow \lambda_3 = 3
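Examples 6.7 and 6.8 amount to one-line arithmetic checks; a sketch:

```python
from math import prod

# Example 6.7: chi_A = lambda^3 - 6 lambda^2 + 11 lambda - 6 = (l-1)(l-2)(l-3)
eigenvalues = [1, 2, 3]
trace = sum(eigenvalues)           # tr(A) = 6, the negative of c1
determinant = prod(eigenvalues)    # det(A) = 6, (-1)^n times the constant term
print(trace, determinant)          # 6 6

# Example 6.8: recover a missing eigenvalue from the trace
known, tr_A = [2, 5], 10
missing = tr_A - sum(known)        # lambda_3 = 3
```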

Remark 6.7: Practical Applications

These formulas are powerful for:

  • Checking eigenvalue computations
  • Finding missing eigenvalues when some are known
  • Determining if 0 is an eigenvalue (iff det = 0)
  • Quick estimates of eigenvalue properties

4. Characteristic Polynomials of Special Matrices

Theorem 6.10: Triangular Matrices

For a triangular (upper or lower) matrix A with diagonal entries a_{11}, \ldots, a_{nn}:

\chi_A(\lambda) = (\lambda - a_{11})(\lambda - a_{22}) \cdots (\lambda - a_{nn})

The eigenvalues are exactly the diagonal entries.

Proof:

For upper triangular A, \lambda I - A is also upper triangular. The determinant of a triangular matrix is the product of its diagonal entries:

\det(\lambda I - A) = (\lambda - a_{11})(\lambda - a_{22}) \cdots (\lambda - a_{nn})
Theorem 6.11: Block Diagonal Matrices

For block diagonal A = \text{diag}(A_1, A_2, \ldots, A_k):

\chi_A(\lambda) = \chi_{A_1}(\lambda) \cdot \chi_{A_2}(\lambda) \cdots \chi_{A_k}(\lambda)
Example 6.9: Block Diagonal

For A = \begin{pmatrix} 1 & 2 & 0 & 0 \\ 3 & 4 & 0 & 0 \\ 0 & 0 & 5 & 0 \\ 0 & 0 & 0 & 6 \end{pmatrix}:

\chi_A(\lambda) = \chi_{\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}}(\lambda) \cdot (\lambda - 5)(\lambda - 6)

Theorem 6.12: Similar Matrices

If B = P^{-1}AP, then \chi_B(\lambda) = \chi_A(\lambda).

Proof:
\chi_B(\lambda) = \det(\lambda I - P^{-1}AP) = \det(P^{-1}(\lambda I - A)P) = \det(P^{-1})\det(\lambda I - A)\det(P) = \chi_A(\lambda)
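Theorem 6.12 can be spot-checked by conjugating a matrix by an invertible P and comparing characteristic polynomials. A pure-Python 2×2 sketch (helper names ours):

```python
from fractions import Fraction

def char_poly(A):
    # [1, -tr, det] for a 2x2 matrix (monic convention)
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [1, -tr, det]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def inv2(P):
    # Exact 2x2 inverse via the adjugate formula
    a, b = P[0]
    c, d = P[1]
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4, 2], [1, 3]]
P = [[1, 1], [0, 1]]
B = matmul(matmul(inv2(P), A), P)    # B = P^{-1} A P
print(char_poly(A) == char_poly(B))  # True: chi is a similarity invariant
```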
Corollary 6.4: Invariance Under Similarity

Trace, determinant, and eigenvalues are similarity invariants—they are the same for all similar matrices.

5. Existence of Eigenvalues

Theorem 6.13: Fundamental Theorem of Algebra

Every polynomial of degree n \geq 1 with complex coefficients has exactly n roots in \mathbb{C} (counting multiplicity).

Corollary 6.5: Existence Over ℂ

Every n×n complex matrix has exactly n eigenvalues (counting algebraic multiplicity).

Theorem 6.14: Complex Eigenvalues of Real Matrices

For a real matrix, complex eigenvalues occur in conjugate pairs: if \lambda = a + bi is an eigenvalue, so is \bar{\lambda} = a - bi.

Proof:

If A is real, then \chi_A has real coefficients, so \chi_A(\lambda) = 0 implies \chi_A(\bar{\lambda}) = \overline{\chi_A(\lambda)} = 0.

Example 6.10: No Real Eigenvalues

The rotation matrix R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} (90° rotation):

\chi_R(\lambda) = \lambda^2 + 1 = (\lambda - i)(\lambda + i)

Eigenvalues: \pm i (a complex conjugate pair; no real eigenvalues).

Remark 6.8: Real vs Complex Analysis

When working over \mathbb{R}:

  • The characteristic polynomial may not factor completely
  • Some matrices have no real eigenvalues (e.g., rotations)
  • Complex eigenvalues come in conjugate pairs

When working over \mathbb{C}:

  • Every matrix has eigenvalues
  • The char poly factors completely into linear terms

6. Computing the Characteristic Polynomial

2×2 Formula

For A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}:

\chi_A(\lambda) = \lambda^2 - (a+d)\lambda + (ad - bc) = \lambda^2 - \text{tr}(A)\lambda + \det(A)
Example 6.11: Step-by-Step 3×3 Computation

Compute \chi_A for A = \begin{pmatrix} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 0 & 0 & 6 \end{pmatrix}.

Method 1 (Direct): Since A is triangular:

\chi_A(\lambda) = (\lambda - 1)(\lambda - 4)(\lambda - 6)

Method 2 (Expansion): Expand \det(\lambda I - A) and verify.

Verification: tr(A) = 11 = 1+4+6 ✓, det(A) = 24 = 1×4×6 ✓

Remark 6.9: Computational Complexity

Computing the characteristic polynomial via determinant expansion:

  • 2×2: Simple formula, instant
  • 3×3: Cofactor expansion, straightforward
  • 4×4 and larger: Increasingly tedious; use software or specialized algorithms (Faddeev-LeVerrier, etc.)

7. Common Mistakes

Sign convention confusion

\det(\lambda I - A) and \det(A - \lambda I) differ by a factor of (-1)^n. Both give the same eigenvalues, but keep your convention consistent.

Confusing multiplicities

Algebraic (root multiplicity) ≠ Geometric (eigenspace dimension) in general. They're equal only for diagonalizable matrices.

Same char poly ⇒ same matrix

FALSE! Different matrices can have identical characteristic polynomials. The char poly doesn't uniquely determine a matrix.

Forgetting multiplicity in Vieta

When computing trace/det from eigenvalues, count repeated eigenvalues multiple times according to their algebraic multiplicity.

8. Worked Examples

Example 6.12: Complete Analysis

For A = \begin{pmatrix} 2 & 1 & 1 \\ 0 & 3 & 1 \\ 0 & 0 & 3 \end{pmatrix}, find the eigenvalues and their multiplicities.

Step 1: Characteristic polynomial

Since A is upper triangular: \chi_A(\lambda) = (\lambda - 2)(\lambda - 3)^2

Step 2: Eigenvalues and algebraic multiplicities

  • \lambda_1 = 2, a(2) = 1
  • \lambda_2 = 3, a(3) = 2

Step 3: Geometric multiplicities

For \lambda = 2: g(2) = \dim\ker(A - 2I) = \dim\ker\begin{pmatrix} 0 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix} = 1

For \lambda = 3: g(3) = \dim\ker(A - 3I) = \dim\ker\begin{pmatrix} -1 & 1 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix} = 1

Conclusion: For \lambda = 3: g = 1 < a = 2, so A is NOT diagonalizable.

Example 6.13: Finding Char Poly from Properties

A 3×3 matrix has eigenvalues 1, 1, 2. Find the characteristic polynomial.

\chi_A(\lambda) = (\lambda - 1)^2(\lambda - 2) = \lambda^3 - 4\lambda^2 + 5\lambda - 2

Verify: tr(A) = 1 + 1 + 2 = 4 ✓ (negative of the \lambda^2 coefficient)

det(A) = 1 × 1 × 2 = 2 ✓ ((-1)^3 times the constant term -2)

Example 6.14: Char Poly of a Projection

Let P be a projection matrix (P^2 = P) of rank r on \mathbb{R}^n.

Eigenvalues satisfy \lambda^2 = \lambda, so \lambda \in \{0, 1\}.

Since rank(P) = r, eigenvalue 1 has algebraic multiplicity r.

Since nullity(P) = n - r, eigenvalue 0 has algebraic multiplicity n - r.

\chi_P(\lambda) = \lambda^{n-r}(\lambda - 1)^r
Example 6.15: Using Matrix Equations

If A^2 - 5A + 6I = 0, find the possible eigenvalues of A.

Solution: If \lambda is an eigenvalue of A, then:

\lambda^2 - 5\lambda + 6 = 0 \Rightarrow (\lambda - 2)(\lambda - 3) = 0

Possible eigenvalues: \lambda \in \{2, 3\}

Example 6.16: Companion Matrix

The companion matrix of the polynomial p(\lambda) = \lambda^3 - 6\lambda^2 + 11\lambda - 6:

C = \begin{pmatrix} 0 & 0 & 6 \\ 1 & 0 & -11 \\ 0 & 1 & 6 \end{pmatrix}

The characteristic polynomial of C is exactly p(\lambda)!

This is a key construction: any monic polynomial is the char poly of its companion matrix.

9. Preview: Minimal Polynomial

Definition 6.7: Minimal Polynomial

The minimal polynomial m_A(\lambda) is the monic polynomial of smallest degree such that m_A(A) = 0.

Theorem 6.15: Minimal vs Characteristic

The minimal polynomial:

  • Divides the characteristic polynomial
  • Has the same roots (eigenvalues) as the char poly
  • May have different multiplicities
  • Has degree ≤ n
Example 6.17: When They Differ

For A = I_n:

  • Characteristic polynomial: \chi_A(\lambda) = (\lambda - 1)^n
  • Minimal polynomial: m_A(\lambda) = \lambda - 1

The minimal polynomial has degree 1 while the char poly has degree n.

Remark 6.10: Significance

The minimal polynomial is important because:

  • It determines when a matrix is diagonalizable (iff minimal poly has no repeated roots)
  • It gives the simplest polynomial relation satisfied by the matrix
  • It's key to the Jordan normal form theory

10. Applications

Stability Analysis

The characteristic polynomial roots determine system stability: stable if all roots have negative real part (continuous) or magnitude < 1 (discrete).

Recurrence Relations

Linear recurrences like Fibonacci have solutions determined by roots of a characteristic polynomial.

Differential Equations

Solutions to \mathbf{x}' = A\mathbf{x} involve e^{\lambda t} terms for each eigenvalue \lambda.

Graph Theory

The characteristic polynomial of the adjacency matrix encodes graph properties such as closed-walk counts (via traces of powers); the characteristic polynomial of the Laplacian matrix yields the number of spanning trees (Kirchhoff's theorem).

Example 6.18: Fibonacci Recurrence

The Fibonacci sequence F_n = F_{n-1} + F_{n-2} has the matrix form:

\begin{pmatrix} F_{n+1} \\ F_n \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}^n \begin{pmatrix} 1 \\ 0 \end{pmatrix}

The characteristic polynomial \lambda^2 - \lambda - 1 has roots \phi = \frac{1+\sqrt{5}}{2} (the golden ratio!) and \hat{\phi} = \frac{1-\sqrt{5}}{2}.

This gives Binet's formula: F_n = \frac{\phi^n - \hat{\phi}^n}{\sqrt{5}}
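Binet's formula is a direct consequence of the two eigenvalues above; a quick numerical check (float arithmetic, so we round, which is safe for small n):

```python
from math import sqrt

# Eigenvalues of the Fibonacci matrix: roots of lambda^2 - lambda - 1
phi = (1 + sqrt(5)) / 2    # golden ratio
psi = (1 - sqrt(5)) / 2    # conjugate root ("phi-hat" in the text)

def fib_binet(n):
    # Binet's formula from the eigenvalue decomposition
    return round((phi**n - psi**n) / sqrt(5))

print([fib_binet(n) for n in range(1, 9)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```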

Key Takeaways

Definition

\chi_A(\lambda) = \det(\lambda I - A)

Multiplicity Inequality

1 \leq g(\lambda) \leq a(\lambda)

Vieta's Formulas

\sum\lambda_i = \text{tr}(A), \quad \prod\lambda_i = \det(A)

Similarity Invariance

Similar matrices have identical characteristic polynomials.

Quick Reference

2×2 Formula

\chi_A(\lambda) = \lambda^2 - \text{tr}(A)\lambda + \det(A)

\lambda = \frac{\text{tr}(A) \pm \sqrt{\text{tr}(A)^2 - 4\det(A)}}{2}

Key Coefficients

  • Leading: 1 (monic)
  • \lambda^{n-1} coefficient: -\text{tr}(A)
  • Constant: (-1)^n \det(A)

Additional Practice

Problem 1

Find the characteristic polynomial of A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}.

Answer: \lambda^2 - 2\lambda - 3 = (\lambda - 3)(\lambda + 1)

Problem 2

A 4×4 matrix has eigenvalues 1, 1, 2, 2. Find its trace and determinant.

Answer: tr = 6, det = 4

Problem 3

If \chi_A(\lambda) = \lambda^4 - 10\lambda^3 + c\lambda^2 + d\lambda + 24 with eigenvalues 1, 2, 3, 4, find c.

Hint: c = sum of products of pairs = 1·2 + 1·3 + 1·4 + 2·3 + 2·4 + 3·4 = 35

Problem 4

For what values of a does A = \begin{pmatrix} a & 1 \\ 1 & a \end{pmatrix} have repeated eigenvalues?

Answer: Never (the discriminant is \text{tr}(A)^2 - 4\det(A) = 4a^2 - 4(a^2 - 1) = 4 > 0 for all a)

Problem 5 (Challenge)

Prove that A and A^T have the same characteristic polynomial.

11. Advanced Topics

Theorem 6.16: Cayley-Hamilton (Preview)

Every matrix satisfies its own characteristic equation:

\chi_A(A) = A^n + c_1 A^{n-1} + \cdots + c_{n-1} A + c_n I = 0

This powerful result will be explored in detail in a later section.

Corollary 6.6: Matrix Polynomials

Any power A^k for k \geq n can be expressed as a linear combination of I, A, A^2, \ldots, A^{n-1}.

Example 6.19: Using Cayley-Hamilton

For A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, express A^{-1} as a linear combination of I and A.

\chi_A(\lambda) = \lambda^2 - 5\lambda - 2

By Cayley-Hamilton: A^2 - 5A - 2I = 0

Rearranging: A^2 - 5A = 2I

A(A - 5I) = 2I \Rightarrow A^{-1} = \frac{1}{2}(A - 5I)
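The inverse formula from Cayley-Hamilton can be verified with exact rational arithmetic; a sketch:

```python
from fractions import Fraction

A = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]

# chi_A(lambda) = lambda^2 - 5 lambda - 2, so A^{-1} = (A - 5I) / 2
A_inv = [[(A[i][j] - (5 if i == j else 0)) / 2 for j in range(2)] for i in range(2)]

# Verify that A * A_inv is the identity
product = [[sum(A[i][k] * A_inv[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
print(product == [[1, 0], [0, 1]])  # True
```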

Theorem 6.17: Characteristic Polynomial of Block Triangular

For block upper triangular A = \begin{pmatrix} B & C \\ 0 & D \end{pmatrix}:

\chi_A(\lambda) = \chi_B(\lambda) \cdot \chi_D(\lambda)

Proof:
\det(\lambda I - A) = \det\begin{pmatrix} \lambda I - B & -C \\ 0 & \lambda I - D \end{pmatrix} = \det(\lambda I - B) \cdot \det(\lambda I - D)
Remark 6.11: Computational Note

This theorem is extremely useful for computing characteristic polynomials of large matrices that have block structure.

12. Real-World Examples

Example 6.20: Population Dynamics (Leslie Matrix)

A population model with three age groups:

L = \begin{pmatrix} 0 & 2 & 1 \\ 0.5 & 0 & 0 \\ 0 & 0.3 & 0 \end{pmatrix}

The dominant eigenvalue determines the long-term population growth rate.

\chi_L(\lambda) = \lambda^3 - 0.5 \cdot 2 \cdot \lambda - 0.5 \cdot 0.3 \cdot 1 = \lambda^3 - \lambda - 0.15

The largest real root (≈ 1.07) indicates roughly 7% population growth per generation.
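The dominant root can be located by bisection on the monic form \chi_L(\lambda) = \lambda^3 - \lambda - 0.15 (which has the same roots as the expansion above, up to an overall sign); a sketch:

```python
def chi(lam):
    # Monic characteristic polynomial of the Leslie matrix
    return lam**3 - lam - 0.15

lo, hi = 1.0, 1.5            # chi(1.0) < 0 < chi(1.5): dominant root bracketed
for _ in range(60):          # bisection halves the bracket each step
    mid = (lo + hi) / 2
    if chi(mid) > 0:
        hi = mid
    else:
        lo = mid

print(round(lo, 3))          # 1.068: roughly 7% growth per generation
```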

Example 6.21: Markov Chain

A transition matrix for weather states (sunny, cloudy, rainy):

P = \begin{pmatrix} 0.7 & 0.2 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}

For stochastic matrices:

  • \lambda = 1 is always an eigenvalue
  • All eigenvalues satisfy |\lambda| \leq 1
  • The left eigenvector for \lambda = 1 gives the stationary distribution
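The stationary distribution can be computed by power iteration on the weather chain above; a sketch:

```python
# Power iteration with left-multiplication: pi <- pi P converges to the
# stationary distribution (the left eigenvector for lambda = 1).
P = [[0.7, 0.2, 0.1],
     [0.3, 0.4, 0.3],
     [0.2, 0.3, 0.5]]

pi = [1 / 3, 1 / 3, 1 / 3]            # any starting distribution works
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])      # [0.4565, 0.2826, 0.2609] = (21, 13, 12)/46
```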
Example 6.22: Coupled Oscillators

Two masses connected by springs have equations of motion determined by:

M = \begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix}

\chi_M(\lambda) = (\lambda - 2)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3)

Natural frequencies: \omega_1 = 1, \omega_2 = \sqrt{3}

13. Computational Methods

Faddeev-LeVerrier Algorithm

Computes characteristic polynomial coefficients iteratively:

  1. Set B_1 = A, c_1 = -\text{tr}(B_1)
  2. For k = 2, \ldots, n: B_k = A(B_{k-1} + c_{k-1}I), c_k = -\frac{1}{k}\text{tr}(B_k)
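The two steps above translate directly into pure Python with exact rational arithmetic (the function name is ours):

```python
from fractions import Fraction

def faddeev_leverrier(A):
    # Returns [1, c1, ..., cn] with chi_A(l) = l^n + c1 l^{n-1} + ... + cn
    n = len(A)
    A = [[Fraction(x) for x in row] for row in A]
    trace = lambda X: sum(X[i][i] for i in range(n))
    B = [row[:] for row in A]                     # B1 = A
    coeffs = [Fraction(1), -trace(B)]             # c1 = -tr(B1)
    for k in range(2, n + 1):
        shifted = [[B[i][j] + (coeffs[-1] if i == j else 0) for j in range(n)]
                   for i in range(n)]
        B = [[sum(A[i][m] * shifted[m][j] for m in range(n)) for j in range(n)]
             for i in range(n)]                   # Bk = A (B_{k-1} + c_{k-1} I)
        coeffs.append(-trace(B) / k)              # ck = -(1/k) tr(Bk)
    return [int(c) if c.denominator == 1 else c for c in coeffs]

A = [[1, 2, 3], [0, 4, 5], [0, 0, 6]]   # triangular: chi = (l-1)(l-4)(l-6)
print(faddeev_leverrier(A))             # [1, -11, 34, -24]
```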
Remark 6.12: Numerical Stability

In practice, directly computing the characteristic polynomial and finding its roots is numerically unstable for large matrices. Instead:

  • QR iteration: Preferred for dense matrices
  • Lanczos method: For large sparse symmetric matrices
  • Arnoldi iteration: For large sparse non-symmetric matrices
Example 6.23: Rational Root Test

For \chi_A(\lambda) = \lambda^3 - 6\lambda^2 + 11\lambda - 6:

Possible rational roots: \pm 1, \pm 2, \pm 3, \pm 6

Testing: \chi_A(1) = 1 - 6 + 11 - 6 = 0

Factor: (\lambda - 1)(\lambda^2 - 5\lambda + 6) = (\lambda - 1)(\lambda - 2)(\lambda - 3)
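The rational root test mechanizes nicely for monic integer polynomials with a non-zero constant term; a sketch (`rational_roots` is our name):

```python
def rational_roots(coeffs):
    # Candidate rational roots of a monic integer polynomial are the
    # (positive and negative) divisors of the constant term.
    const = coeffs[-1]
    divisors = {d for d in range(1, abs(const) + 1) if const % d == 0}
    candidates = divisors | {-d for d in divisors}
    def p(x):
        result = 0
        for c in coeffs:          # Horner evaluation
            result = result * x + c
        return result
    return sorted(r for r in candidates if p(r) == 0)

print(rational_roots([1, -6, 11, -6]))  # [1, 2, 3]
```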

14. Deeper Theory

Theorem 6.18: Eigenspaces are Direct Sum

If \lambda_1, \ldots, \lambda_k are distinct eigenvalues:

E_{\lambda_1} + E_{\lambda_2} + \cdots + E_{\lambda_k} = E_{\lambda_1} \oplus E_{\lambda_2} \oplus \cdots \oplus E_{\lambda_k}

The sum is always direct (the eigenspaces intersect only at 0).

Proof:

Eigenvectors corresponding to distinct eigenvalues are linearly independent (proved in the previous section). Suppose v \in E_{\lambda_i} \cap \sum_{j \neq i} E_{\lambda_j} and write v = \sum_{j \neq i} v_j with v_j \in E_{\lambda_j}. Then v - \sum_{j \neq i} v_j = 0 is a linear relation among eigenvectors for distinct eigenvalues, which forces every term to vanish. Hence v = 0.

Corollary 6.7: Dimension Bound

For distinct eigenvalues \lambda_1, \ldots, \lambda_k:

\sum_{i=1}^{k} g(\lambda_i) = \sum_{i=1}^{k} \dim E_{\lambda_i} \leq n
Theorem 6.19: Trace Formula

For any matrix A:

\text{tr}(A^k) = \sum_{i=1}^{n} \lambda_i^k

where the \lambda_i are the eigenvalues counted with multiplicity.

Example 6.24: Using Trace Formula

If \chi_A(\lambda) = (\lambda - 1)^2(\lambda - 2), find \text{tr}(A^2).

Solution: \text{tr}(A^2) = 1^2 + 1^2 + 2^2 = 1 + 1 + 4 = 6
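The trace formula is easy to confirm on a concrete matrix with \chi_A = (\lambda - 1)^2(\lambda - 2); a sketch using an upper triangular (non-diagonal) example:

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A = [[1, 1, 0],
     [0, 1, 0],
     [0, 0, 2]]   # non-diagonal matrix with chi_A = (l - 1)^2 (l - 2)

A2 = matmul(A, A)
trace_A2 = sum(A2[i][i] for i in range(3))
eig_sum = 1**2 + 1**2 + 2**2        # sum of lambda_i^2
print(trace_A2, eig_sum)            # 6 6
```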

Study Tips

  • Master 2×2: The formula \chi_A = \lambda^2 - \text{tr}(A)\lambda + \det(A) should be instant.
  • Use verification: Always check trace = sum, det = product of eigenvalues.
  • Exploit structure: Triangular and block diagonal matrices have simple characteristic polynomials.
  • Think about multiplicities: Understand both algebraic and geometric.
  • Know the bounds: 1 \leq g \leq a is fundamental.
  • Practice factoring: The rational root test and special patterns help.

Characteristic Polynomial Patterns

Matrix Type | Characteristic Polynomial | Notes
Identity I_n | (\lambda - 1)^n | Single eigenvalue 1
Scalar cI_n | (\lambda - c)^n | Single eigenvalue c
Diagonal | \prod_i (\lambda - d_i) | Eigenvalues = diagonal entries
Triangular | \prod_i (\lambda - a_{ii}) | Same as the diagonal case
Nilpotent | \lambda^n | Only eigenvalue is 0
Projection | \lambda^{n-r}(\lambda - 1)^r | r = rank(P)
2D Rotation by \theta | \lambda^2 - 2\cos\theta \cdot \lambda + 1 | Eigenvalues e^{\pm i\theta}
Companion matrix | Given polynomial | Char poly = input polynomial

Chapter Summary

The characteristic polynomial \chi_A(\lambda) = \det(\lambda I - A) encodes all eigenvalue information. Its roots are the eigenvalues, with algebraic multiplicity given by root multiplicity. Vieta's formulas connect coefficients to trace and determinant. The fundamental theorem of algebra guarantees existence over \mathbb{C}.


Historical Notes

Cauchy (1826): First studied eigenvalue equations systematically for quadratic forms.

Sylvester (1852): Introduced the term "matrix" and studied determinantal equations.

Cayley (1858): Stated the Cayley-Hamilton theorem: every matrix satisfies its own characteristic equation.

Frobenius (1878): Developed the theory of characteristic and minimal polynomials systematically.

Weierstrass (1868): Introduced elementary divisors, refining the study of characteristic polynomials.

Terminology: "Characteristic" comes from the idea that this polynomial characterizes essential properties of the matrix.

15. Conceptual Questions

Q1: Can two non-similar matrices have the same characteristic polynomial?

Yes! Example: \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} and \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} both have \chi = \lambda^2 but are not similar.

Q2: If all eigenvalues are 0, must the matrix be 0?

No! Nilpotent matrices have all eigenvalues 0 but may be non-zero. Example: \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.

Q3: Can eigenvalues be complex for a real matrix?

Yes! But they come in conjugate pairs. Example: \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} has eigenvalues \pm i.

Q4: Is the characteristic polynomial unique up to scalar multiple?

No, it is completely unique: we define it as a monic polynomial (leading coefficient 1), which removes any scalar ambiguity.

Q5: Does A invertible imply all eigenvalues are non-zero?

Yes! det(A) = product of eigenvalues, so A invertible ⟺ det(A) ≠ 0 ⟺ no eigenvalue is 0.

16. Challenge Problems

Challenge 1

Prove that if A is idempotent (A^2 = A), then the only possible eigenvalues are 0 and 1.

Challenge 2

If \chi_A(\lambda) = \lambda^n + c_1\lambda^{n-1} + \cdots + c_n, prove that c_2 = \sum_{i < j} \lambda_i \lambda_j equals the sum of all 2×2 principal minors.

Challenge 3

Let A be a 3×3 matrix with \chi_A(\lambda) = \lambda^3 - 3\lambda + 2. Find \text{tr}(A^3).

Hint: The eigenvalues are 1, 1, -2. Use \text{tr}(A^3) = \sum \lambda_i^3.

Challenge 4

Show that for any monic polynomial p(\lambda) of degree n, there exists an n×n matrix whose characteristic polynomial is p(\lambda).

Hint: Use the companion matrix construction.

Challenge 5

Prove that \chi_{AB}(\lambda) = \chi_{BA}(\lambda) for any n×n matrices A and B.

17. Exam Preparation

What You Should Know

  • Compute \chi_A for 2×2 and 3×3 matrices
  • Factor simple characteristic polynomials
  • Find algebraic and geometric multiplicities
  • Apply Vieta's formulas
  • Determine diagonalizability from multiplicities
  • Handle triangular and block matrices

Common Exam Questions

  • Find the characteristic polynomial
  • Verify eigenvalues using trace/det
  • Determine if a matrix is diagonalizable
  • Find a missing eigenvalue given the others
  • True/False on multiplicity properties
  • Prove similarity invariance results

Verification Checklist

After computing eigenvalues, always verify:

Sum of eigenvalues = tr(A)
Product of eigenvalues = det(A)
Sum of algebraic multiplicities = n
1 \leq g(\lambda) \leq a(\lambda) for each eigenvalue
Polynomial has correct degree n
Leading coefficient is 1 (monic)

What's Next?

With the characteristic polynomial understood, you're ready for:

  • Diagonalization: When can we write A = PDP^{-1}? The condition: g = a for all eigenvalues.
  • Jordan Normal Form: What to do when diagonalization fails, using generalized eigenvectors.
  • Cayley-Hamilton Theorem: The remarkable fact that \chi_A(A) = 0 always holds.
  • Minimal Polynomial: The smallest polynomial annihilating the matrix.

18. Quick Examples Gallery

Example: Symmetric 2×2

A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}

\chi_A = \lambda^2 - 6\lambda + 8 = (\lambda - 2)(\lambda - 4)

Eigenvalues: 2, 4 (real, distinct)

Example: Skew-Symmetric 2×2

A = \begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix}

\chi_A = \lambda^2 + 4

Eigenvalues: \pm 2i (purely imaginary)

Example: 3×3 with Zero Eigenvalue

A = \begin{pmatrix} 1 & 2 & 3 \\ 1 & 2 & 3 \\ 1 & 2 & 3 \end{pmatrix}

rank(A) = 1, so eigenvalue 0 has a(0) = 2

\chi_A = \lambda^2(\lambda - 6)

Example: Permutation Matrix

P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix} (cyclic)

\chi_P = \lambda^3 - 1

Eigenvalues: 1, \omega, \omega^2 (the cube roots of 1)

Memorization Aids

2×2 Formula

"Trace minus lambda, then add det"

\lambda^2 - \text{tr}\cdot\lambda + \det

Multiplicity Inequality

"Geometric ≤ Algebraic, like g ≤ a alphabetically"

Vieta's Formulas

"Sum = Trace, Product = Determinant"

Triangular Matrices

"Eigenvalues on the diagonal"

Learning Path

Previous

6.1 Eigenvalue Definition

Current

6.2 Characteristic Polynomial

Next

6.3 Diagonalization

Module Summary

In this module, you learned how the characteristic polynomial \chi_A(\lambda) = \det(\lambda I - A) encodes all eigenvalue information. You can now:

  • Compute characteristic polynomials for 2×2 and 3×3 matrices
  • Distinguish between algebraic and geometric multiplicities
  • Apply Vieta's formulas to verify eigenvalue computations
  • Recognize special cases (triangular, block diagonal, similar matrices)
  • Understand when matrices are diagonalizable based on multiplicities

Related Topics

Eigenvalue Definition
Diagonalization
Jordan Form
Minimal Polynomial
Cayley-Hamilton
Spectral Theorem
Matrix Functions
Companion Matrix
Characteristic Polynomial Practice

  1. The characteristic polynomial of A is:
  2. The degree of the characteristic polynomial for an n \times n matrix is:
  3. The sum of the eigenvalues (with multiplicity) equals:
  4. The product of the eigenvalues (with multiplicity) equals:
  5. The algebraic multiplicity of \lambda_0 is:
  6. The geometric multiplicity satisfies:
  7. Similar matrices A and P^{-1}AP have:
  8. For a triangular matrix, the eigenvalues are:
  9. If \chi_A(\lambda) = (\lambda - 2)^3(\lambda + 1), the trace of A is:
  10. If \chi_A(\lambda) = (\lambda - 2)^3(\lambda + 1), the determinant of A is:
  11. An n \times n matrix over \mathbb{C} has:
  12. If geometric = algebraic multiplicity for all eigenvalues, then:

Frequently Asked Questions

Why is it called 'characteristic' polynomial?

It 'characterizes' the matrix's essential spectral properties - its eigenvalues. Similar matrices have the same characteristic polynomial, making it an invariant under similarity transformations.

What if the characteristic polynomial doesn't factor over ℝ?

Then there are no real eigenvalues. Over ℂ, by the Fundamental Theorem of Algebra, it always factors completely into linear terms, giving exactly n eigenvalues (counting multiplicity).

What's the difference between algebraic and geometric multiplicity?

Algebraic = exponent in char poly (how many times λ₀ is a root). Geometric = dim(ker(A - λ₀I)) = number of linearly independent eigenvectors. Always: 1 ≤ geometric ≤ algebraic.

How do I find eigenvalues of a 3×3 matrix?

Expand det(A - λI) to get a cubic polynomial. Try rational roots (±divisors of constant/leading coeff), look for special structure, or factor by grouping. Numerical methods work for harder cases.

Why is det(λI - A) sometimes used instead of det(A - λI)?

They differ by (-1)ⁿ. Using λI - A makes the polynomial monic (leading coefficient 1). Both give the same eigenvalues.

How does the characteristic polynomial relate to the minimal polynomial?

The minimal polynomial divides the characteristic polynomial. They have the same roots (eigenvalues) but possibly different multiplicities. The minimal polynomial has the smallest degree annihilating A.

What happens to the characteristic polynomial under similarity?

It's invariant! If B = P⁻¹AP, then det(λI - B) = det(P⁻¹(λI - A)P) = det(λI - A). This is why eigenvalues are similarity invariants.

Can two different matrices have the same characteristic polynomial?

Yes! Non-similar matrices can have the same char poly if their Jordan structures differ. For example, I₂ and a non-identity 2×2 with eigenvalue 1 (multiplicity 2) both have char poly (λ-1)².