Certain matrices possess special structure that makes them particularly important in theory and applications. We study diagonal, triangular, symmetric, orthogonal, idempotent, and nilpotent matrices—each with distinctive properties that simplify computations and illuminate the underlying linear transformations.
A square matrix D is diagonal if d_{ij} = 0 whenever i ≠ j.
Let D = diag(d_1, …, d_n) and E = diag(e_1, …, e_n) be diagonal matrices. Then:
All properties follow from direct computation. For (2): (DE)_{ij} = Σ_k d_{ik} e_{kj}. Since both matrices are diagonal, only the term with k = i = j contributes, giving (DE)_{ii} = d_i e_i and (DE)_{ij} = 0 for i ≠ j.
For (6): D e_i = d_i e_i, where e_i is the i-th standard basis vector.
Let and .
A scalar matrix cI is diagonal with all diagonal entries equal to the same scalar c. Scalar matrices commute with all matrices: (cI)A = cA = A(cI).
In fact, scalar matrices are the only matrices that commute with all matrices.
For , compute :
In general, f(D) = diag(f(d_1), …, f(d_n)) for any function f defined on the diagonal entries.
For , compute :
This is trivial for diagonal matrices but would be hard to compute directly for non-diagonal matrices.
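As a quick numerical check, here is a minimal NumPy sketch; the diagonal entries below are an arbitrary illustration, not an example from the text. Powers, inverses, and functions of a diagonal matrix all act entrywise on the diagonal:

```python
import numpy as np

# Hypothetical diagonal entries, chosen only for illustration
d = np.array([2.0, 3.0, 5.0])
D = np.diag(d)

# Powers act entrywise on the diagonal: D^k = diag(d_i^k)
k = 4
assert np.allclose(np.linalg.matrix_power(D, k), np.diag(d ** k))

# The inverse is diag(1/d_i)
assert np.allclose(np.linalg.inv(D), np.diag(1.0 / d))

# More generally f(D) = diag(f(d_i)); e.g. the matrix square root
sqrtD = np.diag(np.sqrt(d))
assert np.allclose(sqrtD @ sqrtD, D)
```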
Any two diagonal matrices commute: if D and E are diagonal, then DE = ED = diag(d_1 e_1, …, d_n e_n).
A block diagonal matrix has square blocks A_1, …, A_k along the diagonal and zero blocks elsewhere; we write A = diag(A_1, …, A_k).
Properties: det(A) = det(A_1)⋯det(A_k); the inverse exists iff each block A_i is invertible, in which case A^{-1} = diag(A_1^{-1}, …, A_k^{-1}).
For where is and is :
A matrix A is upper triangular if a_{ij} = 0 whenever i > j, and lower triangular if a_{ij} = 0 whenever i < j.
Let A, B, and C be upper triangular. Then:
For (2): Let C = AB. For i > j: c_{ij} = Σ_k a_{ik} b_{kj}.
When k < i, a_{ik} = 0. When k ≥ i > j, b_{kj} = 0. So c_{ij} = 0.
The product AB is upper triangular with diagonal entries (AB)_{ii} = a_{ii} b_{ii}.
A triangular matrix is invertible if and only if all diagonal entries are nonzero.
Many matrices can be factored as A = LU, where L is lower triangular and U is upper triangular. This factorization is fundamental for solving linear systems efficiently.
For :
Eigenvalues are 3, 2, -1 (the diagonal entries).
Here L has 1s on the diagonal (unit lower triangular) and U is upper triangular.
If U is upper triangular and invertible, then the solution of Ux = b can be computed in O(n^2) operations using back-substitution, rather than O(n^3) for a general matrix.
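A minimal back-substitution sketch, assuming U is upper triangular with nonzero diagonal (the system below is an arbitrary illustration, not one from the text):

```python
import numpy as np

def back_substitution(U, b):
    """Solve Ux = b for upper triangular U in O(n^2) operations."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Subtract the already-known components, then divide by the pivot
        x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

# Illustrative upper triangular system
U = np.array([[2.0, 1.0, -1.0],
              [0.0, 3.0,  2.0],
              [0.0, 0.0,  4.0]])
b = np.array([1.0, 5.0, 8.0])
x = back_substitution(U, b)
assert np.allclose(U @ x, b)
```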
A block upper triangular matrix has square diagonal blocks A_{11}, …, A_{kk} with zero blocks below them.
Its determinant is the product of the determinants of the diagonal blocks: det(A) = det(A_{11})⋯det(A_{kk}).
A matrix A is symmetric if A = A^T, i.e., a_{ij} = a_{ji} for all i, j.
A matrix A is skew-symmetric if A^T = -A, i.e., a_{ij} = -a_{ji}.
Let A and B be symmetric. Then:
For (4): (B^T A B)^T = B^T A^T B = B^T A B, so B^T A B is symmetric whenever A is.
For (5): (AB)^T = B^T A^T = BA. This equals AB iff they commute.
Let A be skew-symmetric. Then:
For (1): a_{ii} = -a_{ii} implies a_{ii} = 0, so the diagonal entries are all zero.
For (3): det(A) = det(A^T) = det(-A) = (-1)^n det(A). If n is odd, det(A) = 0.
Every square matrix can be uniquely written as A = S + K, where S = (A + A^T)/2 is symmetric and K = (A - A^T)/2 is skew-symmetric.
For :
This is symmetric: each entry equals its reflection across the diagonal. All eigenvalues will be real.
Note the zero diagonal and the antisymmetry a_{ij} = -a_{ji}. Since n is odd, det(A) = 0.
Every quadratic form q(x) = x^T A x can be written using a symmetric matrix: x^T A x = x^T S x, where S = (A + A^T)/2 is the symmetric part of A.
The skew-symmetric part contributes nothing, since x^T K x = 0 for every skew-symmetric K.
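The following NumPy sketch (with an arbitrary random matrix, purely for illustration) checks the decomposition and the fact that the skew-symmetric part drops out of the quadratic form:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # arbitrary test matrix

S = (A + A.T) / 2                 # symmetric part
K = (A - A.T) / 2                 # skew-symmetric part
assert np.allclose(S + K, A)
assert np.allclose(S.T, S) and np.allclose(K.T, -K)

x = rng.standard_normal(4)
# x^T K x = 0, so the quadratic form only sees the symmetric part
assert np.isclose(x @ K @ x, 0.0)
assert np.isclose(x @ A @ x, x @ S @ x)
```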
The Spectral Theorem states: every real symmetric matrix is orthogonally diagonalizable:
A = QΛQ^T, where Q is orthogonal and Λ is diagonal (with the eigenvalues of A on its diagonal). This is one of the most important theorems in linear algebra.
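A quick numerical illustration of the theorem using `numpy.linalg.eigh`, applied to an arbitrary symmetric test matrix (not one from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # make a symmetric test matrix

evals, Q = np.linalg.eigh(A)           # eigh is specialized for symmetric input
Lam = np.diag(evals)

assert np.allclose(Q.T @ Q, np.eye(4))  # Q is orthogonal
assert np.allclose(Q @ Lam @ Q.T, A)    # A = Q Λ Q^T
assert np.all(np.isreal(evals))         # eigenvalues are real
```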
For :
Eigenvalues: . Eigenvectors: .
A real matrix Q is orthogonal if Q^T Q = I, equivalently Q^{-1} = Q^T.
The following are equivalent for a real n×n matrix Q:
For (1): (Q^T Q)_{ij} = q_i^T q_j, so Q^T Q = I iff the columns q_1, …, q_n are orthonormal.
For (4): If Q^T Q = I, then ⟨Qx, Qy⟩ = (Qx)^T (Qy) = x^T Q^T Q y = x^T y = ⟨x, y⟩.
Rotation by angle θ counterclockwise: R(θ) = [[cos θ, -sin θ], [sin θ, cos θ]].
det R(θ) = 1 (proper orthogonal = rotation).
Reflection across the line :
This matrix is orthogonal with det = -1 (improper orthogonal = reflection).
Rotation by around the -axis:
This is orthogonal with det = 1. The axis of rotation is the coordinate axis left fixed (the eigenvector with eigenvalue 1).
Reflection through the hyperplane perpendicular to a unit vector v: H = I - 2vv^T.
Verify: H^2 = (I - 2vv^T)^2 = I - 4vv^T + 4v(v^T v)v^T = I (using v^T v = 1).
Also H^T = H and det(H) = -1.
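A short sketch verifying these properties of the Householder reflection H = I - 2vv^T for an arbitrary unit vector v (the values are illustrative only):

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)               # unit vector

H = np.eye(3) - 2.0 * np.outer(v, v)    # reflection through the plane perpendicular to v

assert np.allclose(H @ H, np.eye(3))        # H^2 = I
assert np.allclose(H.T, H)                  # symmetric
assert np.allclose(H.T @ H, np.eye(3))      # orthogonal
assert np.isclose(np.linalg.det(H), -1.0)   # improper: a reflection
assert np.allclose(H @ v, -v)               # v itself is flipped
```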
Every orthogonal matrix is either proper (det = 1, a rotation) or improper (det = -1, involving a reflection).
The set of n×n orthogonal matrices forms a group under multiplication, the orthogonal group O(n): products and inverses of orthogonal matrices are again orthogonal.
The subgroup SO(n), with det = 1, consists of rotations only.
For characterization (4) ⇒ (5): set y = x to get ‖Qx‖^2 = ⟨Qx, Qx⟩ = ⟨x, x⟩ = ‖x‖^2.
For (5) ⇒ (4): use the polarization identity ⟨x, y⟩ = (‖x + y‖^2 - ‖x - y‖^2)/4, which recovers inner products from norms.
A matrix P is idempotent if P^2 = P.
For (2): If Px = λx with x ≠ 0, then λx = Px = P^2 x = λ^2 x, so λ^2 = λ and λ ∈ {0, 1}.
For (3): (I - P)^2 = I - 2P + P^2 = I - 2P + P = I - P, so I - P is also idempotent.
Project onto :
Verify: P^2 = P, the eigenvalues are 0 and 1, and trace = 1 = rank.
An idempotent P is an orthogonal projection iff P^T = P.
If P is idempotent, then:
Project onto the -plane in :
Then I - P projects onto the remaining coordinate axis. Both are orthogonal projections.
Verify P^2 = P. This projects onto its column space along its null space. Note P^T ≠ P, so this is not an orthogonal projection.
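To make the distinction concrete, here is a small NumPy sketch (the matrices are illustrative choices, not the examples above): an orthogonal projection built as P = V(V^T V)^{-1} V^T, and an idempotent matrix that fails the symmetry test and is therefore only an oblique projection.

```python
import numpy as np

# Orthogonal projection onto the column space of V: P = V (V^T V)^{-1} V^T
V = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P = V @ np.linalg.inv(V.T @ V) @ V.T
assert np.allclose(P @ P, P)        # idempotent
assert np.allclose(P.T, P)          # symmetric, so an orthogonal projection

# An oblique projection: idempotent but not symmetric
Q = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(Q @ Q, Q)
assert not np.allclose(Q.T, Q)      # not an orthogonal projection
```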
Idempotent matrices appear throughout mathematics:
A matrix N is nilpotent if N^k = 0 for some positive integer k. The smallest such k is the index of nilpotency.
Index of nilpotency is 3.
For property (1): If Nx = λx with x ≠ 0, then applying N repeatedly gives 0 = N^k x = λ^k x, so λ = 0.
For property (3), verify by expanding: (I - N)(I + N + N^2 + ⋯ + N^{k-1}) = I - N^k = I, so I - N is invertible with inverse given by this finite geometric series.
Up to similarity, every nilpotent matrix has strictly upper triangular form, with zeros on and below the diagonal.
Example: has .
For N with N^2 = 0, the series stops after two terms: (I - N)^{-1} = I + N.
Verify: (I - N)(I + N) = I - N^2 = I.
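A small NumPy check of these facts, using an arbitrary strictly upper triangular matrix of index 3 (illustrative values, not the examples above):

```python
import numpy as np

# Illustrative strictly upper triangular matrix: nilpotent with index 3
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
assert not np.allclose(np.linalg.matrix_power(N, 2), 0)
assert np.allclose(np.linalg.matrix_power(N, 3), 0)      # index 3

I = np.eye(3)
# Finite geometric series: (I - N)^{-1} = I + N + N^2
inv_series = I + N + N @ N
assert np.allclose((I - N) @ inv_series, I)
assert np.allclose(np.linalg.inv(I - N), inv_series)
```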
Nilpotent matrices are key building blocks in Jordan canonical form. Every nilpotent matrix is similar to a direct sum of nilpotent Jordan blocks:
If A and B are nilpotent and AB = BA (they commute), then A + B is nilpotent.
If A^p = 0 and B^q = 0, use the binomial theorem (valid since they commute): (A + B)^{p+q-1} = Σ_k C(p+q-1, k) A^k B^{p+q-1-k}.
Each term has either k ≥ p (so A^k = 0) or p + q - 1 - k ≥ q (so B^{p+q-1-k} = 0).
A matrix A is an involution if A^2 = I.
A complex matrix A is normal if AA* = A*A. Examples include real symmetric, skew-symmetric, and orthogonal matrices, as well as Hermitian and unitary matrices. Normal matrices are unitarily diagonalizable.
For involution eigenvalues: If Ax = λx with x ≠ 0, then x = A^2 x = λ^2 x, so λ^2 = 1 and λ = ±1.
A complex matrix U is unitary if U*U = I, where U* is the conjugate transpose.
Unitary matrices are the complex analogue of orthogonal matrices.
A complex matrix A is Hermitian if A = A*.
Hermitian matrices are the complex analogue of symmetric matrices. All eigenvalues are real.
Note A = A*. The eigenvalues are both real.
A symmetric (or Hermitian) matrix A is positive definite if x^T A x > 0 for all x ≠ 0. Equivalently:
Check: the leading principal minors satisfy a_{11} > 0 and det(A) > 0. Or: both eigenvalues are positive.
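These checks can be carried out numerically as in the sketch below (the 2×2 matrix is an arbitrary positive definite example, not the one from the text): Sylvester's criterion on leading principal minors, positivity of the eigenvalues, and a successful Cholesky factorization.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])            # illustrative symmetric matrix

# Sylvester's criterion: leading principal minors are positive
assert A[0, 0] > 0 and np.linalg.det(A) > 0

# All eigenvalues are positive
assert np.all(np.linalg.eigvalsh(A) > 0)

# Cholesky succeeds exactly for positive definite matrices
L = np.linalg.cholesky(A)             # raises LinAlgError otherwise
assert np.allclose(L @ L.T, A)
```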
If A and B are symmetric, AB is symmetric only if they commute.
Orthogonal: A^T A = I. Symmetric: A^T = A. Different conditions!
Nilpotent means N^k = 0 for some k ≥ 1, not N = 0.
forces .
The study of symmetric matrices goes back to Lagrange and Laplace in the 18th century. The spectral theorem for symmetric matrices was developed by Cauchy (1829) and later refined by Sylvester and others.
Orthogonal transformations were studied by Euler in relation to rigid body motion. The term "orthogonal" comes from Greek "orthos" (right) and "gonia" (angle), reflecting the right-angle preservation property.
Camille Jordan (1870) developed the canonical form that bears his name, giving a complete classification of linear transformations via nilpotent matrices and diagonal blocks.
Named after Charles Hermite (1822-1901), who studied quadratic forms over the complex numbers. Hermitian matrices are fundamental in quantum mechanics, where observables are represented by Hermitian operators.
Eigenvalues on diagonal. Easy to invert, multiply, and solve systems.
Real eigenvalues, orthogonal eigenvectors. Diagonalizable by orthogonal matrix.
Preserve lengths and angles. Q^{-1} = Q^T. Rotations and reflections.
Projections (P^2 = P) and nilpotent matrices (N^k = 0) are key in the Jordan form.
Create flashcards for: symmetric (A^T = A), orthogonal (Q^T Q = I), idempotent (P^2 = P), nilpotent (N^k = 0).
Given a matrix, practice quickly checking which special properties it has. Start with 2×2 matrices before moving to larger ones.
Orthogonal = rotations/reflections. Projections = shadows. Visualizing helps remember properties.
Many special matrices restrict possible eigenvalues: symmetric → real, orthogonal → |λ| = 1, idempotent → {0,1}, nilpotent → all 0.
In the next chapter on Determinants, you will see how special matrix structure simplifies determinant computation:
Later, in Eigenvalues, the spectral theorem will show that symmetric matrices are always diagonalizable with orthogonal eigenvectors—one of the most important results in linear algebra.
| Type | Definition | Key Property | Eigenvalues |
|---|---|---|---|
| Diagonal | a_{ij} = 0 for i ≠ j | Powers/inverse easy | Diagonal entries |
| Triangular | Zeros above/below diagonal | Products stay triangular | Diagonal entries |
| Symmetric | A^T = A | Orthogonal eigenvectors | All real |
| Skew-symmetric | A^T = -A | Zero diagonal | Purely imaginary |
| Orthogonal | Q^T Q = I | Preserves length | \|λ\| = 1 |
| Idempotent | P^2 = P | Projection | 0 or 1 |
| Nilpotent | N^k = 0 | I - N invertible | All 0 |
| Involution | A^2 = I | A^{-1} = A | ±1 |
Given , classify this matrix.
Solution:
Find the orthogonal projection onto .
Solution:
For a line spanned by a unit vector u, the projection is P = uu^T.
First normalize: , so .
Verify: P^2 = P, P^T = P, tr(P) = 1 = rank(P), and the eigenvalues are 0, 0, 1.
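Since the specific vector from this example is not reproduced here, the sketch below applies the same P = uu^T recipe to an arbitrary direction and runs the same checks:

```python
import numpy as np

w = np.array([1.0, 2.0, 2.0])           # illustrative direction only
u = w / np.linalg.norm(w)               # normalize first
P = np.outer(u, u)                      # projection onto the line spanned by u

assert np.allclose(P @ P, P)            # idempotent
assert np.allclose(P.T, P)              # symmetric
assert np.isclose(np.trace(P), 1.0)     # trace = rank = 1
assert np.allclose(np.linalg.eigvalsh(P), [0.0, 0.0, 1.0])
```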
Show that is orthogonal.
Solution:
Check that columns are orthonormal:
Thus Q^T Q = I. Also det(Q) = 1, so Q is a rotation.
Decompose into symmetric + skew-symmetric.
Solution:
Verify: S^T = S, K^T = -K, and S + K = A.
Show that the given matrix N is nilpotent and find (I - N)^{-1}.
Solution:
So N is nilpotent with index 2. Thus: (I - N)^{-1} = I + N.
Verify: (I - N)(I + N) = I - N^2 = I.
Orthogonal matrices represent rotations and reflections in 3D graphics. Because they preserve lengths and angles, objects keep their shape when transformed.
Hermitian operators represent observables with real eigenvalues. Unitary operators represent time evolution preserving probability.
Projection matrices compute least squares fits. Covariance matrices are symmetric positive semi-definite.
LU decomposition uses triangular matrices for efficient solving. QR decomposition uses orthogonal matrices for stability.
Nilpotent matrices describe systems that "stop" after finite time. Stability analysis uses eigenvalue locations.
Symmetric matrices appear in kernel methods. PCA uses eigenvectors of covariance (symmetric) matrices.
Understanding how special matrix types relate to each other:
A matrix is both orthogonal and symmetric if and only if it is a symmetric involution; its eigenvalues are ±1.
If A is orthogonal (A^T A = I) and symmetric (A^T = A), then A^2 = A^T A = I.
Conversely, if A^2 = I and A^T = A, then A^T A = A^2 = I, so A is orthogonal.
Several important decomposition theorems relate special matrix types:
| Matrix Type | Operation | General | Special |
|---|---|---|---|
| Diagonal | Inverse | O(n^3) | O(n) |
| Triangular | Solve | O(n^3) | O(n^2) |
| Symmetric | Eigenvalues | O(n^3) | O(n^3) (faster constant) |
| Orthogonal | Inverse | O(n^3) | O(n^2) (transpose) |
| Sparse | Multiply | O(n^2) | O(nnz) |
Special structure often improves numerical stability:
To solve Ax = b where A = LL^T (Cholesky factorization of a symmetric positive definite matrix): first solve Ly = b by forward substitution, then L^T x = y by back-substitution.
Total: O(n^2) per right-hand side after the one-time O(n^3) factorization, vs O(n^3) for general Gaussian elimination.
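A minimal sketch of this two-triangular-solve pattern using NumPy and SciPy's `solve_triangular` (the positive definite matrix is generated for illustration only):

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)            # symmetric positive definite test matrix
b = rng.standard_normal(5)

L = np.linalg.cholesky(A)              # O(n^3) factorization, done once
y = solve_triangular(L, b, lower=True)       # forward substitution, O(n^2)
x = solve_triangular(L.T, y, lower=False)    # back substitution, O(n^2)

assert np.allclose(A @ x, b)
```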
Special matrices can be stored more efficiently:
Special matrices have structure that simplifies computations and reveals properties. Diagonal matrices are easy to invert and power, triangular matrices make solving systems efficient, and symmetric matrices have real eigenvalues with orthogonal eigenvectors.
Orthogonal matrices represent isometries—transformations that preserve lengths and angles. In 2D/3D, these are rotations (det = 1) and reflections (det = -1). They preserve the dot product: ⟨Ax, Ay⟩ = ⟨x, y⟩.
Check if A = A^T, i.e., whether a_{ij} = a_{ji} for all i, j. Visually, check if the matrix equals its reflection across the main diagonal.
Idempotent matrices represent projections. If P² = P, then P projects vectors onto its image. Applying the projection twice gives the same result—hence P² = P. They're central in statistics and quantum mechanics.
A matrix N is nilpotent if N^k = 0 for some positive integer k. The smallest such k is the index of nilpotency. Nilpotent matrices have all eigenvalues equal to zero and appear in Jordan forms.
For triangular A, det(A - λI) is the product of (a_{ii} - λ) terms. This equals zero exactly when λ equals some diagonal entry.
Every matrix A can be uniquely written as A = S + K where S = (A + A^T)/2 is symmetric and K = (A - A^T)/2 is skew-symmetric. This is analogous to decomposing a function into even and odd parts.
Generally no! If A and B are symmetric, (AB)^T = B^T A^T = BA, which equals AB only if A and B commute. However, B^T A B is always symmetric when A is symmetric.
If A is orthogonal and symmetric, then A² = A·A^T = I. So A is an involution. The only such matrices have eigenvalues ±1: they're reflections across subspaces.
Diagonal: eigenvalues are diagonal entries. Triangular: same. Symmetric: always diagonalizable with orthogonal eigenvectors. Orthogonal: eigenvalues have |λ| = 1. This structure makes analysis much easier.
Orthogonal matrices are for real vector spaces: Q^T Q = I. Unitary matrices are for complex spaces: U* U = I where U* is conjugate transpose. Both preserve inner products in their respective spaces.
If V has orthonormal basis {v₁,...,vₖ}, then P = Σᵢ vᵢvᵢᵀ = VVᵀ where V = [v₁|...|vₖ]. If the basis isn't orthonormal, use P = V(VᵀV)⁻¹Vᵀ.
Every positive definite matrix A can be written as A = LLᵀ where L is lower triangular with positive diagonal. This is more efficient than LU for symmetric positive definite systems.
A matrix is diagonalizable iff it has n linearly independent eigenvectors. Sufficient conditions: distinct eigenvalues, or being normal (including symmetric, orthogonal, Hermitian, unitary).
Each row is a cyclic shift of the row above. Circulants are diagonalized by the DFT matrix and appear in signal processing. They commute with each other.
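A short numerical check of this diagonalization (the first column below is an arbitrary illustrative choice): the k-th DFT vector is an eigenvector of the circulant, with eigenvalue given by the k-th entry of the FFT of the first column.

```python
import numpy as np

c = np.array([4.0, 1.0, 2.0, 3.0])       # illustrative first column
n = len(c)
# Build the circulant: column k is the first column cyclically shifted down by k
C = np.column_stack([np.roll(c, k) for k in range(n)])

lam = np.fft.fft(c)                       # eigenvalues = DFT of the first column
omega = np.exp(2j * np.pi / n)
for k in range(n):
    v = omega ** (k * np.arange(n))       # k-th Fourier (DFT) vector
    assert np.allclose(C @ v, lam[k] * v)
```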