The inverse of a matrix undoes its action. Invertible matrices correspond to bijective linear maps.
Just as a bijective function has an inverse that "undoes" it, an invertible linear map has an inverse map. The matrix inverse is the matrix representation of this inverse map.
Let $A$ be an $n \times n$ matrix. If there exists a matrix $B$ such that:

$$AB = BA = I_n$$

then $A$ is called invertible (or nonsingular), and $B$ is called the inverse of $A$, denoted $A^{-1}$.
A square matrix that is NOT invertible is called singular. The terminology comes from the fact that singular matrices represent "exceptional" or "degenerate" cases.
If $A$ is invertible, then its inverse is unique.
Suppose $B$ and $C$ both satisfy $AB = BA = I$ and $AC = CA = I$. Then:

$$B = BI = B(AC) = (BA)C = IC = C$$

Therefore $B = C$, proving uniqueness.
A matrix is invertible if and only if the linear map it represents is an isomorphism (bijective). The inverse matrix represents the inverse map.
For square matrices $A, B$: if $AB = I$, then automatically $BA = I$, so a one-sided inverse is two-sided.

Suppose $AB = I$. We show $B$ is invertible with inverse $A$:

Step 1: Show the columns of $B$ are linearly independent.

If $Bx = 0$, then $x = Ix = (AB)x = A(Bx) = A0 = 0$.

Step 2: Since $B$ is square with linearly independent columns, it is invertible.

Step 3: From $AB = I$, right-multiply by $B^{-1}$: $A = B^{-1}$.

Therefore $BA = BB^{-1} = I$.
This fails for non-square matrices! A left inverse does not imply a right inverse. For example, take

$$A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}$$

Then $BA = I_2$ but $AB \neq I_3$.
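This asymmetry is easy to check numerically. A minimal NumPy sketch, using an illustrative tall matrix and one of its left inverses:

```python
import numpy as np

# A is 3x2 (tall, full column rank); B is 2x3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

left = B @ A    # 2x2: equals the identity, so B is a left inverse of A
right = A @ B   # 3x3: has a zero row, so it cannot be the identity

print(left)
print(right)
```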
For an $n \times n$ matrix $A$, the following are equivalent (TFAE):

1. $A$ is invertible.
2. $\det A \neq 0$.
3. $\operatorname{rank} A = n$.
4. The columns of $A$ are linearly independent.
5. $Ax = b$ has a unique solution for every $b$.
6. $A$ is a product of elementary matrices.
This theorem is central to linear algebra. It connects seemingly different concepts: invertibility, determinants, rank, independence, solvability of equations, and elementary matrices.
For invertible matrices $A, B$ and scalar $c \neq 0$:

1. $(A^{-1})^{-1} = A$
2. $(AB)^{-1} = B^{-1}A^{-1}$
3. $(cA)^{-1} = \frac{1}{c}A^{-1}$
4. $\det(A^{-1}) = \frac{1}{\det A}$
5. $(A^T)^{-1} = (A^{-1})^T$
(2): Verify directly:

$$(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AIA^{-1} = AA^{-1} = I$$

and similarly $(B^{-1}A^{-1})(AB) = I$. By uniqueness, $(AB)^{-1} = B^{-1}A^{-1}$.
(5): From $AA^{-1} = I$, take the transpose of both sides:

$$(AA^{-1})^T = (A^{-1})^T A^T = I^T = I$$

This shows $(A^{-1})^T$ is a left inverse of $A^T$. For square matrices, a left inverse is the inverse, so $(A^T)^{-1} = (A^{-1})^T$.
For an invertible matrix $A$, the cancellation law holds: if $AB = AC$, then $B = C$. Multiply both sides by $A^{-1}$ on the left: $A^{-1}AB = A^{-1}AC$, so $B = C$.
To compute $A^{-1}$, form the augmented matrix $[A \mid I]$ and row reduce until the left block becomes $I$.
Find the inverse of $A = \begin{pmatrix} 2 & 1 \\ 5 & 3 \end{pmatrix}$.

Solution: Form $[A \mid I]$ and row reduce:

$$\left[\begin{array}{cc|cc} 2 & 1 & 1 & 0 \\ 5 & 3 & 0 & 1 \end{array}\right]$$

After forward elimination the pivots are scaled to $1$; eliminating above the pivots then gives:

$$\left[\begin{array}{cc|cc} 1 & 0 & 3 & -1 \\ 0 & 1 & -5 & 2 \end{array}\right]$$

Therefore:

$$A^{-1} = \begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix}$$
Row reducing $[A \mid I]$ applies the same sequence of elementary row operations to both $A$ and $I$. If $A$ reduces to $I$, then $I$ becomes $A^{-1}$, because each elementary row operation is left-multiplication by an elementary matrix: if $E_k \cdots E_1 A = I$, then $E_k \cdots E_1 = A^{-1}$, and applying those same operations to $I$ produces $E_k \cdots E_1 I = A^{-1}$.
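The $[A \mid I]$ procedure can be sketched directly in code. A minimal NumPy implementation of Gauss-Jordan inversion with partial pivoting (the test matrix is illustrative):

```python
import numpy as np

def invert_gauss_jordan(A):
    """Invert A by row reducing the augmented matrix [A | I]."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    aug = np.hstack([A, np.eye(n)])  # form [A | I]
    for col in range(n):
        # Partial pivoting: choose the largest available pivot in this column.
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        aug[[col, pivot]] = aug[[pivot, col]]   # swap rows
        aug[col] /= aug[col, col]               # scale pivot row to 1
        for row in range(n):                    # eliminate all other rows
            if row != col:
                aug[row] -= aug[row, col] * aug[col]
    return aug[:, n:]  # left block is now I; right block is A^{-1}

A = np.array([[2.0, 1.0], [5.0, 3.0]])
A_inv = invert_gauss_jordan(A)
print(A_inv)  # the inverse of A
```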
If $A$ is invertible, the unique solution to $Ax = b$ is:

$$x = A^{-1}b$$

Multiply both sides of $Ax = b$ by $A^{-1}$ on the left: $A^{-1}Ax = A^{-1}b$, so $x = A^{-1}b$.
While the formula $x = A^{-1}b$ is elegant, in practice one solves $Ax = b$ directly by Gaussian elimination (LU decomposition) rather than forming $A^{-1}$: it is faster and numerically more stable.
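A short NumPy sketch of this advice; `np.linalg.solve` factors $A$ (LU with pivoting) and solves directly, without ever forming the inverse:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # a generic random matrix is invertible
b = rng.standard_normal(4)

x_solve = np.linalg.solve(A, b)   # preferred: LU-based direct solve
x_inv = np.linalg.inv(A) @ b      # works, but slower and less stable

print(np.allclose(x_solve, x_inv))
```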
For $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ with $ad - bc \neq 0$:

$$A^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$$

Example: for $A = \begin{pmatrix} 2 & 1 \\ 5 & 3 \end{pmatrix}$, $ad - bc = 1$, so $A^{-1} = \begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix}$.
If $D = \operatorname{diag}(d_1, \ldots, d_n)$ with all $d_i \neq 0$:

$$D^{-1} = \operatorname{diag}\!\left(\frac{1}{d_1}, \ldots, \frac{1}{d_n}\right)$$

Each diagonal entry is replaced by its reciprocal.
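Both closed forms are easy to verify numerically; a short sketch (the example matrices are illustrative):

```python
import numpy as np

def inv_2x2(A):
    """Inverse of a 2x2 matrix via the closed-form formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if np.isclose(det, 0.0):
        raise ValueError("singular 2x2 matrix")
    return np.array([[d, -b], [-c, a]]) / det

A = np.array([[2.0, 1.0], [5.0, 3.0]])   # det = 1
A_inv = inv_2x2(A)
print(A_inv @ A)                          # identity

# Diagonal matrix: invert by taking reciprocals of the diagonal.
D = np.diag([2.0, 4.0, 5.0])
D_inv = np.diag(1.0 / np.diag(D))
print(D @ D_inv)                          # identity
```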
Problem: If $A^{-1}$ and $B^{-1}$ are known, find $(AB)^{-1}$.

Solution: $(AB)^{-1} = B^{-1}A^{-1}$; note the reversed order.
Problem: For which values of $k$ is $A = \begin{pmatrix} 1 & k \\ k & 4 \end{pmatrix}$ singular?

Solution: $A$ is singular iff $\det A = 0$:

$$\det A = 4 - k^2 = 0$$

$A$ is singular when $k = 2$ or $k = -2$.
$(AB)^{-1} \neq A^{-1}B^{-1}$ in general. The correct formula is $(AB)^{-1} = B^{-1}A^{-1}$. Remember: order reverses, just like for the transpose.
Only square matrices can have (two-sided) inverses. An $m \times n$ matrix with $m \neq n$ cannot be invertible, though it may have one-sided inverses or pseudoinverses.
Many square matrices are singular (not invertible). Check the determinant, rank, or column independence before assuming invertibility.
$(A + B)^{-1} \neq A^{-1} + B^{-1}$ in general. There is no simple formula for the inverse of a sum.
$AA^{-1} = A^{-1}A = I$. The inverse undoes the matrix.

$A$ is invertible iff $\det A \neq 0$ iff $\operatorname{rank} A = n$.

$(AB)^{-1} = B^{-1}A^{-1}$. Order reverses!

Row reduce $[A \mid I]$ to $[I \mid A^{-1}]$.
| Property | Formula |
|---|---|
| Definition | $AA^{-1} = A^{-1}A = I$ |
| Double Inverse | $(A^{-1})^{-1} = A$ |
| Product Inverse | $(AB)^{-1} = B^{-1}A^{-1}$ |
| Transpose-Inverse | $(A^T)^{-1} = (A^{-1})^T$ |
| Scalar Inverse | $(cA)^{-1} = \frac{1}{c}A^{-1}$ |
| Determinant | $\det(A^{-1}) = \frac{1}{\det A}$ |
Problem 1
Compute the inverse of $\begin{pmatrix} 3 & 1 \\ 2 & 1 \end{pmatrix}$ using the 2×2 formula.
Problem 2
Prove that if $A^2 = I$ (involutory matrix), then $A^{-1} = A$.
Problem 3
If $A$ is invertible, show that $(A^n)^{-1} = (A^{-1})^n$ for all positive integers $n$.
Problem 4
Find the inverse of $\begin{pmatrix} 1 & 2 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{pmatrix}$ using row reduction.
Problem 5
Prove that if $A$ and $B$ are invertible and commute ($AB = BA$), then $A^{-1}$ and $B^{-1}$ also commute.
Problem 6
For which values of $t$ is $\begin{pmatrix} 1 & t \\ t & 1 \end{pmatrix}$ invertible?
For special matrix types:
The rotation matrix $R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ is orthogonal:

$$R(\theta)^{-1} = R(\theta)^T = R(-\theta)$$

Rotating by $\theta$ and then by $-\theta$ returns every point to its original position.
For an upper triangular matrix with nonzero diagonal entries, the inverse exists, and it is also upper triangular.
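Both facts can be checked numerically; a short NumPy sketch with an illustrative angle and triangular matrix:

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthogonal: the inverse is just the transpose.
print(np.allclose(np.linalg.inv(R), R.T))

# Upper triangular, nonzero diagonal: the inverse is upper triangular too.
U = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 5.0],
              [0.0, 0.0, 6.0]])
U_inv = np.linalg.inv(U)
print(np.allclose(U_inv, np.triu(U_inv)))  # below-diagonal part is (numerically) zero
```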
$A$ is invertible iff $\det A \neq 0$. Furthermore:

$$\det(A^{-1}) = \frac{1}{\det A}$$

The adjugate formula:

$$A^{-1} = \frac{1}{\det A}\operatorname{adj}(A)$$
$A$ is invertible iff $0$ is NOT an eigenvalue. For invertible $A$: if $Av = \lambda v$ with $v \neq 0$, then

$$A^{-1}v = \frac{1}{\lambda}v$$

Eigenvectors are the same for $A$ and $A^{-1}$; the eigenvalues of $A^{-1}$ are the reciprocals $1/\lambda$.
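A quick numerical check of the reciprocal-eigenvalue relationship (the symmetric test matrix is illustrative; `eigh` returns eigenvalues in ascending order):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                       # symmetric, so real eigenvalues

evals = np.linalg.eigh(A)[0]                     # eigenvalues of A
evals_inv = np.linalg.eigh(np.linalg.inv(A))[0]  # eigenvalues of A^{-1}

# Eigenvalues of the inverse are the reciprocals of the originals.
print(np.sort(evals_inv))
print(np.sort(1.0 / evals))
```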
$A$ is invertible iff the linear map $x \mapsto Ax$ it represents is an isomorphism, i.e., both injective (trivial kernel) and surjective (full image).
If $P$ is the change of basis matrix from a basis $\mathcal{B}$ to a basis $\mathcal{C}$, then $P^{-1}$ is the change of basis matrix back from $\mathcal{C}$ to $\mathcal{B}$.
Change of basis matrices are always invertible.
Prove that if $A$ is invertible and $u, v$ are column vectors, then:

$$(A + uv^T)^{-1} = A^{-1} - \frac{A^{-1}uv^T A^{-1}}{1 + v^T A^{-1} u}$$

(provided $1 + v^T A^{-1} u \neq 0$). This is the Sherman-Morrison formula.
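Before attempting the proof, the Sherman-Morrison identity can be sanity-checked numerically; a sketch with arbitrary random data (the shift by $3I$ just keeps $A$ comfortably invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 3.0 * np.eye(3)
u = rng.standard_normal((3, 1))
v = rng.standard_normal((3, 1))

A_inv = np.linalg.inv(A)
denom = 1.0 + (v.T @ A_inv @ u).item()   # must be nonzero for the formula

# Rank-one update of the inverse via Sherman-Morrison.
sm = A_inv - (A_inv @ u @ v.T @ A_inv) / denom
direct = np.linalg.inv(A + u @ v.T)
print(np.allclose(sm, direct))
```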
Find the inverse of the block diagonal matrix $\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}$ where $A$ and $B$ are invertible square matrices.
Prove that if $N$ is nilpotent ($N^k = 0$ for some $k$), then $I - N$ is invertible with:

$$(I - N)^{-1} = I + N + N^2 + \cdots + N^{k-1}$$
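The finite geometric series is easy to verify on a concrete nilpotent matrix; a strictly upper triangular example, for which $N^3 = 0$:

```python
import numpy as np

# A strictly upper triangular 3x3 matrix is nilpotent with N^3 = 0.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

# (I - N)^{-1} = I + N + N^2, since the series terminates.
series = np.eye(3) + N + N @ N
print((np.eye(3) - N) @ series)   # identity
```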
The set of all invertible $n \times n$ matrices over a field $\mathbb{F}$ forms a group under matrix multiplication, denoted $GL_n(\mathbb{F})$: it is closed under products and inverses, contains the identity $I$, and multiplication is associative.

$GL$ stands for "General Linear group". When $\mathbb{F} = \mathbb{R}$, we write $GL_n(\mathbb{R})$; when $\mathbb{F} = \mathbb{C}$, we write $GL_n(\mathbb{C})$.
Over $\mathbb{R}$ or $\mathbb{C}$, the set of invertible matrices is "generic": the singular matrices form the zero set of the polynomial $\det$, a measure-zero subset, so the invertible matrices are open and dense.

If $A$ is singular, then for almost any matrix $B$, the perturbed matrix $A + \epsilon B$ is invertible for all sufficiently small $\epsilon > 0$.
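A small numerical illustration of this genericity, using an illustrative rank-one (hence singular) matrix and a random perturbation:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1: the second row is twice the first

print(np.linalg.det(A))             # 0: A is singular

rng = np.random.default_rng(2)
B = rng.standard_normal((2, 2))
eps = 1e-6
det_perturbed = np.linalg.det(A + eps * B)
print(det_perturbed)                # tiny but nonzero: A + eps*B is invertible
```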
Only square matrices can be invertible. Before computing an inverse, verify the matrix is square.
After computing $A^{-1}$, always verify by checking $AA^{-1} = I$.
For $2 \times 2$ matrices, the closed-form formula $A^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ is fastest.
$(AB)^{-1} = B^{-1}A^{-1}$. This is the same reversal that happens with the transpose.
Problem: Solve $Ax = b$ where $A = \begin{pmatrix} 2 & 1 \\ 5 & 3 \end{pmatrix}$ and $b = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$.

Solution: First find $A^{-1}$:

$$A^{-1} = \begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix}$$

Then:

$$x = A^{-1}b = \begin{pmatrix} 3 - 2 \\ -5 + 4 \end{pmatrix} = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$$
Problem: Solve $AXB = C$ for $X$, where $A$, $B$, $C$ are known and $A$, $B$ are invertible.

Solution: Multiply by $A^{-1}$ on the left and $B^{-1}$ on the right:

$$X = A^{-1}CB^{-1}$$
Problem: If $A^2 + 2A - I = 0$, find $A^{-1}$.

Solution: From $A^2 + 2A - I = 0$:

$$A(A + 2I) = I$$

This shows $A + 2I$ is the inverse of $A$:

$$A^{-1} = A + 2I$$
Problem: If $AB = BA$ and both are invertible, show $A^{-1}$ commutes with $B^{-1}$.

Solution: We have $A^{-1}B^{-1} = (BA)^{-1} = (AB)^{-1} = B^{-1}A^{-1}$.
If $A$ is invertible and $Ax = 0$:

Multiply by $A^{-1}$ on the left: $x = A^{-1}Ax = A^{-1}0 = 0$, so the only solution is $x = 0$.
If represents a transformation (rotation, scaling, shear, etc.), then represents the inverse transformation that "undoes" it:
A matrix is singular (not invertible) when it "collapses" dimension: it maps $\mathbb{R}^n$ onto a lower-dimensional subspace, so distinct inputs are sent to the same output and the map cannot be undone.
The determinant represents the factor by which $A$ scales volumes: $\det A = 0$ means volumes are flattened to zero, which is exactly when no inverse exists.
Cayley's Contributions: Arthur Cayley developed the theory of matrix inverses in his 1858 paper. He recognized that invertibility corresponds to non-zero determinants and developed the adjugate formula.
Gaussian Elimination: While named after Carl Friedrich Gauss (early 1800s), the method was known to Chinese mathematicians nearly 2000 years earlier in texts like "The Nine Chapters on the Mathematical Art." It remains the standard algorithm for computing inverses.
General Linear Group: The group became central to modern mathematics through the work of Sophus Lie, Felix Klein, and others in the late 19th century. It connects linear algebra to group theory and geometry through the Erlangen program.
Numerical Computing: In the computer age, efficient and stable algorithms for matrix inversion became crucial. LU decomposition, developed by Alan Turing and others in the 1940s-50s, is now the standard approach for numerical work.
Modern Applications: Matrix inverses appear throughout science and engineering: solving systems of equations, computer graphics (inverting transformations), machine learning (solving linear regression), quantum mechanics, and control theory.
For a (possibly non-square) $m \times n$ matrix $A$: a left inverse is a matrix $L$ with $LA = I_n$, and a right inverse is a matrix $R$ with $AR = I_m$.

For an $m \times n$ matrix $A$: a left inverse exists iff $\operatorname{rank} A = n$ (full column rank), and a right inverse exists iff $\operatorname{rank} A = m$ (full row rank).
Let $A = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$. A left inverse is any $L = \begin{pmatrix} 1 & t \end{pmatrix}$ for any $t$: indeed $LA = (1) = I_1$.
Left inverses are NOT unique for non-square matrices!
The Moore-Penrose pseudoinverse $A^+$ generalizes the inverse to all matrices (including non-square and singular). It satisfies the four Penrose conditions:

$$AA^+A = A, \quad A^+AA^+ = A^+, \quad (AA^+)^T = AA^+, \quad (A^+A)^T = A^+A$$
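NumPy exposes the Moore-Penrose pseudoinverse as `np.linalg.pinv`; a short sketch checking two of the defining conditions on an illustrative tall matrix:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])          # 3x2, full column rank, not square

A_pinv = np.linalg.pinv(A)          # Moore-Penrose pseudoinverse (2x3)

print(np.allclose(A @ A_pinv @ A, A))            # A A+ A = A
print(np.allclose(A_pinv @ A @ A_pinv, A_pinv))  # A+ A A+ = A+
print(np.allclose(A_pinv @ A, np.eye(2)))        # full column rank: A+ is a left inverse
```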
Adjugate formula $A^{-1} = \frac{1}{\det A}\operatorname{adj}(A)$: theoretically important, but LU decomposition is faster in practice.
Computer graphics: inverse transformations for camera models, view projections, and animation.

Cryptography: the Hill cipher uses matrix multiplication for encryption and the matrix inverse for decryption.

Statistics: the inverse of the covariance matrix (the precision matrix) appears in multivariate distributions and regression.
Now that you understand matrix inverses, the next topics build on this foundation.
These concepts together form the theoretical backbone of computational linear algebra.