Matrices are the computational representation of linear maps. Understanding matrix operations and special matrix types is essential for both theory and computation.
Let $T: V \to W$ be a linear map, $\mathcal{B} = (v_1, \ldots, v_n)$ a basis of $V$, and $\mathcal{C} = (w_1, \ldots, w_m)$ a basis of $W$.
The matrix of $T$ with respect to $\mathcal{B}$ and $\mathcal{C}$, denoted $[T]_{\mathcal{C} \leftarrow \mathcal{B}}$, is the $m \times n$ matrix whose $j$-th column is $[T(v_j)]_{\mathcal{C}}$ (the coordinate vector of $T(v_j)$ in basis $\mathcal{C}$).
If $A = [T]_{\mathcal{C} \leftarrow \mathcal{B}}$ and $v \in V$, then:
$$[T(v)]_{\mathcal{C}} = A\,[v]_{\mathcal{B}}.$$
For instance, let $T: \mathbb{R}^2 \to \mathbb{R}^2$ be defined by $T(x, y) = (x + 2y,\; 3x + 4y)$.
Using the standard basis $e_1 = (1, 0)$, $e_2 = (0, 1)$: $T(e_1) = (1, 3)$ and $T(e_2) = (2, 4)$.
So the matrix is:
$$[T] = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}.$$
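As a quick numerical check, one can build this matrix column by column by applying $T$ to each standard basis vector (a minimal NumPy sketch; the map is the illustrative $T$ above):

```python
import numpy as np

# Illustrative map from the example above: T(x, y) = (x + 2y, 3x + 4y)
def T(v):
    x, y = v
    return np.array([x + 2*y, 3*x + 4*y])

# Column j of [T] is T(e_j): apply T to each standard basis vector
basis = np.eye(2)
M = np.column_stack([T(e) for e in basis])
print(M)  # [[1. 2.]
          #  [3. 4.]]
```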
If $T: U \to V$ and $S: V \to W$ are linear maps, then:
$$[S \circ T]_{\mathcal{C} \leftarrow \mathcal{A}} = [S]_{\mathcal{C} \leftarrow \mathcal{B}}\,[T]_{\mathcal{B} \leftarrow \mathcal{A}},$$
where $\mathcal{A}, \mathcal{B}, \mathcal{C}$ are bases of $U, V, W$ respectively.
The matrix representation of a linear map depends on the choice of bases. Changing bases changes the matrix, but the underlying linear map remains the same.
For matrices $A$ and $B$ over a field $F$:
If $A$ is $m \times n$ and $B$ is $n \times p$, then $AB$ is $m \times p$ with:
$$(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}.$$
This is the dot product of row $i$ of $A$ with column $j$ of $B$.
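The entry formula can be verified directly against NumPy's built-in product; the sketch below is illustrative, with arbitrary test matrices:

```python
import numpy as np

def matmul_entries(A, B):
    """Compute AB entry by entry: (AB)_ij = sum_k A_ik * B_kj."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))  # row i . column j
    return C

A = np.arange(6).reshape(2, 3)    # 2x3 test matrix
B = np.arange(12).reshape(3, 4)   # 3x4 test matrix
assert np.allclose(matmul_entries(A, B), A @ B)  # agrees with the built-in product
```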
Matrix multiplication satisfies: $(AB)C = A(BC)$ (associativity), $A(B + C) = AB + AC$ and $(A + B)C = AC + BC$ (distributivity), and $I_m A = A = A I_n$ for an $m \times n$ matrix $A$ (identity).
Note: Matrix multiplication is NOT commutative in general.
Let $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$.
Then:
$$AB = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}, \qquad BA = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}.$$
So $AB \neq BA$.
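The same check in NumPy (values as above):

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
B = np.array([[1, 0], [1, 1]])
print(A @ B)   # [[2 1], [1 1]]
print(B @ A)   # [[1 1], [1 2]]  -- different, so AB != BA
```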
The transpose of an $m \times n$ matrix $A$, denoted $A^T$, is the $n \times m$ matrix with:
$$(A^T)_{ij} = A_{ji}.$$
For matrices $A$ and $B$ (of compatible sizes) and scalar $c$: $(A^T)^T = A$, $(A + B)^T = A^T + B^T$, $(cA)^T = c\,A^T$, and $(AB)^T = B^T A^T$.
For $A \in F^{m \times n}$ and $x \in F^n$:
$$Ax = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n,$$
where $a_j$ is the $j$-th column of $A$. So $Ax$ is a linear combination of the columns of $A$.
The column space of $A$ is $\operatorname{span}(a_1, \ldots, a_n)$, which equals $\operatorname{im}(T)$ where $T: F^n \to F^m$ is the map $T(x) = Ax$.
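A short NumPy sketch of the column-combination view, with arbitrary illustrative values for $A$ and $x$:

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [4., 0., 1.]])
x = np.array([2., -1., 3.])

# Ax as a linear combination of the columns of A: sum_j x_j * a_j
combo = sum(x[j] * A[:, j] for j in range(A.shape[1]))
assert np.allclose(A @ x, combo)
```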
A square matrix $D$ is diagonal if $D_{ij} = 0$ for all $i \neq j$.
We write $D = \operatorname{diag}(d_1, \ldots, d_n)$.
A square matrix $A$ is: upper triangular if $A_{ij} = 0$ whenever $i > j$, and lower triangular if $A_{ij} = 0$ whenever $i < j$.
A square matrix $A$ is: symmetric if $A^T = A$, and skew-symmetric if $A^T = -A$.
Every square matrix $A$ can be uniquely written as:
$$A = S + K, \qquad S = \tfrac{1}{2}(A + A^T), \quad K = \tfrac{1}{2}(A - A^T),$$
where $S$ is symmetric and $K$ is skew-symmetric.
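A quick numerical check of this decomposition on a random matrix (illustrative sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # skew-symmetric part

assert np.allclose(S, S.T)     # S is symmetric
assert np.allclose(K, -K.T)    # K is skew-symmetric
assert np.allclose(A, S + K)   # together they recover A
```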
A square matrix $Q$ is orthogonal if:
$$Q^T Q = Q Q^T = I.$$
Equivalently, $Q^{-1} = Q^T$.
For an orthogonal matrix $Q$: the columns of $Q$ form an orthonormal set, $\langle Qx, Qy \rangle = \langle x, y \rangle$ and $\|Qx\| = \|x\|$ for all $x, y$, and $\det Q = \pm 1$.
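For instance, a 2D rotation matrix is orthogonal; the sketch below (illustrative, with an arbitrary angle) checks $Q^T Q = I$, length preservation, and $\det Q = \pm 1$:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta

x = np.array([3.0, -4.0])
assert np.allclose(Q.T @ Q, np.eye(2))                       # Q^T Q = I
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # lengths preserved
assert np.isclose(abs(np.linalg.det(Q)), 1.0)                # det = +-1 (here +1)
```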
For complex matrices, the corresponding notions use the conjugate transpose $A^* = \overline{A}^{\,T}$: a square matrix $A$ is Hermitian if $A^* = A$, skew-Hermitian if $A^* = -A$, and unitary if $A^* A = A A^* = I$.
When we change bases, the matrix representation of a linear map changes. Understanding this relationship is crucial for diagonalization, similarity, and many other applications.
Let $T: V \to V$ be a linear operator, and let $\mathcal{B}$ and $\mathcal{B}'$ be two bases of $V$. If $P$ is the change of basis matrix from $\mathcal{B}'$ to $\mathcal{B}$ (so that $[v]_{\mathcal{B}} = P\,[v]_{\mathcal{B}'}$ for all $v$), then:
$$[T]_{\mathcal{B}'} = P^{-1}\,[T]_{\mathcal{B}}\,P.$$
For $v \in V$, let $x = [v]_{\mathcal{B}}$ and $x' = [v]_{\mathcal{B}'}$ be its coordinate vectors. Then $x = P x'$.
Also: $[T(v)]_{\mathcal{B}} = [T]_{\mathcal{B}}\,x = [T]_{\mathcal{B}}\,P\,x'$, while $[T(v)]_{\mathcal{B}} = P\,[T(v)]_{\mathcal{B}'} = P\,[T]_{\mathcal{B}'}\,x'$.
Since this holds for all $x'$, we get $P\,[T]_{\mathcal{B}'} = [T]_{\mathcal{B}}\,P$, so $[T]_{\mathcal{B}'} = P^{-1}\,[T]_{\mathcal{B}}\,P$.
Two $n \times n$ matrices $A$ and $B$ are similar if there exists an invertible matrix $P$ such that:
$$B = P^{-1} A P.$$
Similar matrices represent the same linear operator in different bases.
Similarity is an equivalence relation: $A \sim A$ (take $P = I$); if $A \sim B$ then $B \sim A$ (replace $P$ by $P^{-1}$); and if $A \sim B$ and $B \sim C$ then $A \sim C$ (multiply the two invertible matrices).
If $A$ and $B$ are similar, then: $\det A = \det B$, $\operatorname{tr} A = \operatorname{tr} B$, $\operatorname{rank} A = \operatorname{rank} B$, and $A$ and $B$ have the same characteristic polynomial and hence the same eigenvalues.
For instance, let $T: \mathbb{R}^2 \to \mathbb{R}^2$ be $T(x, y) = (3x + y,\; x + 3y)$.
In the standard basis $\mathcal{E}$: $A = [T]_{\mathcal{E}} = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$.
In the basis $\mathcal{B} = \big((1, 1),\, (1, -1)\big)$: $T(1, 1) = (4, 4) = 4\,(1, 1)$ and $T(1, -1) = (2, -2) = 2\,(1, -1)$.
Change of basis matrix: $P = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, with $P^{-1} = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$.
Then $[T]_{\mathcal{B}} = P^{-1} A P = \begin{pmatrix} 4 & 0 \\ 0 & 2 \end{pmatrix}$ (diagonal!).
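The same computation numerically, reusing the values from this example, together with a check that the two similar matrices share trace and determinant:

```python
import numpy as np

A = np.array([[3., 1.],
              [1., 3.]])   # [T] in the standard basis
P = np.array([[1., 1.],
              [1., -1.]])  # columns are the new basis vectors

B = np.linalg.inv(P) @ A @ P   # [T] in the new basis
print(B)                       # diag(4, 2) up to rounding

# Similar matrices share trace and determinant (and eigenvalues)
assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
```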
We can raise matrices to powers and evaluate polynomials at matrices. These operations are fundamental for solving matrix equations and understanding matrix functions.
For a square matrix $A$ and positive integer $k$:
$$A^k = \underbrace{A \cdot A \cdots A}_{k \text{ factors}}.$$
We define $A^0 = I$ (the identity matrix).
For a square matrix $A$ and nonnegative integers $j, k$: $A^j A^k = A^{j+k}$ and $(A^j)^k = A^{jk}$. (For two square matrices, $(AB)^k = A^k B^k$ only if $A$ and $B$ commute.)
For a polynomial $p(t) = a_0 + a_1 t + \cdots + a_m t^m$ and a square matrix $A$, define:
$$p(A) = a_0 I + a_1 A + \cdots + a_m A^m.$$
For example, if $p(t) = t^2 - 3t + 2$ and $A = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}$, then:
$$p(A) = A^2 - 3A + 2I = \begin{pmatrix} 1 & 3 \\ 0 & 4 \end{pmatrix} - \begin{pmatrix} 3 & 3 \\ 0 & 6 \end{pmatrix} + \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$
For polynomials $p$ and $q$, and a square matrix $A$: $(p + q)(A) = p(A) + q(A)$ and $(pq)(A) = p(A)\,q(A)$; in particular, $p(A)$ and $q(A)$ always commute.
If $A = P D P^{-1}$ with $D$ diagonal, then:
$$A^k = P D^k P^{-1}.$$
Since $D^k$ is just the diagonal matrix with the diagonal entries raised to the $k$-th power, this makes computing $A^k$ much easier.
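A minimal sketch of the power trick: diagonalize once with numpy.linalg.eig, power only the diagonal entries, and compare against a direct computation (the matrix here is an arbitrary symmetric example):

```python
import numpy as np

A = np.array([[3., 1.],
              [1., 3.]])
k = 10

# Diagonalize: A = P D P^{-1}, eigenvalues on the diagonal of D
eigvals, P = np.linalg.eig(A)
Dk = np.diag(eigvals ** k)         # D^k: just power the diagonal entries
Ak = P @ Dk @ np.linalg.inv(P)     # A^k = P D^k P^{-1}

assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```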
Block matrices partition a matrix into submatrices (blocks). This perspective simplifies many computations and reveals structural properties.
A block matrix (or partitioned matrix) is a matrix written as:
$$A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & \ddots & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{pmatrix},$$
where each $A_{ij}$ is a submatrix (block) of appropriate size.
If $A$ and $B$ are block matrices with compatible block sizes, then $AB$ can be computed block-wise:
$$(AB)_{ij} = \sum_{k} A_{ik} B_{kj},$$
provided the block dimensions are compatible for multiplication.
For $A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}$ and $B = \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix}$ (with compatible blocks):
$$AB = \begin{pmatrix} A_{11}B_{11} + A_{12}B_{21} & A_{11}B_{12} + A_{12}B_{22} \\ A_{21}B_{11} + A_{22}B_{21} & A_{21}B_{12} + A_{22}B_{22} \end{pmatrix}.$$
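The block formula can be checked with numpy.block, which assembles a matrix from a nested list of blocks; the block sizes below are arbitrary but compatible:

```python
import numpy as np

rng = np.random.default_rng(1)
# Random blocks with compatible sizes
A11, A12 = rng.random((2, 3)), rng.random((2, 2))
A21, A22 = rng.random((3, 3)), rng.random((3, 2))
B11, B12 = rng.random((3, 4)), rng.random((3, 1))
B21, B22 = rng.random((2, 4)), rng.random((2, 1))

A = np.block([[A11, A12], [A21, A22]])   # 5x5
B = np.block([[B11, B12], [B21, B22]])   # 5x5

# Block-wise product, e.g. the top-left block of AB:
top_left = A11 @ B11 + A12 @ B21
assert np.allclose((A @ B)[:2, :4], top_left)
```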
A block diagonal matrix has the form:
$$A = \begin{pmatrix} A_1 & & & \\ & A_2 & & \\ & & \ddots & \\ & & & A_k \end{pmatrix},$$
where $A_1, \ldots, A_k$ are square matrices and the off-diagonal blocks are zero.
If $A$ is block diagonal, then: $\det A = \det(A_1)\det(A_2)\cdots\det(A_k)$; $A^m = \operatorname{diag}(A_1^m, \ldots, A_k^m)$; and $A$ is invertible if and only if each $A_i$ is, with $A^{-1} = \operatorname{diag}(A_1^{-1}, \ldots, A_k^{-1})$.
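A small numerical illustration of these properties (arbitrary blocks):

```python
import numpy as np

A1 = np.array([[2., 1.],
               [0., 3.]])
A2 = np.array([[4.]])
Z12 = np.zeros((2, 1))
Z21 = np.zeros((1, 2))

A = np.block([[A1, Z12],
              [Z21, A2]])   # block diagonal with blocks A1, A2

# Determinant is the product of the block determinants
assert np.isclose(np.linalg.det(A), np.linalg.det(A1) * np.linalg.det(A2))

# Inverse is the block diagonal matrix of the block inverses
Ainv_blocks = np.block([[np.linalg.inv(A1), Z12],
                        [Z21, np.linalg.inv(A2)]])
assert np.allclose(np.linalg.inv(A), Ainv_blocks)
```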
A block upper triangular matrix has the form:
$$M = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix},$$
where $A$ and $C$ are square blocks and the lower-left block is zero.
Its determinant is $\det M = \det(A)\,\det(C)$.
Block matrices are useful for: organizing large matrices with internal structure, exploiting zero blocks to save computation, representing direct sums of linear maps, and setting up inductive arguments on matrix size.
It's designed so that [S ∘ T] = [S][T]—the matrix of a composition is the product of matrices. The row-column dot product formula follows directly from how coordinates transform under composition of linear maps.
Geometrically: rotate then reflect ≠ reflect then rotate. Algebraically: the composition of linear maps isn't commutative, and matrix multiplication represents composition. Even when both AB and BA are defined, they usually give different results.
Similar matrices represent the same linear operator in different bases. They share many properties: determinant, trace, eigenvalues, characteristic polynomial, and rank. If B = P⁻¹AP, then A and B are similar, connected by the change of basis matrix P.
Special matrices have structure that simplifies computations and reveals properties. Diagonal matrices are easy to invert and power, triangular matrices make solving systems efficient, and symmetric matrices have real eigenvalues with orthogonal eigenvectors.
Orthogonal matrices represent isometries—transformations that preserve lengths and angles. In 2D/3D, these are rotations (det = 1) and reflections (det = -1). They preserve the dot product: ⟨Ax, Ay⟩ = ⟨x, y⟩.