An isomorphism is a bijective linear map—the strongest type of linear map. Isomorphic vector spaces are algebraically indistinguishable: they're "the same" for linear algebra.
The concept of isomorphism is one of the most important in all of mathematics. It comes from the Greek words "isos" (equal) and "morphe" (form)—meaning "same form."
The classification theorem—that finite-dimensional spaces are isomorphic if and only if they have the same dimension—is remarkably elegant. It says that dimension is the only invariant that matters. This is why we can reduce all finite-dimensional linear algebra to studying $\mathbb{F}^n$.
An isomorphism is the "perfect" linear map—it's linear, injective, and surjective.
A linear map $T: V \to W$ is an isomorphism if it is bijective (both injective and surjective).
If such a $T$ exists, we say $V$ and $W$ are isomorphic and write $V \cong W$.
An automorphism of $V$ is an isomorphism $T: V \to V$ (from $V$ to itself).
The set of all automorphisms of $V$ is denoted $\operatorname{Aut}(V)$ or $GL(V)$.
For a linear map $T: V \to W$ between finite-dimensional spaces with $\dim V = \dim W$, the following are equivalent:
(1) $T$ is an isomorphism.
(2) $T$ is bijective.
(3) $\ker T = \{0\}$ and $\operatorname{range} T = W$.
(4) $T$ maps a basis of $V$ to a basis of $W$.
(5) $T$ is injective.
(6) $T$ is surjective.
(7) $T$ has an inverse $T^{-1}: W \to V$, which is itself linear.
(1) ⟺ (2): By definition.
(2) ⟺ (3): Injective ⟺ $\ker T = \{0\}$; surjective ⟺ $\operatorname{range} T = W$.
(3) ⟹ (4): Let $\{v_1, \dots, v_n\}$ be a basis of $V$. Since $T$ is injective, $\{Tv_1, \dots, Tv_n\}$ is linearly independent. Since $\dim W = n$, these $n$ vectors span $W$. Hence they form a basis.
(4) ⟹ (3): If $T$ maps a basis to a basis, then every $w \in W$ is a unique linear combination of the image vectors, so $T$ is bijective.
(3) ⟺ (5) ⟺ (6): By Rank-Nullity, when $\dim V = \dim W$: injective ⟺ surjective ⟺ bijective.
(2) ⟺ (7): $T$ has an inverse function iff $T$ is bijective. We must show $T^{-1}$ is linear:
$$T^{-1}(w_1 + w_2) = T^{-1}\big(T(T^{-1}w_1) + T(T^{-1}w_2)\big) = T^{-1}\big(T(T^{-1}w_1 + T^{-1}w_2)\big) = T^{-1}w_1 + T^{-1}w_2.$$
Similarly for scalar multiplication.
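As a numeric companion to the equivalences above: once bases are chosen, a linear map between spaces of the same finite dimension becomes a square matrix, and all the conditions collapse to one rank check. A minimal sketch in Python with NumPy (`is_isomorphism` is our own illustrative name, not a standard function):

```python
import numpy as np

# After choosing bases, a map between n-dimensional spaces is a square
# n x n matrix A.  The TFAE conditions become:
#   injective  <=>  null space of A is {0}  <=>  rank A = n
#   surjective <=>  column space of A is all of F^n  <=>  rank A = n
# so a single rank computation settles every condition at once.

def is_isomorphism(A):
    """Return True iff the square matrix A represents a bijective map."""
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    return n == m and np.linalg.matrix_rank(A) == n

# A shear of the plane is bijective; projection onto the x-axis is not.
shear = [[1, 1], [0, 1]]
projection = [[1, 0], [0, 0]]
```

The same rank test also decides conditions (5) and (6) individually, since over equal dimensions they coincide with (2).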
This is one of the most elegant theorems in linear algebra: dimension is the complete invariant.
Two finite-dimensional vector spaces over the same field are isomorphic if and only if they have the same dimension.
(⟹) Necessity:
Suppose $T: V \to W$ is an isomorphism. Then $T$ is injective, so $\dim \ker T = 0$, and surjective, so $\operatorname{range} T = W$. By Rank-Nullity:
$$\dim V = \dim \ker T + \dim \operatorname{range} T = 0 + \dim W = \dim W.$$
(⟸) Sufficiency:
Suppose $\dim V = \dim W = n$. Let $\{v_1, \dots, v_n\}$ be a basis for $V$ and $\{w_1, \dots, w_n\}$ be a basis for $W$.
Define $T: V \to W$ by $T(v_i) = w_i$ and extend linearly:
$$T(a_1 v_1 + \cdots + a_n v_n) = a_1 w_1 + \cdots + a_n w_n.$$
T is injective: If $T(v) = 0$, write $v = a_1 v_1 + \cdots + a_n v_n$. Then $a_1 w_1 + \cdots + a_n w_n = 0$. Since $\{w_i\}$ is a basis, all $a_i = 0$, so $v = 0$.
T is surjective: Any $w = b_1 w_1 + \cdots + b_n w_n$ equals $T(b_1 v_1 + \cdots + b_n v_n)$.
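The sufficiency construction is completely concrete once the basis vectors are written in coordinates: the map sending $v_i \mapsto w_i$ has matrix $W_{\text{mat}} V_{\text{mat}}^{-1}$, where the bases are the columns of $V_{\text{mat}}$ and $W_{\text{mat}}$. A sketch (the specific basis vectors below are our own example, both spaces taken as $\mathbb{R}^3$):

```python
import numpy as np

# Basis-to-basis construction in coordinates: if the columns of V_mat are
# a basis (v_1, v_2, v_3) and the columns of W_mat are a basis
# (w_1, w_2, w_3), then T = W_mat @ inv(V_mat) is the unique linear map
# with T v_i = w_i, and it is automatically invertible.

V_mat = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])   # columns: v_1, v_2, v_3 (det = 2)
W_mat = np.array([[2.0, 0.0, 0.0],
                  [0.0, 3.0, 0.0],
                  [0.0, 0.0, 5.0]])   # columns: w_1, w_2, w_3

T = W_mat @ np.linalg.inv(V_mat)     # sends each v_i to w_i
```

Because both factor matrices are invertible, so is `T`, matching the theorem's claim that mapping a basis to a basis always yields an isomorphism.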
Every $n$-dimensional vector space $V$ over $\mathbb{F}$ is isomorphic to $\mathbb{F}^n$:
$$V \cong \mathbb{F}^n.$$
This corollary is profound: it means we can study any finite-dimensional space using coordinates! Abstract vector spaces (polynomials, matrices, functions) can all be reduced to $\mathbb{F}^n$.
The most important isomorphism: the coordinate map with respect to a chosen basis.
Let $\mathcal{B} = (v_1, \dots, v_n)$ be an ordered basis for $V$. The coordinate map with respect to $\mathcal{B}$ is:
$$[\,\cdot\,]_{\mathcal{B}}: V \to \mathbb{F}^n, \qquad [v]_{\mathcal{B}} = (a_1, \dots, a_n) \quad \text{where } v = a_1 v_1 + \cdots + a_n v_n.$$
The tuple $(a_1, \dots, a_n)$ is called the coordinate vector of $v$ with respect to $\mathcal{B}$.
For any ordered basis $\mathcal{B}$ of $V$, the coordinate map $[\,\cdot\,]_{\mathcal{B}}$ is an isomorphism.
Linearity: If $[u]_{\mathcal{B}} = (a_1, \dots, a_n)$ and $[v]_{\mathcal{B}} = (b_1, \dots, b_n)$, then:
$$u + v = \sum_i (a_i + b_i) v_i, \qquad \text{so } [u + v]_{\mathcal{B}} = [u]_{\mathcal{B}} + [v]_{\mathcal{B}}.$$
Similarly for scalar multiplication.
Bijectivity: Every vector has unique coordinates (by definition of basis), so $[\,\cdot\,]_{\mathcal{B}}$ is bijective.
Problem: Find the coordinate isomorphism from $\mathcal{P}_2(\mathbb{R})$ to $\mathbb{R}^3$.
Solution: Using the standard basis $\mathcal{B} = (1, x, x^2)$:
$$[a + bx + cx^2]_{\mathcal{B}} = (a, b, c).$$
For example:
$$[3 - 2x + 5x^2]_{\mathcal{B}} = (3, -2, 5).$$
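A coordinate map like this is easy to sketch in code. The following assumes the space is $\mathcal{P}_2(\mathbb{R})$ with standard basis $(1, x, x^2)$, representing a polynomial as a dict from power to coefficient (`coords` and `from_coords` are our own illustrative names):

```python
# Coordinate map for P_2(R) with assumed standard basis (1, x, x^2):
# a + b x + c x^2  <->  (a, b, c).

def coords(poly):
    """[a + b x + c x^2]_B = (a, b, c); poly is a dict {power: coeff}."""
    return tuple(poly.get(k, 0) for k in range(3))

def from_coords(t):
    """Inverse of the coordinate map: rebuild the coefficient dict."""
    return {k: c for k, c in enumerate(t) if c != 0}

p = {0: 3, 1: -2, 2: 5}          # the polynomial 3 - 2x + 5x^2
```

Round-tripping through `coords` and `from_coords` illustrates bijectivity: each polynomial has exactly one coordinate vector and vice versa.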
Isomorphisms preserve all linear algebra properties. This is what makes isomorphic spaces "the same."
Let $T: V \to W$ be an isomorphism. Then:
- $\{v_1, \dots, v_k\}$ is linearly independent in $V$ ⟺ $\{Tv_1, \dots, Tv_k\}$ is linearly independent in $W$;
- $\{v_1, \dots, v_k\}$ spans $V$ ⟺ $\{Tv_1, \dots, Tv_k\}$ spans $W$;
- $\{v_1, \dots, v_k\}$ is a basis of $V$ ⟺ $\{Tv_1, \dots, Tv_k\}$ is a basis of $W$.
Suppose $\{v_1, \dots, v_k\}$ is linearly independent. If $a_1 Tv_1 + \cdots + a_k Tv_k = 0$, then:
$$T(a_1 v_1 + \cdots + a_k v_k) = 0.$$
So $a_1 v_1 + \cdots + a_k v_k \in \ker T = \{0\}$, and by independence of $\{v_i\}$, all $a_i = 0$.
The converse follows by applying the same argument to $T^{-1}$.
If $T: V \to W$ is an isomorphism and $U \subseteq V$ is a subspace, then:
$$\dim T(U) = \dim U,$$
where $T(U) = \{T(u) : u \in U\}$.
If $T: V \to W$ is an isomorphism, then $T$ restricts to an isomorphism $U \to T(U)$ for every subspace $U \subseteq V$.
Problem: Show $\mathbb{C}$ (as a real vector space) is isomorphic to $\mathbb{R}^2$.
Solution: Both have dimension 2 over $\mathbb{R}$.
Define $T: \mathbb{C} \to \mathbb{R}^2$ by $T(a + bi) = (a, b)$.
Problem: Are $\mathcal{P}_3(\mathbb{R})$ and $M_{2 \times 2}(\mathbb{R})$ isomorphic?
Solution: Yes! Both have dimension 4.
$\mathcal{P}_3(\mathbb{R})$: basis $\{1, x, x^2, x^3\}$, dimension 4.
$M_{2 \times 2}(\mathbb{R})$: basis $\{E_{11}, E_{12}, E_{21}, E_{22}\}$, dimension 4.
One isomorphism:
$$T(a + bx + cx^2 + dx^3) = \begin{pmatrix} a & b \\ c & d \end{pmatrix}.$$
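An isomorphism between a polynomial space of dimension 4 and $M_{2 \times 2}$ is, in coordinates, nothing more than a reshape of the coefficient tuple. A sketch (the function names are our own):

```python
import numpy as np

# An explicit dimension-4 isomorphism in coordinates: send the coefficient
# tuple (a, b, c, d) of a + b x + c x^2 + d x^3 to the matrix
# [[a, b], [c, d]].  Both directions are linear and inverse to each other.

def poly_to_matrix(coeffs):
    """Coefficients (a, b, c, d) -> 2x2 matrix [[a, b], [c, d]]."""
    return np.array(coeffs, dtype=float).reshape(2, 2)

def matrix_to_poly(M):
    """2x2 matrix -> flat coefficient tuple (row-major order)."""
    return tuple(np.asarray(M, dtype=float).ravel())
```

The reshape viewpoint makes the "same space, different labels" idea tangible: only the arrangement of the four numbers changes.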
Problem: Are $\mathbb{R}^3$ and $\mathcal{P}_3(\mathbb{R})$ isomorphic?
Solution: No! They have different dimensions ($3 \neq 4$).
By the Classification Theorem, no isomorphism can exist.
Problem: Is differentiation $D: \mathcal{P}_n(\mathbb{R}) \to \mathcal{P}_n(\mathbb{R})$ an isomorphism?
Solution: No, for two reasons:
1. Not injective: every constant polynomial satisfies $D(c) = 0$, so $\ker D \neq \{0\}$.
2. Not surjective: $x^n$ is not the derivative of any polynomial in $\mathcal{P}_n(\mathbb{R})$.
The automorphisms of a space form an important algebraic structure: a group.
The set of all automorphisms of $V$ forms a group under composition:
- Closure: the composition of two automorphisms is an automorphism.
- Associativity: composition of functions is associative.
- Identity: the identity map $I: V \to V$ is an automorphism.
- Inverses: if $T$ is an automorphism, so is $T^{-1}$.
For an $n$-dimensional space $V$ over $\mathbb{F}$:
$$\operatorname{Aut}(V) \cong GL_n(\mathbb{F}),$$
where $GL_n(\mathbb{F})$ is the group of invertible $n \times n$ matrices over $\mathbb{F}$.
For $V$ of dimension $n$ over a finite field $\mathbb{F}_q$:
$$|\operatorname{Aut}(V)| = (q^n - 1)(q^n - q)(q^n - q^2) \cdots (q^n - q^{n-1}).$$
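The order formula for $GL_n(\mathbb{F}_q)$ can be cross-checked by brute force for small cases. A sketch (the function name `gl_order` is our own):

```python
from itertools import product

def gl_order(q, n):
    """(q^n - 1)(q^n - q) ... (q^n - q^(n-1)): the order of GL_n(F_q)."""
    result = 1
    for k in range(n):
        result *= q**n - q**k
    return result

# Brute-force cross-check for q = 2, n = 2: a 2x2 matrix over F_2 with
# entries (a, b, c, d) is invertible iff its determinant ad - bc is
# nonzero mod 2.  Enumerate all 16 matrices and count the invertible ones.
brute = sum(
    1
    for a, b, c, d in product(range(2), repeat=4)
    if (a * d - b * c) % 2 != 0
)
```

The formula counts ordered bases: the first basis vector can be any of the $q^n - 1$ nonzero vectors, the second anything outside the $q$ multiples of the first, and so on.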
Problem: Construct an explicit isomorphism $T: \mathcal{P}_2(\mathbb{R}) \to \mathbb{R}^3$.
Solution:
Choose the basis $\{1, x, x^2\}$ for $\mathcal{P}_2(\mathbb{R})$ and the standard basis $\{e_1, e_2, e_3\}$ for $\mathbb{R}^3$.
Define $T(1) = e_1$, $T(x) = e_2$, and $T(x^2) = e_3$.
Then $T(a + bx + cx^2) = (a, b, c)$.
Verification: T maps a basis to a basis, so T is an isomorphism.
Problem: To what is the space of symmetric $2 \times 2$ matrices isomorphic?
Solution:
A symmetric $2 \times 2$ matrix has the form:
$$\begin{pmatrix} a & b \\ b & c \end{pmatrix}.$$
This requires 3 independent parameters. Dimension = 3.
Answer: $\operatorname{Sym}_2(\mathbb{R}) \cong \mathbb{R}^3$.
Problem: Is $T: \mathbb{R}^2 \to \mathbb{R}^2$ with $T(x, y) = (x + y, x - y)$ an isomorphism?
Solution:
Check injectivity: suppose $T(x, y) = (0, 0)$, so $x + y = 0$ and $x - y = 0$.
Adding: $2x = 0$, so $x = 0$. Subtracting: $2y = 0$, so $y = 0$.
So $\ker T = \{(0, 0)\}$.
Since $\dim \mathbb{R}^2 = \dim \mathbb{R}^2$ and $T$ is injective, $T$ is an isomorphism.
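A numeric sanity check of this kind of kernel computation, assuming the map in question is $T(x, y) = (x + y, x - y)$ with matrix $\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$:

```python
import numpy as np

# Assuming T(x, y) = (x + y, x - y): its matrix has det = -2 != 0, so the
# kernel is {0} and T is an isomorphism.  The inverse matrix recovers
# (x, y) = ((u + v)/2, (u - v)/2) from the image (u, v).

A = np.array([[1.0, 1.0],
              [1.0, -1.0]])        # matrix of T in the standard basis
A_inv = np.linalg.inv(A)           # exists because det(A) = -2
```

A nonzero determinant is yet another equivalent formulation of the injective/surjective/bijective test from the main theorem.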
$V \cong W$ does NOT mean $V = W$! Isomorphic spaces have the same algebraic structure but may consist of completely different objects (polynomials vs. matrices).
An isomorphism is a function, not an equality. There are typically infinitely many different isomorphisms between isomorphic spaces.
The isomorphism $V \cong \mathbb{F}^n$ depends on the choice of basis! Different bases give different isomorphisms. There is no "canonical" choice.
The Classification Theorem requires finite dimensions! In infinite dimensions the picture is subtler: for the spaces of analysis (normed and topological vector spaces), having "the same" dimension does not guarantee an isomorphism.
Isomorphism = bijective linear map
$V \cong W$ ⟺ $\dim V = \dim W$
Every $n$-dim space ≅ $\mathbb{F}^n$
Isomorphisms preserve all linear algebra properties
Problem 1
Find all vector spaces (up to isomorphism) of dimension 3 over $\mathbb{F}$. How many are there?
Problem 2
Prove that the space of $n \times n$ upper triangular matrices is isomorphic to $\mathbb{F}^{n(n+1)/2}$.
Problem 3
Let $T: V \to V$ be an automorphism. Show that if $T^k = I$ for some $k \geq 1$, then $T^{-1} = T^{k-1}$.
Problem 4
Show that the space of solutions to $y'' + y = 0$ is isomorphic to $\mathbb{R}^2$.
Problem 5
If $T: V \to V$ is an automorphism of a finite-dimensional space and $U \subseteq V$ is a subspace with $T(U) \subseteq U$, prove that $T|_U$ is an automorphism of $U$.
Problem 6
Let $V = \mathbb{F}_2^2$. Count the number of automorphisms $T: V \to V$ such that $T^2 = I$ (involutions).
Problem: Find the dimension of the space of trace-free $2 \times 2$ matrices. To what is it isomorphic?
Solution:
A matrix $A$ is trace-free if $\operatorname{tr}(A) = a_{11} + a_{22} = 0$.
This is one linear constraint on the 4 entries.
So $\dim = 4 - 1 = 3$.
Answer: Isomorphic to $\mathbb{R}^3$.
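The "one linear constraint" argument is exactly a rank-nullity computation: the trace is a single linear functional on the 4-dimensional space of $2 \times 2$ matrices, and the trace-free matrices are its null space. A sketch:

```python
import numpy as np

# Flatten a 2x2 matrix to the vector (a11, a12, a21, a22).  The trace
# functional is then the single row vector (1, 0, 0, 1), and the
# trace-free matrices form its null space:
#   dim(null space) = 4 - rank = 4 - 1 = 3.

trace_row = np.array([[1.0, 0.0, 0.0, 1.0]])
null_dim = 4 - np.linalg.matrix_rank(trace_row)
```

The same reasoning gives $n^2 - 1$ for trace-free $n \times n$ matrices: still one independent constraint, whatever the size.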
Problem: Show $\mathcal{P}_n = E \oplus O$, where $E$ = even polynomials, $O$ = odd polynomials.
Solution:
Any polynomial can be written as
$$p(x) = \frac{p(x) + p(-x)}{2} + \frac{p(x) - p(-x)}{2}.$$
The first part is even, the second is odd.
If $p \in E \cap O$, then $p(x) = p(-x) = -p(x)$, so $p = 0$.
Thus $E \cap O = \{0\}$ and $\mathcal{P}_n = E \oplus O$.
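On coefficient lists the even/odd decomposition is just a split by index parity, since $(p(x) + p(-x))/2$ keeps exactly the even-degree terms. A sketch (`even_odd_split` is our own name):

```python
# Even/odd decomposition on coefficient lists (index = power of x).
# Keeping even-degree coefficients realizes (p(x) + p(-x))/2; keeping
# odd-degree coefficients realizes (p(x) - p(-x))/2.

def even_odd_split(coeffs):
    even = [c if k % 2 == 0 else 0 for k, c in enumerate(coeffs)]
    odd = [c if k % 2 == 1 else 0 for k, c in enumerate(coeffs)]
    return even, odd

p = [1, 4, 0, -2, 7]             # 1 + 4x - 2x^3 + 7x^4
even, odd = even_odd_split(p)
```

The direct-sum property is visible in code: the two parts overlap nowhere (one is zero wherever the other is not) and add back to `p`.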
Problem: Is $\mathcal{L}(V, W)$ isomorphic to $M_{m \times n}(\mathbb{F})$, where $\dim V = n$ and $\dim W = m$?
Solution:
Same dimension ($\dim \mathcal{L}(V, W) = mn = \dim M_{m \times n}$), so yes, they are isomorphic!
In fact, choosing bases gives an explicit isomorphism (the matrix representation).
Problem: Show $M_{n \times n} = \operatorname{Sym} \oplus \operatorname{Skew}$, where Sym = symmetric matrices, Skew = skew-symmetric matrices.
Solution:
Every matrix:
$$A = \frac{A + A^T}{2} + \frac{A - A^T}{2}.$$
First part is symmetric, second is skew-symmetric.
If $A \in \operatorname{Sym} \cap \operatorname{Skew}$: $A^T = A$ and $A^T = -A$, so $A = 0$.
$$S = \frac{A + A^T}{2}, \qquad K = \frac{A - A^T}{2}.$$
Check: $S + K = A$. ✓
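The symmetric/skew decomposition is a one-liner to verify numerically. A sketch with a concrete matrix of our own choosing:

```python
import numpy as np

# Decomposition A = (A + A^T)/2 + (A - A^T)/2 for a concrete 2x2 matrix:
# the first term is symmetric, the second skew-symmetric, and they sum
# back to A.

A = np.array([[1.0, 7.0],
              [3.0, 4.0]])
S = (A + A.T) / 2                # symmetric part
K = (A - A.T) / 2                # skew-symmetric part
```

The three assertions below mirror the three claims of the proof: symmetry of $S$, skew-symmetry of $K$, and $S + K = A$.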
On the class of vector spaces over a fixed field $\mathbb{F}$, the relation $\cong$ (isomorphism) is an equivalence relation:
Reflexive: The identity map is an isomorphism.
Symmetric: If $T: V \to W$ is an isomorphism, then $T^{-1}: W \to V$ is also an isomorphism.
Transitive: If $T: V \to W$ and $S: W \to U$ are isomorphisms, then $S \circ T: V \to U$ is an isomorphism (composition of bijections is a bijection, composition of linear maps is linear).
The equivalence classes under $\cong$ are precisely the sets of spaces with the same dimension. For finite-dimensional spaces, each equivalence class has exactly one representative of the form $\mathbb{F}^n$.
The isomorphism $V \cong \mathbb{F}^n$ allows us to switch between:
- abstract, basis-free arguments about $V$, and
- concrete coordinate computations in $\mathbb{F}^n$.
Many proofs work both ways; choose whichever is more convenient!
The solution space of a linear homogeneous ODE of order $n$ is isomorphic to $\mathbb{R}^n$.
Fourier transforms establish isomorphisms between spaces of functions and spaces of their frequency coefficients (the time domain and the frequency domain).
In quantum mechanics, different "pictures" of the same dynamics (such as the Schrödinger and Heisenberg pictures) are related by isomorphisms.
If $\mathcal{B}$ and $\mathcal{C}$ are two bases for $V$, the coordinate isomorphisms give:
$$[\,\cdot\,]_{\mathcal{C}} \circ [\,\cdot\,]_{\mathcal{B}}^{-1}: \mathbb{F}^n \to \mathbb{F}^n.$$
This is the "change of basis matrix." It's an automorphism of $\mathbb{F}^n$.
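A concrete sketch of a change of basis in $\mathbb{R}^2$ (the two bases below are our own example): writing the basis vectors as columns of `B_mat` and `C_mat`, the identity $v = B_{\text{mat}}[v]_{\mathcal{B}} = C_{\text{mat}}[v]_{\mathcal{C}}$ gives $[v]_{\mathcal{C}} = C_{\text{mat}}^{-1} B_{\text{mat}} [v]_{\mathcal{B}}$.

```python
import numpy as np

# Change of basis in R^2: if the columns of B_mat and C_mat are the two
# bases, then P = inv(C_mat) @ B_mat converts B-coordinates into
# C-coordinates, since v = B_mat [v]_B = C_mat [v]_C.

B_mat = np.array([[1.0, 1.0],
                  [0.0, 1.0]])    # basis B: (1, 0) and (1, 1)
C_mat = np.array([[2.0, 0.0],
                  [0.0, 1.0]])    # basis C: (2, 0) and (0, 1)

P = np.linalg.inv(C_mat) @ B_mat  # the change of basis matrix
```

Because both basis matrices are invertible, `P` is too, confirming that the change of basis map is an automorphism of $\mathbb{R}^2$.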
For any linear map $T: V \to W$:
$$V / \ker T \cong \operatorname{range} T.$$
This generalizes Rank-Nullity and is fundamental in algebra.
For finite-dimensional $V$:
$$V \cong V^* \cong V^{**}.$$
The isomorphism $V \cong V^{**}$ is canonical (doesn't depend on basis), while $V \cong V^*$ requires choosing a basis.
| Concept | Definition/Result |
|---|---|
| Isomorphism | Bijective linear map $T: V \to W$ |
| Automorphism | Isomorphism $T: V \to V$ |
| Classification | $V \cong W$ ⟺ $\dim V = \dim W$ |
| Canonical form | $V \cong \mathbb{F}^n$ where $n = \dim V$ |
| Coordinate map | $[\,\cdot\,]_{\mathcal{B}}: V \to \mathbb{F}^n$ is an isomorphism |
| Inverse is linear | $T^{-1}$ is automatically linear |
| Aut(V) | Group under composition, $\operatorname{Aut}(V) \cong GL_n(\mathbb{F})$ |
Before constructing an isomorphism, always check dimensions. If $\dim V \neq \dim W$, no isomorphism exists—stop there!
To construct an isomorphism explicitly: pick bases for both spaces and define T to map one basis to the other. This always works when dimensions match!
When $\dim V = \dim W$ (finite), you only need to check ONE of: injective, surjective, kernel = $\{0\}$, or rank = dim. They're all equivalent!
Isomorphic spaces have the same subspace lattice, same dimensions, same linear map behavior. Use this to transfer knowledge between spaces.
Problem: Let $V = \mathbb{F}_2^3$ (the 3-dimensional space over the field with 2 elements). How many 1-dimensional subspaces does V have?
Hint: Count non-zero vectors, then group by span.
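The hint's strategy can be carried out by brute force, since $\mathbb{F}_2^3$ is a finite set. A sketch that enumerates nonzero vectors and groups them by span (over $\mathbb{F}_2$, the span of $v$ is just $\{0, v\}$):

```python
from itertools import product

# Enumerate F_2^3 = all 0/1 triples.  Each nonzero vector v spans the
# 1-dimensional subspace {0, v}; collecting the spans into a set of
# frozensets groups vectors with the same span together.

vectors = [v for v in product(range(2), repeat=3) if any(v)]
lines = {frozenset({(0, 0, 0), v}) for v in vectors}
```

Comparing `len(vectors)` with `len(lines)` shows how many nonzero vectors share each span, which is the heart of the counting argument.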
Problem: Prove that no isomorphism $V \to V^*$ is "natural" (i.e., commutes with all automorphisms of V).
Hint: Consider what happens when you scale a basis vector.
Problem: Find all possible orders of automorphisms of $\mathbb{R}^2$ (i.e., the smallest $n$ such that $T^n = I$).
Hint: Think about rotations.
An isomorphism $T: V \to \mathbb{F}^n$ is essentially choosing a coordinate system for V.
Automorphisms of $\mathbb{R}^2$ are invertible linear transformations: rotations, reflections, scalings, and shears.
All invertible combinations of these are automorphisms.
Isomorphism is the fundamental equivalence relation in linear algebra: it identifies spaces that share the same structure, and dimension is the complete invariant that classifies them.
The concept of isomorphism emerged in the 19th century as mathematicians began to understand that mathematical structures with the same "form" should be treated as equivalent.
Emmy Noether (1882-1935) was instrumental in developing the abstract approach to algebra that made isomorphism a central concept. Her work on invariant theory and abstract algebra laid the foundations for modern linear algebra.
The classification theorem—that dimension is the complete invariant—is a beautiful example of how abstract thinking can simplify mathematics. Instead of studying infinitely many different vector spaces, we only need to understand $\mathbb{F}^n$ for each dimension $n$.
With isomorphisms mastered, you're ready for matrix representations of linear maps and change of basis.
The isomorphism concept will reappear constantly—it's the key to understanding when two mathematical objects are "the same" for structural purposes.
Isomorphic spaces are 'the same' for all linear algebra purposes. They have identical algebraic structure—same dimension, same types of subspaces, same linear maps. Only the 'names' of elements differ.
It means every finite-dimensional space can be studied using coordinates. Abstract theorems about V translate to concrete matrix computations in Fⁿ. This is the bridge between abstract and computational linear algebra.
No! It depends on the choice of basis. Different bases give different isomorphisms. The set of all isomorphisms V → Fⁿ is in bijection with the set of all ordered bases of V.
The situation is more subtle. Two infinite-dimensional spaces may have 'bases' of the same cardinality but behave very differently. Functional analysis handles these cases carefully.
A change of basis in V corresponds to composing with an automorphism of V. The matrix of T in a new basis is related to the old matrix by similarity: A' = P⁻¹AP.
An automorphism is an isomorphism from a space to itself (T: V → V bijective linear). The set of all automorphisms of V forms a group under composition, denoted Aut(V) or GL(V).
Isomorphisms preserve: linear independence, spanning, bases, dimension, subspace structure, rank of collections of vectors, solutions to linear equations—essentially all linear algebra properties.
As a linear map, yes (think of shearing R²). But algebraically, isomorphic spaces are indistinguishable. Geometry (lengths, angles) requires additional structure (inner products) beyond just linear algebra.
Show they have different dimensions! For finite-dimensional spaces over the same field, this is the only obstruction. If dimensions differ, no isomorphism can exist.
When we choose bases for V and W, an isomorphism T: V → W corresponds to an invertible matrix. Conversely, every invertible matrix defines an isomorphism. The groups GL(V) and GL_n(F) are isomorphic!