LA-3.4

Isomorphisms

An isomorphism is a bijective linear map—the strongest type of linear map. Isomorphic vector spaces are algebraically indistinguishable: they're "the same" for linear algebra.

Learning Objectives
  • Define isomorphism as bijective linear map
  • Characterize isomorphisms via kernel and image
  • Prove V ≅ Fⁿ for any n-dimensional V
  • Understand isomorphism as 'same algebraic structure'
  • Prove the Classification Theorem: same dimension ⟺ isomorphic
  • Understand the coordinate isomorphism and its significance
  • Recognize automorphisms (self-isomorphisms)
  • Apply isomorphisms to solve problems
  • Understand how isomorphisms preserve linear properties
  • Connect isomorphisms to change of basis
Prerequisites
  • Rank-Nullity Theorem (LA-3.3)
  • Kernel and image of linear maps (LA-3.2)
  • Basis and dimension (LA-2.4)
  • Injectivity and surjectivity characterizations
  • Linear maps and their properties (LA-3.1)
Historical Context

The concept of isomorphism is one of the most important in all of mathematics. It comes from the Greek words "isos" (equal) and "morphe" (form)—meaning "same form."

The classification theorem—that finite-dimensional spaces are isomorphic if and only if they have the same dimension—is remarkably elegant. It says that dimension is the only invariant that matters. This is why we can reduce all finite-dimensional linear algebra to studying $F^n$.

1. Definition of Isomorphism

An isomorphism is the "perfect" linear map—it's linear, injective, and surjective.

Definition 3.8: Isomorphism

A linear map $T: V \to W$ is an isomorphism if it is bijective (both injective and surjective).

If such a $T$ exists, we say $V$ and $W$ are isomorphic and write $V \cong W$.

Remark 3.12: Key Points
  • Isomorphism is a property of a map
  • Isomorphic is a relationship between spaces
  • In the context of linear maps, bijective = invertible
  • The inverse of an isomorphism is also an isomorphism
Definition 3.9: Automorphism

An automorphism of $V$ is an isomorphism $T: V \to V$ (from $V$ to itself).

The set of all automorphisms of $V$ is denoted $\text{Aut}(V)$ or $GL(V)$.

2. Characterizations of Isomorphisms

Theorem 3.35: Isomorphism Characterization

For a linear map $T: V \to W$ with $V$ finite-dimensional, the following are equivalent:

  1. $T$ is an isomorphism
  2. $T$ is bijective
  3. $\ker T = \{0\}$ and $\text{im } T = W$
  4. $T$ maps any basis of $V$ to a basis of $W$
  5. $\dim V = \dim W$ and $T$ is injective
  6. $\dim V = \dim W$ and $T$ is surjective
  7. $T$ has an inverse $T^{-1}: W \to V$ that is also linear
Proof:

(1) ⟺ (2): By definition.

(2) ⟺ (3): Injective ⟺ $\ker T = \{0\}$; surjective ⟺ $\text{im } T = W$.

(3) ⟹ (4): Let $\{v_1, \ldots, v_n\}$ be a basis of $V$. Since $T$ is injective, $\{T(v_1), \ldots, T(v_n)\}$ is linearly independent. Since $\text{im } T = W$, these vectors span $W$. Hence they form a basis.

(4) ⟹ (3): If $T$ maps a basis $\{v_1, \ldots, v_n\}$ of $V$ to a basis of $W$, then $\text{im } T$ contains a spanning set of $W$, so $T$ is surjective; and if $T(\sum a_i v_i) = 0$, then $\sum a_i T(v_i) = 0$, so independence of $\{T(v_i)\}$ forces every $a_i = 0$, giving $\ker T = \{0\}$.

(3) ⟺ (5) ⟺ (6): By Rank-Nullity, when $\dim V = \dim W$: injective ⟺ surjective ⟺ bijective.

(2) ⟺ (7): $T$ has an inverse function iff $T$ is bijective. We must show $T^{-1}$ is linear. Writing $w_1 = T(v_1)$ and $w_2 = T(v_2)$:

$$T^{-1}(w_1 + w_2) = T^{-1}(T(v_1) + T(v_2)) = T^{-1}(T(v_1 + v_2)) = v_1 + v_2 = T^{-1}(w_1) + T^{-1}(w_2)$$

Similarly for scalar multiplication.
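When $V = W = \mathbb{R}^n$ and $T$ is given by a matrix, these equivalences can be checked numerically: a square matrix has trivial kernel exactly when it has full rank, i.e. when it is invertible. A minimal numpy sketch (the matrix entries are an arbitrary illustrative choice):

```python
import numpy as np

# A linear map T: R^3 -> R^3, given by an arbitrarily chosen square matrix.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

n = A.shape[0]
rank = np.linalg.matrix_rank(A)

# Trivial kernel <=> full rank <=> surjective <=> invertible (for a square matrix).
print("dim ker T =", n - rank)    # 0
print("dim im  T =", rank)        # 3
print("invertible:", rank == n)   # True

# The columns A @ e_i are the images of the standard basis vectors;
# full rank means they again form a basis of R^3 (condition 4).
```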

3. The Classification Theorem

This is one of the most elegant theorems in linear algebra: dimension is the complete invariant.

Theorem 3.36: Classification by Dimension

Two finite-dimensional vector spaces over the same field $F$ are isomorphic if and only if they have the same dimension.

$$V \cong W \iff \dim V = \dim W$$
Proof:

(⟹) Necessity:

Suppose $T: V \to W$ is an isomorphism. Then $T$ is injective, so $\ker T = \{0\}$. By Rank-Nullity:

$$\dim V = \dim(\ker T) + \dim(\text{im } T) = 0 + \dim W = \dim W$$

(⟸) Sufficiency:

Suppose $\dim V = \dim W = n$. Let $\{v_1, \ldots, v_n\}$ be a basis for $V$ and $\{w_1, \ldots, w_n\}$ be a basis for $W$.

Define $T: V \to W$ by $T(v_i) = w_i$ and extend linearly:

$$T\left(\sum_{i=1}^{n} a_i v_i\right) = \sum_{i=1}^{n} a_i w_i$$

T is injective: If $T(v) = 0$, write $v = \sum a_i v_i$. Then $\sum a_i w_i = 0$. Since $\{w_i\}$ is a basis, all $a_i = 0$, so $v = 0$.

T is surjective: Any $w = \sum b_i w_i \in W$ equals $T(\sum b_i v_i)$.
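A small numerical sketch of the sufficiency argument, assuming $V = W = \mathbb{R}^3$ with two hypothetical bases given by the matrix columns below: the map sending $v_i \mapsto w_i$ has matrix $W_{\text{mat}} V_{\text{mat}}^{-1}$ and is automatically invertible.

```python
import numpy as np

# Two hypothetical bases of R^3, one basis vector per column.
V_basis = np.array([[1.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 1.0]])
W_basis = np.array([[2.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0],
                    [1.0, 0.0, 0.0]])

# T is defined by T(v_i) = w_i and extended linearly; as a matrix,
# T = W_basis @ inv(V_basis), since then T @ V_basis = W_basis column by column.
T = W_basis @ np.linalg.inv(V_basis)

print(np.allclose(T @ V_basis, W_basis))   # True: each basis vector maps to its partner
print(np.linalg.matrix_rank(T) == 3)       # True: T is an isomorphism
```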

Corollary 3.6: Canonical Isomorphism

Every $n$-dimensional vector space $V$ over $F$ is isomorphic to $F^n$:

$$V \cong F^n$$
Remark 3.13: Significance

This corollary is profound: it means we can study any finite-dimensional space using coordinates! Abstract vector spaces (polynomials, matrices, functions) can all be reduced to $F^n$.

4. The Coordinate Isomorphism

The most important isomorphism: the coordinate map with respect to a chosen basis.

Definition 3.10: Coordinate Map

Let $B = \{v_1, \ldots, v_n\}$ be an ordered basis for $V$. The coordinate map with respect to $B$ is:

$$\phi_B: V \to F^n, \quad v = \sum_{i=1}^{n} a_i v_i \mapsto (a_1, \ldots, a_n)$$

The tuple $(a_1, \ldots, a_n)$ is called the coordinate vector of $v$ with respect to $B$.

Theorem 3.37: Coordinate Map is an Isomorphism

For any ordered basis $B$ of $V$, the coordinate map $\phi_B: V \to F^n$ is an isomorphism.

Proof:

Linearity: If $v = \sum a_i v_i$ and $w = \sum b_i v_i$, then:

$$\phi_B(v + w) = \phi_B\left(\sum (a_i + b_i) v_i\right) = (a_1 + b_1, \ldots, a_n + b_n) = \phi_B(v) + \phi_B(w)$$

Similarly for scalar multiplication.

Bijectivity: Every vector has unique coordinates (by definition of basis), so $\phi_B$ is bijective.

Example 3.20: Polynomial Coordinates

Problem: Find the coordinate isomorphism from $P_2$ to $\mathbb{R}^3$.

Solution: Using the standard basis $B = \{1, x, x^2\}$:

$$\phi_B: P_2 \to \mathbb{R}^3, \quad a + bx + cx^2 \mapsto (a, b, c)$$

For example: $\phi_B(3 - 2x + 5x^2) = (3, -2, 5)$
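A brief sketch of this coordinate isomorphism in code, representing a polynomial by its coefficients with respect to $\{1, x, x^2\}$; the function name phi_B and the sample polynomials are illustrative choices, not fixed notation.

```python
import numpy as np

def phi_B(poly):
    """Coordinate map P_2 -> R^3 for the basis {1, x, x^2}.
    A polynomial a + b x + c x^2 is passed as a dict {degree: coefficient}."""
    return np.array([poly.get(0, 0.0), poly.get(1, 0.0), poly.get(2, 0.0)])

p = {0: 3.0, 1: -2.0, 2: 5.0}   # 3 - 2x + 5x^2
q = {0: 1.0, 1: 4.0}            # 1 + 4x

print(phi_B(p))                 # [ 3. -2.  5.]

# Linearity spot-check: phi_B(p + q) == phi_B(p) + phi_B(q)
p_plus_q = {k: p.get(k, 0.0) + q.get(k, 0.0) for k in (0, 1, 2)}
print(np.allclose(phi_B(p_plus_q), phi_B(p) + phi_B(q)))   # True
```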

5. Properties Preserved by Isomorphisms

Isomorphisms preserve all linear algebra properties. This is what makes isomorphic spaces "the same."

Theorem 3.38: Isomorphisms Preserve Linear Independence

Let $T: V \to W$ be an isomorphism. Then:

  • $\{v_1, \ldots, v_k\}$ is linearly independent ⟺ $\{T(v_1), \ldots, T(v_k)\}$ is linearly independent
  • $\{v_1, \ldots, v_k\}$ spans $V$ ⟺ $\{T(v_1), \ldots, T(v_k)\}$ spans $W$
  • $\{v_1, \ldots, v_k\}$ is a basis ⟺ $\{T(v_1), \ldots, T(v_k)\}$ is a basis
Proof:

Suppose $\{v_1, \ldots, v_k\}$ is linearly independent. If $\sum c_i T(v_i) = 0$, then:

$$T\left(\sum c_i v_i\right) = 0 \Rightarrow \sum c_i v_i \in \ker T = \{0\}$$

So $\sum c_i v_i = 0$, and by independence of $\{v_i\}$, all $c_i = 0$.

The converse follows by applying the same argument to $T^{-1}$.

Theorem 3.39: Rank Preservation

If $T: V \to W$ is an isomorphism and $S = \{v_1, \ldots, v_k\} \subseteq V$, then:

$$\text{rank}(S) = \text{rank}(T(S))$$

where $T(S) = \{T(v_1), \ldots, T(v_k)\}$.

Theorem 3.40: Subspace Preservation

If $T: V \to W$ is an isomorphism:

  • $U \subseteq V$ is a subspace ⟺ $T(U) \subseteq W$ is a subspace
  • $\dim U = \dim T(U)$
  • The lattice of subspaces is preserved

6. Examples

Example 3.21: Complex Numbers as R²

Problem: Show $\mathbb{C}$ (as a real vector space) is isomorphic to $\mathbb{R}^2$.

Solution: Both have dimension 2 over $\mathbb{R}$.

Define $T: \mathbb{C} \to \mathbb{R}^2$ by $T(a + bi) = (a, b)$.

  • Linear: $T((a+bi) + (c+di)) = T((a+c) + (b+d)i) = (a+c, b+d) = (a,b) + (c,d)$
  • Injective: $T(a+bi) = (0,0) \Rightarrow a = b = 0$
  • Surjective: $(a,b) = T(a+bi)$ for any $(a,b) \in \mathbb{R}^2$
Example 3.22: Matrices and Polynomials

Problem: Are $M_{2 \times 2}(\mathbb{R})$ and $P_3(\mathbb{R})$ isomorphic?

Solution: Yes! Both have dimension 4.

$M_{2 \times 2}$: basis $\{E_{11}, E_{12}, E_{21}, E_{22}\}$, dimension 4.

$P_3$: basis $\{1, x, x^2, x^3\}$, dimension 4.

One isomorphism:

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} \mapsto a + bx + cx^2 + dx^3$$
Example 3.23: Non-Isomorphic Spaces

Problem: Are $\mathbb{R}^3$ and $\mathbb{R}^4$ isomorphic?

Solution: No! They have different dimensions (3 ≠ 4).

By the Classification Theorem, no isomorphism can exist.

Example 3.24: Differentiation is Not an Isomorphism

Problem: Is differentiation $D: P_3 \to P_2$ an isomorphism?

Solution: No, for two reasons:

  • $\ker D = \{\text{constants}\} \neq \{0\}$, so $D$ is not injective
  • $\dim P_3 = 4 \neq 3 = \dim P_2$

7. The Automorphism Group

The automorphisms of a space form an important algebraic structure: a group.

Theorem 3.41: Aut(V) is a Group

The set $\text{Aut}(V)$ of all automorphisms of $V$ forms a group under composition:

  • Closure: If $S, T \in \text{Aut}(V)$, then $S \circ T \in \text{Aut}(V)$
  • Identity: The identity map $I_V \in \text{Aut}(V)$
  • Inverses: If $T \in \text{Aut}(V)$, then $T^{-1} \in \text{Aut}(V)$
  • Associativity: Composition is associative
Theorem 3.42: GL(V) ≅ GL_n(F)

For an $n$-dimensional space $V$ over $F$:

$$\text{Aut}(V) \cong GL_n(F)$$

where $GL_n(F)$ is the group of invertible $n \times n$ matrices over $F$.

Remark 3.14: Counting Automorphisms

For $V$ of dimension $n$ over a finite field $\mathbb{F}_q$:

$$|GL_n(\mathbb{F}_q)| = (q^n - 1)(q^n - q)(q^n - q^2) \cdots (q^n - q^{n-1})$$
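The formula can be spot-checked by brute force for small cases. The sketch below (the helper name gl_order_formula is hypothetical) enumerates all $2 \times 2$ matrices over $\mathbb{F}_2$ and counts those with nonzero determinant mod 2, matching $(2^2 - 1)(2^2 - 2) = 6$.

```python
from itertools import product

def gl_order_formula(n, q):
    """|GL_n(F_q)| = (q^n - 1)(q^n - q) ... (q^n - q^(n-1))."""
    total = 1
    for k in range(n):
        total *= q**n - q**k
    return total

# Brute force for n = 2, q = 2: count 2x2 matrices over F_2 with det != 0 (mod 2).
count = sum(1 for a, b, c, d in product(range(2), repeat=4)
            if (a * d - b * c) % 2 != 0)

print(gl_order_formula(2, 2))   # 6
print(count)                    # 6, matching the formula
```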

8. Worked Examples

Constructing an Isomorphism

Problem: Construct an explicit isomorphism $T: P_1 \to \mathbb{R}^2$.

Solution:

Choose the basis $B = \{1, x\}$ for $P_1$ and the standard basis for $\mathbb{R}^2$.

Define $T(1) = (1, 0)$ and $T(x) = (0, 1)$.

Then $T(a + bx) = a(1,0) + b(0,1) = (a, b)$.

Verification: $T$ maps a basis to a basis, so $T$ is an isomorphism.

Symmetric Matrices

Problem: To what $\mathbb{R}^k$ is the space of symmetric $2 \times 2$ matrices isomorphic?

Solution:

A symmetric $2 \times 2$ matrix has the form:

$$\begin{pmatrix} a & b \\ b & c \end{pmatrix}$$

This requires 3 independent parameters. Dimension = 3.

Answer: $\text{Sym}_2(\mathbb{R}) \cong \mathbb{R}^3$

Is This an Isomorphism?

Problem: Is $T: \mathbb{R}^2 \to \mathbb{R}^2$ with $T(x, y) = (x + y, x - y)$ an isomorphism?

Solution:

Check injectivity: $T(x,y) = (0,0) \Rightarrow x + y = 0$ and $x - y = 0$.

Adding: $2x = 0 \Rightarrow x = 0$. Subtracting: $2y = 0 \Rightarrow y = 0$.

So $\ker T = \{(0,0)\}$.

Since $\dim \mathbb{R}^2 = \dim \mathbb{R}^2$ and $T$ is injective, $T$ is an isomorphism.
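The same check can be done numerically: in the standard basis $T$ has matrix $\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, and a nonzero determinant (equivalently, a trivial kernel) suffices because domain and codomain have equal dimension. A quick numpy sketch:

```python
import numpy as np

# Matrix of T(x, y) = (x + y, x - y) in the standard basis.
T = np.array([[1.0, 1.0],
              [1.0, -1.0]])

print(np.linalg.det(T))   # -2.0: nonzero, so ker T = {(0, 0)}
print(np.linalg.inv(T))   # the inverse map T^{-1}, which is also linear

# Since dim R^2 = dim R^2 and T is injective, T is an isomorphism.
```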

9. Common Mistakes

Mistake 1: Same Dimension Means Equal

$V \cong W$ does NOT mean $V = W$! Isomorphic spaces have the same algebraic structure but may consist of completely different objects (polynomials vs. matrices).

Mistake 2: Confusing Isomorphism with Equality

An isomorphism $T: V \to W$ is a function, not an equality. There are typically infinitely many different isomorphisms between isomorphic spaces.

Mistake 3: The Coordinate Isomorphism is "Natural"

The isomorphism $V \cong F^n$ depends on the choice of basis! Different bases give different isomorphisms. There is no "canonical" choice.

Mistake 4: Infinite Dimensions

The Classification Theorem requires finite dimensions! In infinite dimensions, two spaces can have "the same" dimension but not be isomorphic.

10. Key Takeaways

Definition

Isomorphism = bijective linear map

Classification

$V \cong W$ ⟺ $\dim V = \dim W$

Canonical Result

Every $n$-dimensional space is isomorphic to $F^n$

Preservation

Isomorphisms preserve all linear algebra properties

11. Additional Practice Problems

Problem 1

Find all vector spaces (up to isomorphism) of dimension 3 over $\mathbb{R}$. How many are there?

Problem 2

Prove that the space of $n \times n$ upper triangular matrices is isomorphic to $\mathbb{R}^{n(n+1)/2}$.

Problem 3

Let $T: V \to V$ be an automorphism. Show that if $T^n = I$ for some $n \geq 1$, then $V = \ker(T - I) \oplus \ker(T^{n-1} + T^{n-2} + \cdots + T + I)$.

Problem 4

Show that the space of solutions to $y'' + y = 0$ is isomorphic to $\mathbb{R}^2$.

Problem 5

If $T: V \to V$ is an automorphism and $U \subseteq V$ is a subspace with $T(U) = U$, prove that $T|_U: U \to U$ is an automorphism of $U$.

Problem 6

Let $V = \mathbb{R}^2$. Count the number of automorphisms $T: V \to V$ such that $T^2 = I$ (involutions).

12. More Worked Examples

Example: Trace-Free Matrices

Problem: Find the dimension of the space of $n \times n$ trace-free matrices. To what $\mathbb{R}^k$ is it isomorphic?

Solution:

A matrix is trace-free if $\text{tr}(A) = a_{11} + a_{22} + \cdots + a_{nn} = 0$.

This is one linear constraint on $n^2$ entries.

So $\dim(\text{trace-free matrices}) = n^2 - 1$.

Answer: Isomorphic to $\mathbb{R}^{n^2 - 1}$.
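The dimension count is Rank-Nullity applied to the trace functional $\text{tr}: M_n(\mathbb{R}) \to \mathbb{R}$. A short numpy sketch (with $n = 3$ as an illustrative choice) views matrices as flattened vectors in $\mathbb{R}^{n^2}$ and the trace as a $1 \times n^2$ row, then reads off the kernel dimension.

```python
import numpy as np

n = 3
# The trace functional on M_n(R), acting on flattened matrices: it is the
# 1 x n^2 row that picks out the diagonal entries a_11, a_22, ..., a_nn.
trace_row = np.eye(n).reshape(1, n * n)

rank = np.linalg.matrix_rank(trace_row)   # 1: the trace is a nonzero functional
print(n * n - rank)                       # 8 = n^2 - 1, the dimension of ker(tr)
```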

Example: Even vs Odd Polynomials

Problem: Show $P_n = E_n \oplus O_n$ where $E_n$ = even polynomials, $O_n$ = odd polynomials.

Solution:

Any polynomial can be written as $p(x) = \frac{p(x) + p(-x)}{2} + \frac{p(x) - p(-x)}{2}$.

The first part is even, the second is odd.

If $p \in E_n \cap O_n$, then $p(x) = p(-x) = -p(x)$, so $p = 0$.

Thus $P_n = E_n \oplus O_n$.
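A sketch of this decomposition on coefficient vectors: substituting $-x$ for $x$ flips the sign of the odd-degree coefficients, so the even and odd parts can be read off directly. The sample polynomial and variable names are illustrative.

```python
import numpy as np

# Coefficients c[k] of p(x) = sum_k c[k] x^k; here p(x) = 1 + 2x + 3x^2 + 4x^3.
c = np.array([1.0, 2.0, 3.0, 4.0])

signs = np.array([(-1.0) ** k for k in range(len(c))])   # coefficients of p(-x)
even_part = (c + signs * c) / 2    # (p(x) + p(-x)) / 2
odd_part = (c - signs * c) / 2     # (p(x) - p(-x)) / 2

print(even_part)                              # [1. 0. 3. 0.]  ->  1 + 3x^2
print(odd_part)                               # [0. 2. 0. 4.]  ->  2x + 4x^3
print(np.allclose(even_part + odd_part, c))   # True: p = even part + odd part
```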

Example: Matrix Spaces

Problem: Is $L(\mathbb{R}^2, \mathbb{R}^3)$ isomorphic to $M_{3 \times 2}(\mathbb{R})$?

Solution:

$$\dim L(\mathbb{R}^2, \mathbb{R}^3) = 2 \cdot 3 = 6$$

$$\dim M_{3 \times 2} = 3 \cdot 2 = 6$$

Same dimension, so yes, they are isomorphic!

In fact, choosing bases gives an explicit isomorphism (the matrix representation).

Example: Direct Sum Decomposition

Problem: Show $M_n \cong \text{Sym}_n \oplus \text{Skew}_n$ where Sym = symmetric, Skew = skew-symmetric.

Solution:

Every matrix: $A = \frac{A + A^T}{2} + \frac{A - A^T}{2}$

The first part is symmetric, the second is skew-symmetric.

If $A \in \text{Sym} \cap \text{Skew}$: $A = A^T = -A \Rightarrow A = 0$.

$\dim \text{Sym}_n = \frac{n(n+1)}{2}$, $\dim \text{Skew}_n = \frac{n(n-1)}{2}$

Check: $\frac{n(n+1)}{2} + \frac{n(n-1)}{2} = n^2 = \dim M_n$
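The same decomposition checked numerically for a sample $3 \times 3$ matrix (the entries are an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

sym = (A + A.T) / 2    # symmetric part
skew = (A - A.T) / 2   # skew-symmetric part

print(np.allclose(A, sym + skew))    # True: A decomposes as claimed
print(np.allclose(sym, sym.T))       # True: symmetric
print(np.allclose(skew, -skew.T))    # True: skew-symmetric

n = A.shape[0]
print(n * (n + 1) // 2 + n * (n - 1) // 2 == n * n)   # True: dimensions add up
```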

13. Isomorphism as Equivalence Relation

Theorem 3.43: Isomorphism is an Equivalence Relation

On the class of vector spaces over a fixed field $F$, the relation $\cong$ (isomorphism) is an equivalence relation:

  • Reflexive: $V \cong V$ (via the identity map)
  • Symmetric: $V \cong W \Rightarrow W \cong V$ (via the inverse)
  • Transitive: $V \cong W$ and $W \cong U \Rightarrow V \cong U$ (via composition)
Proof:

Reflexive: The identity map $I_V: V \to V$ is an isomorphism.

Symmetric: If $T: V \to W$ is an isomorphism, then $T^{-1}: W \to V$ is also an isomorphism.

Transitive: If $T: V \to W$ and $S: W \to U$ are isomorphisms, then $S \circ T: V \to U$ is an isomorphism (a composition of bijections is a bijection, and a composition of linear maps is linear).

Remark 3.15: Equivalence Classes

The equivalence classes under $\cong$ are precisely the sets of spaces with the same dimension. For finite-dimensional spaces, each equivalence class has exactly one representative of the form $F^n$.

14. Applications

Coordinate-Free vs. Coordinate Methods

The isomorphism $V \cong F^n$ allows us to switch between:

  • Coordinate-free: abstract proofs about $V$ (elegant, basis-independent)
  • Coordinate methods: matrix computations in $F^n$ (computational, explicit)

Many proofs work both ways; choose whichever is more convenient!

Differential Equations

The solution space of a linear homogeneous ODE $L(y) = 0$ of order $n$ is isomorphic to $\mathbb{R}^n$ (see the sketch after this list).

  • Dimension = order of the equation
  • Basis = $n$ linearly independent solutions
  • General solution = linear combination of basis solutions
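A quick symbolic check (a sketch using sympy; the symbol names are illustrative) that the map $(c_1, c_2) \mapsto c_1 \cos x + c_2 \sin x$ really lands in the solution space of $y'' + y = 0$, so the coefficient pair serves as a coordinate vector:

```python
import sympy as sp

x, c1, c2 = sp.symbols("x c1 c2")

# {cos x, sin x} is a basis of the solution space of y'' + y = 0.
y = c1 * sp.cos(x) + c2 * sp.sin(x)   # general solution <-> (c1, c2) in R^2

residual = sp.simplify(sp.diff(y, x, 2) + y)
print(residual)   # 0: every choice of (c1, c2) gives a solution
```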
Signal Processing

Fourier transforms establish isomorphisms between:

  • Time-domain signals and frequency-domain representations
  • Convolution in one domain ↔ multiplication in the other
  • This is why Fourier analysis is so powerful!
Quantum Mechanics

In quantum mechanics, different "pictures" are related by isomorphisms:

  • Schrödinger picture ↔ Heisenberg picture
  • Position representation ↔ Momentum representation
  • All give equivalent predictions (physics is invariant under isomorphism)

15. Connections to Other Topics

Change of Basis

If $B$ and $B'$ are two bases for $V$, the coordinate isomorphisms give:

$$\phi_{B'} \circ \phi_B^{-1}: F^n \to F^n$$

This is the "change of basis matrix." It's an automorphism of $F^n$.
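A small numpy sketch of $\phi_{B'} \circ \phi_B^{-1}$ for $\mathbb{R}^2$, assuming two hypothetical bases $B$ and $B'$ given by the columns of the matrices below: $\phi_B^{-1}$ sends coordinates $a$ to the vector $Ba$, and $\phi_{B'}$ sends a vector $v$ to $B'^{-1}v$, so the composite has matrix $B'^{-1}B$.

```python
import numpy as np

# Two hypothetical ordered bases of R^2, one basis vector per column.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B_prime = np.array([[2.0, 0.0],
                    [1.0, 1.0]])

# phi_B^{-1}(a) = B @ a and phi_{B'}(v) = inv(B') @ v, so the change-of-basis
# map phi_{B'} o phi_B^{-1} has matrix inv(B') @ B.
P = np.linalg.inv(B_prime) @ B

a = np.array([3.0, -1.0])   # coordinates of some vector v with respect to B
v = B @ a                   # the underlying vector itself

print(np.allclose(P @ a, np.linalg.inv(B_prime) @ v))   # True: same B'-coordinates
print(np.linalg.matrix_rank(P) == 2)                    # True: an automorphism of F^2
```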

First Isomorphism Theorem

For any linear map $T: V \to W$:

$$V / \ker T \cong \text{im } T$$

This generalizes Rank-Nullity and is fundamental in algebra.

Dual Spaces

For finite-dimensional $V$:

$$V \cong V^* \cong V^{**}$$

The isomorphism $V \cong V^{**}$ is canonical (it doesn't depend on a basis), while $V \cong V^*$ requires choosing a basis.

16. Quick Reference Summary

  • Isomorphism: bijective linear map
  • Automorphism: isomorphism $T: V \to V$
  • Classification: $V \cong W$ ⟺ $\dim V = \dim W$
  • Canonical form: $V \cong F^n$ where $n = \dim V$
  • Coordinate map: $\phi_B: V \to F^n$ is an isomorphism
  • Inverse is linear: $T^{-1}$ is automatically linear
  • $\text{Aut}(V)$: group under composition, $\cong GL_n(F)$

17. Study Tips

Tip 1: Check Dimensions First

Before constructing an isomorphism, always check dimensions. If $\dim V \neq \dim W$, no isomorphism exists—stop there!

Tip 2: Use the Basis-to-Basis Trick

To construct an isomorphism explicitly: pick bases for both spaces and define T to map one basis to the other. This always works when dimensions match!

Tip 3: Equal Dimensions Make Life Easy

When $\dim V = \dim W$, you only need to check ONE of: injective, surjective, kernel = {0}, or rank = dim. They're all equivalent!

Tip 4: Think Structurally

Isomorphic spaces have the same subspace lattice, same dimensions, same linear map behavior. Use this to transfer knowledge between spaces.

18. Challenge Problems

Challenge 1: Counting Subspaces

Problem: Let $V = \mathbb{F}_2^3$ (the 3-dimensional space over the field with 2 elements). How many 1-dimensional subspaces does $V$ have?

Hint: Count non-zero vectors, then group by span.

Challenge 2: Non-Canonical Isomorphism

Problem: Prove that no isomorphism $T: V \to V^*$ is "natural" (i.e., commutes with all automorphisms of $V$).

Hint: Consider what happens when you scale a basis vector.

Challenge 3: Automorphism Order

Problem: Find all possible orders of automorphisms of $\mathbb{R}^2$ (i.e., the smallest $n$ such that $T^n = I$).

Hint: Think about rotations.

19. Geometric Interpretation

Isomorphisms as Coordinate Systems

An isomorphism $\phi_B: V \to F^n$ is essentially choosing a coordinate system for $V$.

  • The basis BB determines "where the axes point"
  • Different bases give different coordinate systems
  • But all describe the same underlying space
Automorphisms as Transformations

Automorphisms of $\mathbb{R}^n$ are invertible linear transformations:

  • Rotations: preserve lengths and angles
  • Reflections: flip across a hyperplane
  • Shears: slide parallel to a direction
  • Scalings: stretch or shrink (non-zero factor)

All invertible combinations of these are automorphisms.

The Big Picture

Isomorphism is the fundamental equivalence relation in linear algebra:

  • Isomorphic spaces are "the same" algebraically
  • Only dimension matters (for finite-dimensional spaces)
  • All properties preserved under isomorphism are "linear algebra properties"
  • Geometric properties (lengths, angles) require additional structure

20. Historical Note

The concept of isomorphism emerged in the 19th century as mathematicians began to understand that mathematical structures with the same "form" should be treated as equivalent.

Emmy Noether (1882-1935) was instrumental in developing the abstract approach to algebra that made isomorphism a central concept. Her work on invariant theory and abstract algebra laid the foundations for modern linear algebra.

The classification theorem—that dimension is the complete invariant—is a beautiful example of how abstract thinking can simplify mathematics. Instead of studying infinitely many different vector spaces, we only need to understand FnF^n for each dimension nn.

What's Next?

With isomorphisms mastered, you're ready for:

  • Dual Spaces: The space of linear functionals V* and the double dual
  • Matrix Representation: How isomorphisms connect linear maps to matrices
  • Change of Basis: How matrices transform under isomorphisms
  • Quotient Spaces: V/U and the First Isomorphism Theorem

The isomorphism concept will reappear constantly—it's the key to understanding when two mathematical objects are "the same" for structural purposes.

Isomorphisms Practice

  1. If $T: V \to W$ is an isomorphism, what is $\dim(\ker T)$? (Easy)
  2. Are $\mathbb{R}^2$ and $\mathbb{C}$ isomorphic as real vector spaces? (Medium)
  3. If $\dim V = \dim W$, is every linear $T: V \to W$ an isomorphism? (Easy)
  4. Are $P_2$ and $\mathbb{R}^3$ isomorphic? (Medium)
  5. If $T$ is an isomorphism, is $T^{-1}$ linear? (Medium)
  6. How many isomorphisms are there from $\mathbb{R}^2$ to $\mathbb{R}^2$? (Medium)
  7. If $V \cong W$ and $W \cong U$, then: (Easy)
  8. Two finite-dimensional spaces over the same field are isomorphic iff: (Medium)
  9. The automorphism group of $\mathbb{R}^n$ is isomorphic to: (Hard)
  10. If $T: V \to W$ is an isomorphism and $U \subseteq V$ is a subspace, then: (Medium)
  11. The coordinate map $\phi_B: V \to F^n$ with respect to basis $B$ is: (Easy)
  12. If $T: V \to V$ is an automorphism and $T^2 = I$, then $T$ is called: (Medium)

Frequently Asked Questions

What does 'isomorphic' really mean?

Isomorphic spaces are 'the same' for all linear algebra purposes. They have identical algebraic structure—same dimension, same types of subspaces, same linear maps. Only the 'names' of elements differ.

Why is V ≅ Fⁿ so important?

It means every finite-dimensional space can be studied using coordinates. Abstract theorems about V translate to concrete matrix computations in Fⁿ. This is the bridge between abstract and computational linear algebra.

Is the isomorphism V ≅ Fⁿ unique?

No! It depends on the choice of basis. Different bases give different isomorphisms. The set of all isomorphisms V → Fⁿ is in bijection with the set of all ordered bases of V.

What about infinite-dimensional spaces?

The situation is more subtle. Two infinite-dimensional spaces may have 'bases' of the same cardinality but behave very differently. Functional analysis handles these cases carefully.

How do isomorphisms relate to change of basis?

A change of basis in V corresponds to composing with an automorphism of V. The matrix of T in a new basis is related to the old matrix by similarity: A' = P⁻¹AP.

What is an automorphism?

An automorphism is an isomorphism from a space to itself (T: V → V bijective linear). The set of all automorphisms of V forms a group under composition, denoted Aut(V) or GL(V).

What properties do isomorphisms preserve?

Isomorphisms preserve: linear independence, spanning, bases, dimension, subspace structure, rank of collections of vectors, solutions to linear equations—essentially all linear algebra properties.

Can an isomorphism change the 'shape' of a space?

As a linear map, yes (think of shearing R²). But algebraically, isomorphic spaces are indistinguishable. Geometry (lengths, angles) requires additional structure (inner products) beyond just linear algebra.

How do I show two spaces are NOT isomorphic?

Show they have different dimensions! For finite-dimensional spaces over the same field, this is the only obstruction. If dimensions differ, no isomorphism can exist.

What's the relationship between isomorphisms and invertible matrices?

When we choose bases for V and W, an isomorphism T: V → W corresponds to an invertible matrix. Conversely, every invertible matrix defines an isomorphism. The groups GL(V) and GL_n(F) are isomorphic!