Direct sums combine spaces, quotients collapse subspaces. These constructions are fundamental for understanding the structure of linear maps.
Direct sums and quotient spaces emerged in the late 19th and early 20th centuries as mathematicians sought to understand the structure of vector spaces. Emmy Noether (1882–1935) was instrumental in developing the abstract algebraic approach, in which quotients became a central tool. The First Isomorphism Theorem, which connects kernels and images via quotients, appeared first in group theory and was later adapted to vector spaces and modules. These constructions are now fundamental in representation theory, homological algebra, and functional analysis.
A direct sum decomposes a vector space into "independent" pieces. Just as integers factor into primes, vector spaces can sometimes be written as direct sums of simpler subspaces. This decomposition is fundamental for understanding linear operators.
Let U and W be subspaces of V. The sum U + W is:
U + W = {u + w : u ∈ U, w ∈ W}
This is the smallest subspace containing both U and W.
We say V is the internal direct sum of subspaces U and W, written:
V = U ⊕ W
if both conditions hold:
1. V = U + W (every vector is a sum), and
2. U ∩ W = {0} (the only common vector is zero).
Direct sum means: V is built from U and W with no overlap. The condition U ∩ W = {0} prevents ambiguity.
V = U ⊕ W if and only if every v ∈ V can be written uniquely as:
v = u + w, with u ∈ U and w ∈ W.
(⇒) Suppose V = U ⊕ W.
Existence: Since V = U + W, every v ∈ V equals some u + w with u ∈ U, w ∈ W.
Uniqueness: If u₁ + w₁ = u₂ + w₂, then:
u₁ − u₂ = w₂ − w₁ ∈ U ∩ W = {0}.
So u₁ = u₂ and w₁ = w₂.
(⇐) If the decomposition is always unique:
Sum: Every v ∈ V is such a sum, so V = U + W.
Intersection: If x ∈ U ∩ W, then x = x + 0 = 0 + x are two decompositions. By uniqueness, x = 0.
In ℝ², let:
U = {(x, 0) : x ∈ ℝ} (the x-axis), W = {(0, y) : y ∈ ℝ} (the y-axis).
Then ℝ² = U ⊕ W because:
Sum: (a, b) = (a, 0) + (0, b), so U + W = ℝ².
Intersection: a vector in both U and W has x = 0 and y = 0, so U ∩ W = {(0, 0)}.
Uniqueness: each (a, b) has only one decomposition as x-component + y-component.
In ℝ², let U = {(x, 0) : x ∈ ℝ} and W = {(t, t) : t ∈ ℝ} (line y = x).
Then U + W = ℝ² since (a, b) = (a − b, 0) + (b, b), but is this direct?
Check intersection: (x, 0) = (t, t) requires t = 0 and x = t, so x = 0.
U ∩ W = {(0, 0)}. Yes, it's a direct sum!
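To make the two conditions concrete, here is a minimal numpy sketch (the variable names and the use of numpy are illustrative, not from the text) that checks both conditions for the example above by comparing ranks:

```python
import numpy as np

# Rows span U (the x-axis) and W (the line y = x) in R^2.
U_basis = np.array([[1.0, 0.0]])
W_basis = np.array([[1.0, 1.0]])

dim_U = np.linalg.matrix_rank(U_basis)
dim_W = np.linalg.matrix_rank(W_basis)
dim_sum = np.linalg.matrix_rank(np.vstack([U_basis, W_basis]))  # dim(U + W)

# dim(U ∩ W) = dim(U) + dim(W) - dim(U + W), by the dimension formula.
dim_intersection = dim_U + dim_W - dim_sum

print(dim_sum)           # 2 -> U + W = R^2
print(dim_intersection)  # 0 -> U ∩ W = {0}, so the sum is direct
```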
A subspace U is a complement of W in V if V = U ⊕ W.
We also say U and W are complementary subspaces.
Every subspace W of a finite-dimensional space V has a complement.
Let {w₁, …, w_k} be a basis of W.
Extend it to a basis {w₁, …, w_k, u₁, …, u_m} of V.
Let U = span{u₁, …, u_m}.
Claim: V = W ⊕ U.
Sum: Any v ∈ V is a combination of all basis vectors, so v = w + u with w ∈ W, u ∈ U.
Intersection: If x ∈ W ∩ U, then x is a combination of the wᵢ and also a combination of the uⱼ. By independence of the full basis, all coefficients are 0, so x = 0.
Warning: Complements are NOT unique! In ℝ², the x-axis has infinitely many complements: any non-horizontal line through the origin.
In ℝ³, let W = {(x, y, 0) : x, y ∈ ℝ} (the xy-plane). Complements of W include: the z-axis, the line span{(0, 1, 1)}, the line span{(1, 1, 1)}, and indeed any line through the origin not lying in W.
If V = U ⊕ W, then:
dim(V) = dim(U) + dim(W).
By the dimension formula for sums:
dim(U + W) = dim(U) + dim(W) − dim(U ∩ W).
Since U ∩ W = {0}, we have dim(U ∩ W) = 0.
Thus dim(V) = dim(U + W) = dim(U) + dim(W).
Given vector spaces V and W over the same field (not necessarily subspaces of a common space), the external direct sum is:
V ⊕ W = {(v, w) : v ∈ V, w ∈ W}
with operations (v₁, w₁) + (v₂, w₂) = (v₁ + v₂, w₁ + w₂) and c(v, w) = (cv, cw).
Internal: Subspaces of the same space, combined to give the whole space.
External: Separate spaces, combined into a larger product space.
When V = U ⊕ W (internal), the map (u, w) ↦ u + w is an isomorphism (external to internal).
ℝ² ⊕ ℝ³ consists of pairs ((a, b), (c, d, e)).
Dimension: 2 + 3 = 5.
This is isomorphic to ℝ⁵ via ((a, b), (c, d, e)) ↦ (a, b, c, d, e).
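As a quick illustration of this identification, the following Python sketch realizes the external direct sum of ℝ² and ℝ³ as concatenation into ℝ⁵ (the helper name is invented for this example):

```python
import numpy as np

def direct_sum_pair(v, w):
    """Identify the pair (v, w) in R^2 ⊕ R^3 with a vector in R^5."""
    return np.concatenate([v, w])

v = np.array([1.0, 2.0])          # element of R^2
w = np.array([3.0, 4.0, 5.0])     # element of R^3
print(direct_sum_pair(v, w))      # [1. 2. 3. 4. 5.]

# Componentwise operations carry over: (v1,w1) + (v2,w2) = (v1+v2, w1+w2).
print(direct_sum_pair(v + v, w + w))  # equals 2 * direct_sum_pair(v, w)
```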
If V = U ⊕ W, the projection onto U along W is:
π: V → V, π(v) = u,
where each v ∈ V is uniquely written as v = u + w with u ∈ U, w ∈ W.
If π is the projection onto U along W where V = U ⊕ W:
π² = π, im(π) = U, ker(π) = W.
In ℝ² = U ⊕ W with U the x-axis and W the y-axis:
Projection onto x-axis: π₁(x, y) = (x, 0).
Projection onto y-axis: π₂(x, y) = (0, y).
Note: π₁ + π₂ = id and π₁ ∘ π₂ = 0.
More generally, V = U₁ ⊕ U₂ ⊕ ⋯ ⊕ U_k means:
1. V = U₁ + U₂ + ⋯ + U_k, and
2. Uᵢ ∩ (U₁ + ⋯ + Uᵢ₋₁ + Uᵢ₊₁ + ⋯ + U_k) = {0} for each i.
Equivalently: every v ∈ V has a unique representation v = u₁ + u₂ + ⋯ + u_k with uᵢ ∈ Uᵢ.
If V = U₁ ⊕ ⋯ ⊕ U_k, then:
dim(V) = dim(U₁) + ⋯ + dim(U_k).
Let T: V → V have distinct eigenvalues λ₁, …, λ_n, each with a 1-dimensional eigenspace, where n = dim(V).
Then:
V = E_{λ₁} ⊕ E_{λ₂} ⊕ ⋯ ⊕ E_{λ_n}.
This decomposition is the key to diagonalization!
If V = U₁ ⊕ ⋯ ⊕ U_k with projections πᵢ onto Uᵢ:
π₁ + π₂ + ⋯ + π_k = id and πᵢπⱼ = 0 for i ≠ j.
If V = U₁ ⊕ ⋯ ⊕ U_k and Bᵢ is a basis for Uᵢ, then:
B₁ ∪ B₂ ∪ ⋯ ∪ B_k
is a basis for V. The bases "add up" nicely.
For M_n(ℝ) (n×n real matrices):
M_n(ℝ) = Sym_n ⊕ Skew_n.
Any matrix uniquely decomposes as:
A = (A + Aᵀ)/2 + (A − Aᵀ)/2,
symmetric part + skew-symmetric part.
Dimension check: n(n+1)/2 + n(n−1)/2 = n². ✓
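The decomposition is easy to verify numerically; here is a short numpy sketch (illustrative, not from the original text) computing both parts of a concrete matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

sym  = (A + A.T) / 2    # symmetric part
skew = (A - A.T) / 2    # skew-symmetric part

assert np.allclose(sym, sym.T)     # lies in Sym_3
assert np.allclose(skew, -skew.T)  # lies in Skew_3
assert np.allclose(A, sym + skew)  # A = sym + skew, uniquely
```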
A quotient space "collapses" a subspace to a point, treating vectors that differ by an element of the subspace as equivalent. This construction is fundamental for the First Isomorphism Theorem and for understanding the structure of linear maps.
Let W be a subspace of V and v ∈ V. The coset of v modulo W is:
v + W = {v + w : w ∈ W}.
This is a "shifted copy" of W passing through v.
If W is a line through the origin in ℝ², cosets are parallel lines:
All points on the same parallel line are "equivalent" in the quotient.
Define v ∼ v′ if and only if v − v′ ∈ W.
This is an equivalence relation: reflexive (v − v = 0 ∈ W), symmetric (if v − v′ ∈ W then v′ − v ∈ W), and transitive (if v − v′ ∈ W and v′ − v″ ∈ W, then v − v″ ∈ W).
The quotient space V/W is the set of all cosets:
V/W = {v + W : v ∈ V}
with vector space operations:
(v + W) + (u + W) = (v + u) + W and c(v + W) = cv + W.
The operations on V/W are well-defined: they don't depend on the choice of coset representatives.
Suppose v + W = v′ + W and u + W = u′ + W.
Then v − v′ ∈ W and u − u′ ∈ W.
Addition: (v + u) − (v′ + u′) = (v − v′) + (u − u′) ∈ W.
So (v + u) + W = (v′ + u′) + W. ✓
Scalar multiplication: cv − cv′ = c(v − v′) ∈ W.
So cv + W = cv′ + W. ✓
V/W is a vector space over F with:
zero vector 0 + W = W and additive inverse −(v + W) = (−v) + W.
If V is finite-dimensional:
dim(V/W) = dim(V) − dim(W).
Let {w₁, …, w_k} be a basis of W.
Extend it to a basis {w₁, …, w_k, v₁, …, v_m} of V.
Claim: {v₁ + W, …, v_m + W} is a basis of V/W.
Spanning: Any v ∈ V is v = Σᵢ aᵢwᵢ + Σⱼ bⱼvⱼ. Then:
v + W = Σⱼ bⱼ(vⱼ + W), since each wᵢ + W = W.
Independence: If Σⱼ bⱼ(vⱼ + W) = W, then Σⱼ bⱼvⱼ ∈ W.
So Σⱼ bⱼvⱼ = Σᵢ aᵢwᵢ for some aᵢ.
By independence of the full basis, all bⱼ = 0.
Thus dim(V/W) = m = dim(V) − dim(W).
Let W = {(x, 0) : x ∈ ℝ} (x-axis) in ℝ².
Cosets: (0, b) + W = horizontal line at height b.
Key insight: Two points are equivalent iff they have the same y-coordinate.
Dimension: dim(ℝ²/W) = 2 − 1 = 1.
The quotient ℝ²/W ≅ ℝ via (x, y) + W ↦ y.
Let W = {(x, y, 0) : x, y ∈ ℝ} ⊆ ℝ³ (xy-plane).
Cosets are horizontal planes at different heights:
(0, 0, c) + W = {(x, y, c) : x, y ∈ ℝ}.
Points are equivalent iff they have the same z-coordinate.
ℝ³/W ≅ ℝ (1-dimensional).
Let W = {(x, 0, 0) : x ∈ ℝ} ⊆ ℝ³ (x-axis).
Cosets are lines parallel to the x-axis:
(0, b, c) + W = {(x, b, c) : x ∈ ℝ}.
Points are equivalent iff they have the same y and z coordinates.
ℝ³/W ≅ ℝ² (2-dimensional).
The canonical projection (or quotient map) is:
π: V → V/W, π(v) = v + W.
This map is linear and surjective, with ker(π) = W.
If T: V → W is a linear map, then:
V/ker(T) ≅ im(T).
The isomorphism is T̄(v + ker(T)) = T(v).
Well-defined: If v + ker(T) = v′ + ker(T), then v − v′ ∈ ker(T).
So T(v − v′) = 0, hence T(v) = T(v′). ✓
Linear: T̄((v + u) + ker(T)) = T(v + u) = T(v) + T(u) = T̄(v + ker(T)) + T̄(u + ker(T)), and similarly for scalars. ✓
Injective: If T̄(v + ker(T)) = 0, then T(v) = 0, so v ∈ ker(T).
Thus v + ker(T) = ker(T), the zero in V/ker(T). ✓
Surjective: For any T(v) ∈ im(T), we have T̄(v + ker(T)) = T(v). ✓
This theorem says: collapsing exactly the vectors that T sends to zero leaves a perfect copy of the image.
Let T: ℝ³ → ℝ² be T(x, y, z) = (x, y).
Kernel: x = 0 and y = 0, so ker(T) = {(0, 0, z) : z ∈ ℝ} (the z-axis).
Image: T is surjective onto ℝ² (check!).
By the First Isomorphism Theorem: ℝ³/ker(T) ≅ ℝ².
Dimension check: 3 − 1 = 2. ✓
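The dimension count can be checked mechanically. A minimal numpy sketch, assuming the matrix of T(x, y, z) = (x, y) in standard bases:

```python
import numpy as np

T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # matrix of T(x, y, z) = (x, y)

rank = np.linalg.matrix_rank(T)   # dim im(T)
nullity = T.shape[1] - rank       # dim ker(T), by rank-nullity

print(rank)     # 2 = dim im(T) = dim(R^3 / ker(T))
print(nullity)  # 1 = dim ker(T) (the z-axis)
```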
If W is a subspace of V and U is any complement of W, then:
V/W ≅ U.
Define φ: U → V/W by φ(u) = u + W.
Linear: Clear from the definition.
Injective: If φ(u) = W, then u ∈ W. But u ∈ U and U ∩ W = {0}, so u = 0.
Surjective: Any coset v + W can be written with v = u + w where u ∈ U, w ∈ W.
Then v + W = u + W = φ(u).
Direct sums and quotients are "dual" constructions: a direct sum builds V up from independent pieces, while a quotient collapses a piece of V down to zero.
Consider P_n(ℝ) modulo W = {p ∈ P_n(ℝ) : p(0) = 0} (polynomials with zero constant term).
dim(P_n(ℝ)) = n + 1 and dim(W) = n, so dim(P_n(ℝ)/W) = 1.
p + W = {q ∈ P_n(ℝ) : q(0) = p(0)}.
Two polynomials are equivalent iff p(0) = q(0).
The quotient P_n(ℝ)/W ≅ ℝ via p + W ↦ p(0).
Let W = {(x, y, z) : x + y + z = 0} (a plane in ℝ³).
dim(W) = 2, so dim(ℝ³/W) = 1.
Points are equivalent iff they differ by a vector in W.
Equivalently: (x, y, z) ∼ (x′, y′, z′) iff x + y + z = x′ + y′ + z′.
So ℝ³/W ≅ ℝ via (x, y, z) + W ↦ x + y + z.
Problem: Is ℝ³ = U ⊕ W where U = {(x, y, 0) : x, y ∈ ℝ} and W = {(0, 0, z) : z ∈ ℝ}?
Solution:
Sum: Any (x, y, z) = (x, y, 0) + (0, 0, z)? Yes!
Intersection: (x, y, 0) = (0, 0, z) requires x = y = 0 and z = 0.
So U ∩ W = {(0, 0, 0)}, giving a direct sum.
Answer: Yes, ℝ³ = U ⊕ W.
Dimension check: dim(U) = 2, dim(W) = 1, 2 + 1 = 3. ✓
Problem: Find a complement of W = span{(1, 1, 1)} in ℝ³.
Solution: Need U with W ⊕ U = ℝ³ and dim(U) = 2.
Extend {(1, 1, 1)} to a basis. Add (1, 0, 0) (independent).
Add (0, 1, 0) (independent from both).
Let U = span{(1, 0, 0), (0, 1, 0)}.
Check: (x, y, z) = a(1, 1, 1) + b(1, 0, 0) + c(0, 1, 0) gives a = z, b = x − z, c = y − z.
So ℝ³ = W ⊕ U. dim(W) + dim(U) = 1 + 2 = 3. ✓
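The basis-extension procedure in this solution can be automated. A sketch under the stated assumptions (the helper find_complement is invented for illustration; it greedily tries standard basis vectors):

```python
import numpy as np

def find_complement(W_basis, dim_V):
    """Extend a basis of W to a basis of R^dim_V; the added vectors span a complement."""
    vectors = [np.asarray(w, dtype=float) for w in W_basis]
    for i in range(dim_V):
        e = np.zeros(dim_V)
        e[i] = 1.0
        if np.linalg.matrix_rank(np.vstack(vectors + [e])) == len(vectors) + 1:
            vectors.append(e)   # e is independent of what we have so far
        if len(vectors) == dim_V:
            break
    return vectors[len(W_basis):]

print(find_complement([[1, 1, 1]], 3))
# [array([1., 0., 0.]), array([0., 1., 0.])] -- matching the solution above
```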
Problem: In ℝ²/W where W = {(x, 0) : x ∈ ℝ}, compute ((1, 2) + W) + ((3, 4) + W).
Solution: ((1, 2) + W) + ((3, 4) + W) = (4, 6) + W = (0, 6) + W.
What does (0, 6) + W represent? All vectors differing from (0, 6) by an element of W: the horizontal line at height 6.
Problem: Let T: P₂(ℝ) → P₁(ℝ) be T(p) = p′ (derivative).
Describe P₂(ℝ)/ker(T).
Solution:
Kernel: p′ = 0 means p is constant. So ker(T) = {constant polynomials} (constants).
Dimension: dim(ker(T)) = 1.
Image: im(T) = P₁(ℝ) (all derivatives of quadratics are linear).
By the First Isomorphism Theorem: P₂(ℝ)/ker(T) ≅ P₁(ℝ).
Interpretation: Two polynomials are equivalent iff they differ by a constant.
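One way to see these dimensions concretely: represent T by its matrix in the bases {1, x, x²} and {1, x}. A minimal numpy sketch (the matrix below is just the standard representation of differentiation):

```python
import numpy as np

# T(1) = 0, T(x) = 1, T(x^2) = 2x, so the columns of D encode 0, 1, 2x.
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

rank = np.linalg.matrix_rank(D)   # dim im(T) = 2 = dim P_1
nullity = D.shape[1] - rank       # dim ker(T) = 1 (the constants)
print(rank, nullity)              # 2 1 -> P_2 / ker(T) ≅ P_1
```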
Problem: Verify ℝ³ = U₁ ⊕ U₂ ⊕ U₃ where U₁, U₂, U₃ are the coordinate axes.
Solution:
(x, y, z) = (x, 0, 0) + (0, y, 0) + (0, 0, z), and this representation is unique.
Dimension check: 1 + 1 + 1 = 3. ✓
Problem: Find the projection matrix onto U = {(x, 0)} along W = {(t, t)} in ℝ².
Solution: Any (x, y) = (a, 0) + (t, t) with (a, 0) ∈ U, (t, t) ∈ W.
Solving: t = y, a = x − y.
Projection onto U: π(x, y) = (x − y, 0), with matrix
P = [[1, −1], [0, 0]].
Check: P² = P. ✓
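The defining properties of an oblique projection are easy to confirm numerically; a small sketch for the matrix just computed:

```python
import numpy as np

P = np.array([[1.0, -1.0],
              [0.0,  0.0]])   # projection onto the x-axis along y = x

assert np.allclose(P @ P, P)         # idempotent: P^2 = P
print(P @ np.array([1.0, 1.0]))      # [0. 0.] -> W = span{(1, 1)} is killed
print(P @ np.array([3.0, 0.0]))      # [3. 0.] -> vectors in U are fixed
```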
Problem: Find a basis for ℝ³/W where W = span{(1, 1, 1)}.
Solution:
dim(ℝ³/W) = 3 − 1 = 2.
Extend W's basis: add (1, 0, 0) and (0, 1, 0).
Basis of quotient: {(1, 0, 0) + W, (0, 1, 0) + W}.
Any (x, y, z) + W has (x, y, z) + W = (x − z)((1, 0, 0) + W) + (y − z)((0, 1, 0) + W).
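The coordinates (x − z, y − z) can be computed directly; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def quotient_coords(v):
    """Coordinates of (x, y, z) + W in the basis {(1,0,0)+W, (0,1,0)+W},
    where W = span{(1, 1, 1)}."""
    x, y, z = v
    return np.array([x - z, y - z])

v = np.array([4.0, 7.0, 2.0])
c = quotient_coords(v)                                  # [2. 5.]
residue = v - c[0] * np.array([1.0, 0.0, 0.0]) - c[1] * np.array([0.0, 1.0, 0.0])
print(residue)  # [2. 2. 2.] lies in W, confirming the coset identity
```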
Problem: Show ℝ³ ≠ U ⊕ W where U = {(x, y, 0) : x, y ∈ ℝ} and W = span{(1, 1, 0)}.
Solution:
Sum: U + W = U, since W ⊆ U.
Actually, U + W = {(x, y, 0)} ≠ ℝ³, so it doesn't even span!
Also U ∩ W = W ≠ {0}.
Conclusion: Not a direct sum (fails both conditions).
Problem: Describe M_n(ℝ)/Sym_n.
Solution:
dim(M_n(ℝ)) = n², dim(Sym_n) = n(n+1)/2.
dim(M_n(ℝ)/Sym_n) = n² − n(n+1)/2 = n(n−1)/2.
Two matrices are equivalent iff they have the same skew-symmetric part.
Isomorphism: M_n(ℝ)/Sym_n ≅ Skew_n via A + Sym_n ↦ (A − Aᵀ)/2.
Problem: Let W ⊆ U ⊆ V be subspaces. What is (V/W)/(U/W)?
Solution: By the Third Isomorphism Theorem:
(V/W)/(U/W) ≅ V/U.
Example: V = ℝ³, U = the xy-plane, W = the x-axis.
(ℝ³/W)/(U/W) ≅ ℝ³/U ≅ ℝ.
Problem: Show that continuous functions decompose as even + odd.
Solution: Define:
f_even(x) = (f(x) + f(−x))/2, f_odd(x) = (f(x) − f(−x))/2.
Any f decomposes as:
f(x) = f_even(x) + f_odd(x).
The first part is even, the second is odd.
Intersection: If f is both even and odd, then f(x) = f(−x) = −f(x), so f = 0.
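This decomposition is simple to check numerically. A sketch using e^x, whose even and odd parts are the familiar cosh and sinh (the helper names are invented):

```python
import numpy as np

def even_part(f):
    return lambda x: (f(x) + f(-x)) / 2

def odd_part(f):
    return lambda x: (f(x) - f(-x)) / 2

f = np.exp                      # e^x = cosh(x) + sinh(x)
xs = np.linspace(-2.0, 2.0, 5)

assert np.allclose(even_part(f)(xs), np.cosh(xs))   # even part
assert np.allclose(odd_part(f)(xs), np.sinh(xs))    # odd part
assert np.allclose(even_part(f)(xs) + odd_part(f)(xs), f(xs))
```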
Problem: In ℝ³, let U = {(x, y, z) : x + y + z = 0} and W = span{(1, 1, −2)}. Is ℝ³ = U ⊕ W?
Solution:
dim(U) = 2, dim(W) = 1.
If ℝ³ = U ⊕ W, then the dimensions would have to satisfy 2 + 1 = 3, which they do, so we must check the intersection.
Check intersection: Solve t(1, 1, −2) ∈ U:
t + t − 2t = 0 for every t, so W ⊆ U and U ∩ W = W ≠ {0}.
Answer: No, ℝ³ ≠ U ⊕ W.
U + W is NOT the same as U ⊕ W! The sum always exists, but a direct sum requires U ∩ W = {0}.
A coset v + W (with v ∉ W) is NOT a subspace! It doesn't contain 0 (unless v ∈ W). Cosets are "shifted" subspaces.
Complements are NOT unique! In ℝ², any line through the origin (except the x-axis itself) complements the x-axis. There are infinitely many choices.
dim(V/W) = dim(V) − dim(W), NOT dim(V)! The quotient "removes" the dimensions of W.
When defining operations on quotients, you must verify the result doesn't depend on the choice of representative. This is essential!
V/W is NOT a subspace of V! It's a different vector space whose elements are cosets. However, V/W ≅ U for any complement U.
Direct sum IS commutative: U ⊕ W = W ⊕ U (both equal U + W). But be careful with external direct sum ordering when identifying V ⊕ W with W ⊕ V.
For V = U₁ ⊕ U₂ ⊕ U₃, it's NOT enough that the Uᵢ intersect pairwise in {0}. You also need U₁ ∩ (U₂ + U₃) = {0}, etc. For example, three distinct lines through the origin in ℝ² intersect pairwise in {0}, yet they cannot form a direct sum.
✓ For Direct Sum V = U ⊕ W: check U + W = V and U ∩ W = {0}.
✓ For Quotient V/W: elements are cosets v + W; dim(V/W) = dim(V) − dim(W).
✓ For Well-Definedness: verify the result is independent of the chosen representatives.
V = U ⊕ W means every vector has a unique representation as u + w.
But complements are not unique: there are typically infinitely many.
V/W treats vectors differing by elements of W as identical.
V/ker(T) ≅ im(T): the quotient "measures" the image.
dim(U ⊕ W) = dim(U) + dim(W), dim(V/W) = dim(V) − dim(W).
V = U ⊕ W gives projections π_U, π_W with π_U + π_W = id.
| Concept | Notation | Key Property |
|---|---|---|
| Sum of subspaces | U + W | Smallest subspace containing both |
| Direct sum | U ⊕ W | Sum with U ∩ W = {0} |
| Coset | v + W | Shifted copy of W |
| Quotient space | V/W | Set of all cosets |
| Canonical projection | π: V → V/W | π(v) = v + W, ker(π) = W |
A diagonalizable operator decomposes as a direct sum of eigenspaces: V = E_{λ₁} ⊕ ⋯ ⊕ E_{λ_k}.
The First Isomorphism Theorem gives: V/ker(T) ≅ im(T), from which rank-nullity follows.
Signals decompose into frequency components: direct sum of eigenspaces of shift operators (Fourier decomposition).
Quotients define cohomology groups: H^k = ker(d_k)/im(d_{k−1}).
Every linear operator on a complex vector space decomposes as:
T = D + N,
where D is diagonalizable, N is nilpotent, and DN = ND.
This requires understanding generalized eigenspaces and their direct sum structure.
Consider continuous functions modulo constants: C(ℝ)/{constant functions}.
Two functions are equivalent iff they differ by a constant.
This is useful when we only care about "shape" of functions, not vertical shift.
Direct sums and quotients are the two fundamental ways to build/decompose vector spaces: direct sums assemble a space from independent pieces, and quotients collapse a subspace to a point.
Together, they form the foundation for structural theorems in linear algebra.
If T has characteristic polynomial p(λ) = (λ − λ₁)^{m₁} ⋯ (λ − λ_k)^{m_k}, then:
V = ker(T − λ₁I)^{m₁} ⊕ ⋯ ⊕ ker(T − λ_kI)^{m_k}.
These are the generalized eigenspaces. This decomposition is key to Jordan normal form.
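A tiny numeric illustration (the matrix is a standard non-diagonalizable example, not from the text): for a single eigenvalue the ordinary eigenspace can be too small, but the generalized eigenspace recovers the full dimension.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])      # eigenvalue 2 with algebraic multiplicity 2
B = A - 2.0 * np.eye(2)

print(2 - np.linalg.matrix_rank(B))      # 1 = dim ker(A - 2I): too small
print(2 - np.linalg.matrix_rank(B @ B))  # 2 = dim ker((A - 2I)^2): all of R^2
```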
Affine subspaces can be viewed as cosets: an affine subspace of V is a coset v + W of a linear subspace W.
Points in projective space ℝPⁿ are lines through the origin in ℝⁿ⁺¹.
Homogeneous coordinates: [x₀ : x₁ : ⋯ : x_n], where [x] = [λx] for every λ ≠ 0.
| Area | Construction | Application |
|---|---|---|
| Spectral Theory | Direct sum of eigenspaces | Diagonalization |
| Linear Maps | V/ker(T) | Rank-nullity theorem |
| Signal Processing | Frequency decomposition | Fourier analysis |
| Matrix Theory | Symmetric ⊕ Skew | Matrix classification |
| Topology | Cohomology = ker/im | Invariants of spaces |
This concludes Part II. We've built the foundation: vector spaces, subspaces, linear independence, basis, dimension, direct sums, and quotients. In Part III: Linear Mappings, we'll study functions between vector spaces that preserve the structure—these connect everything together.
You've learned: sums and direct sums of subspaces, complements, projections, cosets, quotient spaces, and the First Isomorphism Theorem.
Next: Part III - Linear Mappings will show how these structures interact!
Input: Direct sum V = U ⊕ W, vector v ∈ V. Output: the unique u ∈ U and w ∈ W with v = u + w (solve in a combined basis of U and W; see the sketch below).
Input: Subspace W of V. Output: a complement U with V = W ⊕ U (extend a basis of W to a basis of V and take the span of the new vectors).
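A hedged sketch of the first procedure (function name invented; assumes the two basis lists really do give a direct sum, so the combined matrix is invertible):

```python
import numpy as np

def decompose(v, U_basis, W_basis):
    """Split v uniquely as u + w with u in U and w in W, given V = U ⊕ W."""
    B = np.column_stack(U_basis + W_basis)  # columns: basis of U, then of W
    coeffs = np.linalg.solve(B, v)          # unique solution since B is invertible
    k = len(U_basis)
    u = B[:, :k] @ coeffs[:k]
    w = B[:, k:] @ coeffs[k:]
    return u, w

# R^2 = x-axis ⊕ line y = x, as in the earlier example:
u, w = decompose(np.array([3.0, 1.0]),
                 [np.array([1.0, 0.0])],
                 [np.array([1.0, 1.0])])
print(u, w)   # [2. 0.] [1. 1.]
```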
In Part III, you'll learn: linear mappings between vector spaces, their kernels and images, and how they interact with the direct sums and quotients built here.
Think of vector spaces like LEGO structures:
Direct Sum (Building): snap independent pieces together to assemble the whole structure.
Quotient (Collapsing): fuse an entire sub-structure into a single point, keeping only what remains.
| Theorem | Statement |
|---|---|
| Direct Sum Characterization | V = U ⊕ W ⟺ unique decomposition v = u + w |
| Existence of Complements | Every subspace has a complement |
| Direct Sum Dimension | dim(U ⊕ W) = dim(U) + dim(W) |
| Quotient Dimension | dim(V/W) = dim(V) − dim(W) |
| First Isomorphism | V/ker(T) ≅ im(T) |
| Quotient ≅ Complement | V/W ≅ U for any complement U |
V/W 'collapses' W to a point. Elements of V/W are parallel copies of W. If W is a line through origin, V/W contains all lines parallel to W, each as a single 'point' in the quotient.
A complement U is a subspace of V with V = W ⊕ U. The quotient V/W is a different space altogether. But V/W ≅ U for any complement U.
They're essential for the First Isomorphism Theorem: if T: V → W is linear, then V/ker(T) ≅ im(T). Quotients also appear in defining cosets in group theory and factor rings in algebra.
No! A subspace typically has infinitely many complements. For example, in ℝ², any line through the origin other than the x-axis is a complement of the x-axis.
Given spaces V and W (not necessarily subspaces of a common space), V ⊕ W is the set of pairs (v, w) with componentwise operations. Its dimension is dim(V) + dim(W).
Check two things: (1) U + W = V (every vector is a sum), and (2) U ∩ W = {0} (only overlap is zero). Equivalently, show every v has a UNIQUE decomposition as u + w.
If V = U ⊕ W, the projection onto U along W is the map π: V → U defined by π(u + w) = u. It satisfies π² = π (idempotent) and ker(π) = W, im(π) = U.
Operations on V/W are defined using representatives: [v] + [u] = [v + u]. We must verify this doesn't depend on which representatives we choose—different v', u' with [v'] = [v], [u'] = [u] must give [v' + u'] = [v + u].
For finite families of spaces, the direct sum and direct product coincide. For infinite families, they differ: direct sum requires only finitely many non-zero components, while products allow all components to be non-zero.
If a linear operator has enough eigenvectors, V decomposes as a direct sum of eigenspaces: V = E_{λ₁} ⊕ E_{λ₂} ⊕ ... This is the foundation of diagonalization.