MathIsimple
LA-2.5

Direct Sums & Quotient Spaces

Direct sums combine spaces; quotients collapse subspaces. These constructions are fundamental for understanding the structure of linear maps.

Learning Objectives
  • Define internal and external direct sums
  • Prove the characterization of direct sums via unique decomposition
  • Find complementary subspaces and understand their non-uniqueness
  • Construct quotient spaces V/W and verify well-definedness of operations
  • Compute dimension of quotient spaces using the dimension formula
  • Understand cosets geometrically as parallel affine subspaces
  • Apply direct sums to decompose linear maps
  • Use the first isomorphism theorem for vector spaces
  • Recognize projections as maps arising from direct sum decompositions
  • Connect quotients to equivalence relations and partitions
Prerequisites
  • Subspaces and the subspace criterion (LA-2.2)
  • Linear independence and spanning (LA-2.3)
  • Basis and dimension (LA-2.4)
  • Linear combinations and linear span
  • Basic set theory (equivalence relations)

Historical Context

Direct sums and quotient spaces emerged in the late 19th and early 20th centuries as mathematicians sought to understand the structure of vector spaces. Emmy Noether (1882-1935) was instrumental in developing the abstract algebraic approach, in which quotients became a central tool. The First Isomorphism Theorem, which connects kernels and images via quotients, first appeared in group theory and was later adapted to vector spaces and modules. These constructions are now fundamental in representation theory, homological algebra, and functional analysis.

1. Direct Sums

A direct sum decomposes a vector space into "independent" pieces. Just as integers factor into primes, vector spaces can sometimes be written as direct sums of simpler subspaces. This decomposition is fundamental for understanding linear operators.

Definition 1.1: Sum of Subspaces

Let $U, W$ be subspaces of $V$. The sum is:

$U + W = \{u + w : u \in U, w \in W\}$

This is the smallest subspace containing both $U$ and $W$.

Definition 1.2: Internal Direct Sum

We say $V$ is the internal direct sum of subspaces $U$ and $W$, written:

$V = U \oplus W$

if both conditions hold:

  1. $V = U + W$ (the sum spans $V$)
  2. $U \cap W = \{0\}$ (trivial intersection)
Remark 1.1: Intuition

Direct sum means:

  • Completeness: Every vector can be written as a sum of an element of $U$ and an element of $W$
  • Uniqueness: There is only one way to do this decomposition

The condition $U \cap W = \{0\}$ prevents ambiguity.

Theorem 1.1: Direct Sum Characterization

$V = U \oplus W$ if and only if every $v \in V$ can be written uniquely as:

$v = u + w \quad \text{with } u \in U,\ w \in W$
Proof of Theorem 1.1:

(⇒) Suppose $V = U \oplus W$.

Existence: Since $U + W = V$, every $v$ equals some $u + w$.

Uniqueness: If $v = u_1 + w_1 = u_2 + w_2$, then:

$u_1 - u_2 = w_2 - w_1 \in U \cap W = \{0\}$

So $u_1 = u_2$ and $w_1 = w_2$.

(⇐) If the decomposition is always unique:

Sum: Every $v$ is a sum, so $U + W = V$.

Intersection: If $x \in U \cap W$, then $x = x + 0 = 0 + x$ gives two decompositions. By uniqueness, $x = 0$.

Example 1.1: Standard Decomposition of ℝ²

In $\mathbb{R}^2$, let:

  • $U = \{(x, 0) : x \in \mathbb{R}\}$ (x-axis)
  • $W = \{(0, y) : y \in \mathbb{R}\}$ (y-axis)

Then $\mathbb{R}^2 = U \oplus W$ because:

  • Every $(a, b) = (a, 0) + (0, b)$
  • $U \cap W = \{(0, 0)\}$

Uniqueness: $(a, b)$ has only one decomposition as x-component plus y-component.

Example 1.2: Checking Whether a Sum Is Direct

In $\mathbb{R}^2$, let $U = \{(x, 0)\}$ and $W = \{(x, x)\}$ (the line $y = x$).

Then $U + W = \mathbb{R}^2$, but is this direct?

Check the intersection: $(x, 0) = (y, y)$ requires $x = y$ and $0 = y$, so $y = 0$.

Thus $U \cap W = \{(0, 0)\}$. Yes, it's a direct sum!
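
Both conditions are easy to check numerically. Below is a minimal numpy sketch (the helper is_direct_sum is our own illustration, not part of the lesson) that tests both at once via the rank identity $\dim(U + W) = \dim U + \dim W - \dim(U \cap W)$.

```python
# Minimal sketch (assumed helper): decide whether R^n = U (+) W from
# spanning sets, using ranks. dim(U + W) = dim U + dim W - dim(U ∩ W).
import numpy as np

def is_direct_sum(U_basis, W_basis, n):
    U = np.array(U_basis, dtype=float)
    W = np.array(W_basis, dtype=float)
    dim_U = np.linalg.matrix_rank(U)
    dim_W = np.linalg.matrix_rank(W)
    dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))  # dim(U + W)
    spans = dim_sum == n                 # condition 1: U + W = R^n
    trivial = dim_sum == dim_U + dim_W   # condition 2: U ∩ W = {0}
    return spans and trivial

# Example 1.2: U = x-axis, W = the line y = x in R^2
print(is_direct_sum([[1, 0]], [[1, 1]], 2))   # True
```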

Definition 1.3: Complement

A subspace $W$ is a complement of $U$ in $V$ if $V = U \oplus W$.

We also say $U$ and $W$ are complementary subspaces.

Theorem 1.2: Existence of Complements

Every subspace $U$ of a finite-dimensional space $V$ has a complement.

Proof of Theorem 1.2:

Let $\{u_1, \ldots, u_k\}$ be a basis of $U$.

Extend it to a basis $\{u_1, \ldots, u_k, w_1, \ldots, w_m\}$ of $V$.

Let $W = \text{span}\{w_1, \ldots, w_m\}$.

Claim: $V = U \oplus W$.

Sum: Any $v$ is a combination of all basis vectors, so $v = u + w$ with $u \in U$, $w \in W$.

Intersection: If $v \in U \cap W$, then $v$ is both a combination of the $u_i$ and a combination of the $w_j$. By independence of the full basis, all coefficients are 0, so $v = 0$.
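
The proof is constructive, and the construction runs verbatim in code: extend a basis of $U$ by standard basis vectors, keeping each one that stays independent. A sketch (our own helper, assuming $V = \mathbb{R}^n$ and an independent U_basis):

```python
# Sketch of the basis-extension construction from the proof (assumes the
# given U_basis is linearly independent and V = R^n).
import numpy as np

def complement_basis(U_basis, n):
    kept = [np.asarray(u, dtype=float) for u in U_basis]
    comp = []
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        # keep e only if it is independent of everything kept so far
        if np.linalg.matrix_rank(np.vstack(kept + [e])) > len(kept):
            kept.append(e)
            comp.append(e)
    return comp   # spans a complement of U

# A complement of span{(1, 1, 1)} in R^3 (compare Example 3.2)
print(complement_basis([[1, 1, 1]], 3))   # [e1, e2]
```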

Remark 1.2: Non-Uniqueness of Complements

Warning: Complements are NOT unique! In $\mathbb{R}^2$, the x-axis has infinitely many complements: any non-horizontal line through the origin.

Example 1.3: Multiple Complements

In $\mathbb{R}^2$, let $U = \{(x, 0)\}$. Complements of $U$ include:

  • $W_1 = \{(0, y)\}$ (y-axis)
  • $W_2 = \{(t, t)\}$ (line $y = x$)
  • $W_3 = \{(t, 2t)\}$ (line $y = 2x$)
  • Any line through the origin except the x-axis itself
Theorem 1.3: Dimension of Direct Sum

If $V = U \oplus W$, then:

$\dim(V) = \dim(U) + \dim(W)$
Proof of Theorem 1.3:

By the dimension formula for sums:

$\dim(U + W) = \dim(U) + \dim(W) - \dim(U \cap W)$

Since $U \cap W = \{0\}$, we have $\dim(U \cap W) = 0$.

Thus $\dim(V) = \dim(U) + \dim(W)$.

Definition 1.4: External Direct Sum

Given vector spaces $V_1, V_2$ over the same field (not necessarily subspaces of a common space), the external direct sum is:

$V_1 \oplus V_2 = \{(v_1, v_2) : v_1 \in V_1, v_2 \in V_2\}$

with operations $(v_1, v_2) + (u_1, u_2) = (v_1 + u_1, v_2 + u_2)$ and $\alpha(v_1, v_2) = (\alpha v_1, \alpha v_2)$.

Remark 1.3: Internal vs External

Internal: Subspaces of the same space, combined to give the whole space.
External: Separate spaces, combined into a larger product space.

When $V = U \oplus W$ (internal), the map $(u, w) \mapsto u + w$ is an isomorphism from the external direct sum $U \oplus W$ to $V$.

Example 1.4: External Direct Sum

$\mathbb{R}^2 \oplus \mathbb{R}^3$ consists of pairs $((x, y), (a, b, c))$.

Dimension: $\dim(\mathbb{R}^2 \oplus \mathbb{R}^3) = 2 + 3 = 5$.

This is isomorphic to $\mathbb{R}^5$ via $((x, y), (a, b, c)) \mapsto (x, y, a, b, c)$.

Definition 1.5: Projection

If $V = U \oplus W$, the projection onto $U$ along $W$ is:

$\pi_U: V \to U, \quad \pi_U(u + w) = u$

where each $v \in V$ is uniquely written as $v = u + w$.

Theorem 1.4: Properties of Projections

If $\pi$ is the projection onto $U$ along $W$ where $V = U \oplus W$:

  1. $\pi$ is linear
  2. $\pi^2 = \pi$ (idempotent)
  3. $\ker(\pi) = W$
  4. $\text{im}(\pi) = U$
  5. $I - \pi$ is the projection onto $W$ along $U$
Example 1.5: Projection in ℝ²

In $\mathbb{R}^2 = (\text{x-axis}) \oplus (\text{y-axis})$:

Projection onto the x-axis: $\pi_x(a, b) = (a, 0)$

Projection onto the y-axis: $\pi_y(a, b) = (0, b)$

Note: $\pi_x + \pi_y = I$ and $\pi_x \circ \pi_y = 0$.
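
These identities are quick to confirm with matrices. A small numpy check (illustrative only) of Theorem 1.4 in this example:

```python
# The coordinate projections of Example 1.5 as matrices, with sanity checks.
import numpy as np

Px = np.array([[1., 0.], [0., 0.]])   # onto the x-axis along the y-axis
Py = np.array([[0., 0.], [0., 1.]])   # onto the y-axis along the x-axis

assert np.array_equal(Px @ Px, Px)                 # idempotent: pi^2 = pi
assert np.array_equal(Px + Py, np.eye(2))          # pi_x + pi_y = I
assert np.array_equal(Px @ Py, np.zeros((2, 2)))   # pi_x o pi_y = 0
```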

Definition 1.6: Multiple Direct Sums

More generally, $V = U_1 \oplus U_2 \oplus \cdots \oplus U_k$ means:

  1. $V = U_1 + U_2 + \cdots + U_k$
  2. For each $i$: $U_i \cap (U_1 + \cdots + U_{i-1} + U_{i+1} + \cdots + U_k) = \{0\}$

Equivalently: every $v$ has a unique representation $v = u_1 + \cdots + u_k$.

Theorem 1.5: Multiple Direct Sum Dimension

If $V = U_1 \oplus \cdots \oplus U_k$, then:

$\dim(V) = \dim(U_1) + \cdots + \dim(U_k)$
Example 1.6: Direct Sum of Eigenspaces

Let $T: \mathbb{R}^3 \to \mathbb{R}^3$ have eigenvalues $\lambda_1 = 1, \lambda_2 = 2, \lambda_3 = 3$ with 1-dimensional eigenspaces.

Then: $\mathbb{R}^3 = E_1 \oplus E_2 \oplus E_3$

This decomposition is the key to diagonalization!

Corollary 1.1: Projection Decomposition

If $V = U_1 \oplus \cdots \oplus U_k$ with projections $\pi_i$ onto $U_i$:

  • $\pi_1 + \cdots + \pi_k = I$ (identity)
  • $\pi_i \circ \pi_j = 0$ for $i \neq j$
  • $\pi_i^2 = \pi_i$ (idempotent)
Remark 1.4: Direct Sum and Basis

If $V = U_1 \oplus \cdots \oplus U_k$ and $\mathcal{B}_i$ is a basis for $U_i$, then:

$\mathcal{B} = \mathcal{B}_1 \cup \mathcal{B}_2 \cup \cdots \cup \mathcal{B}_k$

is a basis for $V$. The bases "add up" nicely.

Example 1.7: Symmetric and Skew-Symmetric Decomposition

For $M_n(\mathbb{R})$ (the $n \times n$ real matrices):

$M_n = \text{Sym}_n \oplus \text{Skew}_n$

Any matrix $A$ uniquely decomposes as:

$A = \frac{A + A^T}{2} + \frac{A - A^T}{2}$

symmetric part plus skew-symmetric part.

Dimension check: $n^2 = \frac{n(n+1)}{2} + \frac{n(n-1)}{2}$
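
A quick numerical spot check of this decomposition, assuming numpy (the test matrix is arbitrary):

```python
# Example 1.7 numerically: split A into symmetric and skew-symmetric parts
# and confirm they recombine to A.
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # skew-symmetric part

assert np.allclose(S, S.T) and np.allclose(K, -K.T)
assert np.allclose(S + K, A)   # the unique decomposition A = S + K
```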

2. Quotient Spaces

A quotient space "collapses" a subspace to a point, treating vectors that differ by an element of the subspace as equivalent. This construction is fundamental for the First Isomorphism Theorem and for understanding the structure of linear maps.

Definition 2.1: Coset

Let $W$ be a subspace of $V$ and $v \in V$. The coset of $v$ modulo $W$ is:

$v + W = \{v + w : w \in W\}$

This is a "shifted copy" of $W$ passing through $v$.

Remark 2.1: Geometric Intuition

If $W$ is a line through the origin, cosets are parallel lines:

  • The coset $0 + W = W$ is the original line
  • The coset $v + W$ is the line parallel to $W$ passing through $v$

All points on the same parallel line are "equivalent" in the quotient.

Definition 2.2: Equivalence Relation

Define $v \sim u$ if and only if $v - u \in W$.

This is an equivalence relation:

  • Reflexive: $v - v = 0 \in W$
  • Symmetric: $v - u \in W \implies u - v = -(v - u) \in W$
  • Transitive: $v - u \in W$ and $u - w \in W \implies v - w = (v - u) + (u - w) \in W$
Definition 2.3: Quotient Space

The quotient space $V/W$ is the set of all cosets:

$V/W = \{v + W : v \in V\}$

with vector space operations:

$(v + W) + (u + W) = (v + u) + W$
$\alpha(v + W) = (\alpha v) + W$
Theorem 2.1: Well-Definedness

The operations on $V/W$ are well-defined: they don't depend on the choice of coset representatives.

Proof of Theorem 2.1:

Suppose $v + W = v' + W$ and $u + W = u' + W$.

Then $v - v' \in W$ and $u - u' \in W$.

Addition: $(v + u) - (v' + u') = (v - v') + (u - u') \in W$.

So $(v + u) + W = (v' + u') + W$. ✓

Scalar multiplication: $\alpha v - \alpha v' = \alpha(v - v') \in W$.

So $\alpha v + W = \alpha v' + W$. ✓
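
For a concrete feel, here is a tiny numerical illustration (our own setup, not from the lesson) with $W$ the x-axis in $\mathbb{R}^2$: adding cosets through two different choices of representatives lands in the same coset.

```python
# Well-definedness, concretely: W = x-axis in R^2, so v + W = u + W exactly
# when v and u share a y-coordinate.
import numpy as np

def same_coset(v, u):
    return np.isclose(v[1], u[1])   # v - u in W

v, v2 = np.array([1., 2.]), np.array([5., 2.])    # v + W = v2 + W
u, u2 = np.array([0., 3.]), np.array([-7., 3.])   # u + W = u2 + W
assert same_coset(v, v2) and same_coset(u, u2)
assert same_coset(v + u, v2 + u2)   # the sum coset ignores the choice of reps
```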

Theorem 2.2: Quotient is a Vector Space

$V/W$ is a vector space over $F$ with:

  • Zero element: $0 + W = W$
  • Additive inverse: $-(v + W) = (-v) + W$
Theorem 2.3: Quotient Space Dimension

If $V$ is finite-dimensional:

$\dim(V/W) = \dim(V) - \dim(W)$
Proof of Theorem 2.3:

Let $\{w_1, \ldots, w_k\}$ be a basis of $W$.

Extend it to a basis $\{w_1, \ldots, w_k, v_1, \ldots, v_m\}$ of $V$.

Claim: $\{v_1 + W, \ldots, v_m + W\}$ is a basis of $V/W$.

Spanning: Any $v = \sum \alpha_i w_i + \sum \beta_j v_j$. Then:

$v + W = \sum \beta_j v_j + W = \sum \beta_j (v_j + W)$

Independence: If $\sum \beta_j (v_j + W) = W$, then $\sum \beta_j v_j \in W$.

So $\sum \beta_j v_j = \sum \alpha_i w_i$ for some $\alpha_i$.

By independence of the full basis, all $\beta_j = 0$.

Thus $\dim(V/W) = m = n - k = \dim(V) - \dim(W)$.
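
Computationally, the formula needs nothing but a rank. A sketch, assuming numpy (quotient_dim is an illustrative helper name):

```python
# dim(R^n / W) = n - dim(W), with dim(W) read off as a matrix rank.
import numpy as np

def quotient_dim(W_spanning, n):
    return n - np.linalg.matrix_rank(np.array(W_spanning, dtype=float))

# W = span{(1,0,1,0), (0,1,0,1)} in R^4 (compare Example 3.7)
print(quotient_dim([[1, 0, 1, 0], [0, 1, 0, 1]], 4))   # 2
```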

Example 2.1: ℝ² / Line

Let $W = \{(t, 0) : t \in \mathbb{R}\}$ (the x-axis) in $\mathbb{R}^2$.

Cosets: $(a, b) + W = \{(a + t, b) : t \in \mathbb{R}\}$, the horizontal line at height $b$.

Key insight: Two points are equivalent iff they have the same y-coordinate.

Dimension: $\dim(\mathbb{R}^2/W) = 2 - 1 = 1$.

The quotient $\mathbb{R}^2/W \cong \mathbb{R}$ via $(a, b) + W \mapsto b$.

Example 2.2: ℝ³ / Plane

Let $W = \{(x, y, 0) : x, y \in \mathbb{R}\}$ (the xy-plane).

Cosets are horizontal planes at different heights:

$(a, b, c) + W = \{(x, y, c) : x, y \in \mathbb{R}\}$

Points are equivalent iff they have the same z-coordinate.

$\mathbb{R}^3/W \cong \mathbb{R}$ (1-dimensional).

Example 2.3: ℝ³ / Line

Let $W = \text{span}\{(1, 0, 0)\}$ (the x-axis).

Cosets are lines parallel to the x-axis:

$(a, b, c) + W = \{(t, b, c) : t \in \mathbb{R}\}$

Points are equivalent iff they have the same y and z coordinates.

$\mathbb{R}^3/W \cong \mathbb{R}^2$ (2-dimensional).

Definition 2.4: Canonical Projection

The canonical projection (or quotient map) is:

$\pi: V \to V/W, \quad \pi(v) = v + W$

This map is linear and surjective, with $\ker(\pi) = W$.

Theorem 2.4: First Isomorphism Theorem

If $T: V \to U$ is a linear map, then:

$V / \ker(T) \cong \text{im}(T)$

The isomorphism is $\bar{T}: v + \ker(T) \mapsto T(v)$.

Proof of Theorem 2.4:

Well-defined: If $v + \ker(T) = v' + \ker(T)$, then $v - v' \in \ker(T)$.

So $T(v) - T(v') = T(v - v') = 0$, hence $T(v) = T(v')$. ✓

Linear: $\bar{T}((v + W) + (u + W)) = \bar{T}((v + u) + W) = T(v + u) = T(v) + T(u)$, where $W = \ker(T)$. ✓

Injective: If $\bar{T}(v + W) = 0$, then $T(v) = 0$, so $v \in \ker(T)$.

Thus $v + W = W$, the zero of $V/\ker(T)$. ✓

Surjective: For any $T(v) \in \text{im}(T)$, we have $\bar{T}(v + W) = T(v)$. ✓

Remark 2.2: Significance of First Isomorphism Theorem

This theorem says:

  • Every linear map "factors" through a quotient by its kernel
  • The dimension of the image equals dim(V) - dim(ker T) (rank-nullity!)
  • The quotient "measures" what's left after collapsing the kernel
Example 2.4: First Isomorphism Theorem Application

Let $T: \mathbb{R}^3 \to \mathbb{R}^2$ be $T(x, y, z) = (x + y, y + z)$.

Kernel: $x + y = 0$ and $y + z = 0$, so $\ker(T) = \{t(-1, 1, -1) : t \in \mathbb{R}\}$.

Image: $T$ is surjective onto $\mathbb{R}^2$: for instance $T(1, 0, 0) = (1, 0)$ and $T(0, 0, 1) = (0, 1)$ already span.

By the First Isomorphism Theorem: $\mathbb{R}^3/\ker(T) \cong \mathbb{R}^2$.

Dimension check: $3 - 1 = 2$. ✓
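
The same bookkeeping in code: writing $T$ as a matrix, its rank is $\dim(\text{im } T)$ and $3 - \text{rank}$ is $\dim(\ker T)$ (a numpy sketch of this example):

```python
# Example 2.4: T(x, y, z) = (x + y, y + z) as a matrix.
import numpy as np

T = np.array([[1, 1, 0],
              [0, 1, 1]])
rank = np.linalg.matrix_rank(T)   # dim(im T) = 2, so T is onto R^2
nullity = 3 - rank                # dim(ker T) = 1
print(rank, nullity)   # 2 1, so dim(R^3 / ker T) = 3 - 1 = 2 = dim(im T)
```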

Theorem 2.5: Quotient and Complement

If $W$ is a subspace of $V$ and $U$ is any complement of $W$, then:

$V/W \cong U$
Proof of Theorem 2.5:

Define $\phi: U \to V/W$ by $\phi(u) = u + W$.

Linear: Clear from the definition.

Injective: If $u + W = W$, then $u \in W$. But $u \in U$ and $U \cap W = \{0\}$, so $u = 0$.

Surjective: Any coset $v + W$ can be written with $v = u + w$ where $u \in U$, $w \in W$.

Then $v + W = u + w + W = u + W = \phi(u)$.

Remark 2.3: Connection: Direct Sums and Quotients

Direct sums and quotients are "dual" constructions:

  • Direct sum: If $V = U \oplus W$, then $V/W \cong U$
  • Quotient gives a projection: The canonical projection $\pi: V \to V/W$ has kernel $W$
  • Complement gives a section: The inclusion $U \hookrightarrow V$ composed with $\pi$ gives an isomorphism $U \to V/W$
Example 2.5: Quotient in Polynomial Space

Consider $P_3(\mathbb{R})$ modulo $W = \{p : p(0) = 0\}$ (polynomials with zero constant term).

$W = \text{span}\{x, x^2, x^3\}$, so $\dim(W) = 3$.

$\dim(P_3/W) = 4 - 3 = 1$.

Two polynomials $p, q$ are equivalent iff $p(0) = q(0)$.

The quotient $P_3/W \cong \mathbb{R}$ via $p + W \mapsto p(0)$.

Example 2.6: Quotient by Solution Space

Let $W = \{(x, y, z) : x + y + z = 0\}$ (a plane in $\mathbb{R}^3$).

$\dim(W) = 2$, so $\dim(\mathbb{R}^3/W) = 1$.

Points are equivalent iff they differ by a vector in $W$.

Equivalently: $(a, b, c) \sim (a', b', c')$ iff $a + b + c = a' + b' + c'$.

So $\mathbb{R}^3/W \cong \mathbb{R}$ via $(a, b, c) + W \mapsto a + b + c$.

3. Worked Examples

Example 3.1: Verifying a Direct Sum

Problem: Is $\mathbb{R}^3 = U \oplus W$ where $U = \{(x, x, 0)\}$ and $W = \{(0, y, z)\}$?

Solution:

Sum: Any $(a, b, c) = (a, a, 0) + (0, b - a, c)$. Yes!

Intersection: $(x, x, 0) = (0, y, z)$ requires $x = 0$, $x = y$, $0 = z$.

So $x = y = z = 0$, giving $U \cap W = \{0\}$.

Answer: Yes, $\mathbb{R}^3 = U \oplus W$.

Dimension check: $\dim(U) = 1$, $\dim(W) = 2$, $1 + 2 = 3$. ✓

Example 3.2: Finding a Complement

Problem: Find a complement of $U = \text{span}\{(1, 1, 1)\}$ in $\mathbb{R}^3$.

Solution: We need $W$ with $\dim(W) = 3 - 1 = 2$ and $U \cap W = \{0\}$.

Extend $(1, 1, 1)$ to a basis. Add $(1, 0, 0)$ (independent).

Add $(0, 1, 0)$ (independent of both).

Let $W = \text{span}\{(1, 0, 0), (0, 1, 0)\}$.

Check: $a(1, 1, 1) = b(1, 0, 0) + c(0, 1, 0)$ gives $a = b$, $a = c$, $a = 0$.

So $a = b = c = 0$ and $U \cap W = \{0\}$. ✓

Example 3.3: Computing in Quotient Space

Problem: In $\mathbb{R}^3/W$ where $W = \{(t, t, 0)\}$, compute $[(1, 2, 3)] + [(0, 1, 1)]$.

Solution:

$[(1, 2, 3)] + [(0, 1, 1)] = [(1, 2, 3) + (0, 1, 1)] = [(1, 3, 4)]$

What does $[(1, 3, 4)]$ represent? All vectors differing from $(1, 3, 4)$ by an element of $W$:

$[(1, 3, 4)] = \{(1 + t, 3 + t, 4) : t \in \mathbb{R}\}$
Example 3.4: Quotient by Kernel

Problem: Let $T: P_2 \to P_1$ be $T(p) = p'(x)$ (the derivative).

Describe $P_2/\ker(T)$.

Solution:

Kernel: $p'(x) = 0$ means $p$ is constant. So $\ker(T) = P_0$ (the constants).

Dimension: $\dim(P_2/\ker(T)) = 3 - 1 = 2$.

Image: $\text{im}(T) = P_1$ (every linear polynomial is the derivative of some quadratic).

By the First Isomorphism Theorem: $P_2/P_0 \cong P_1$.

Interpretation: Two polynomials are equivalent iff they differ by a constant.

Example 3.5: Multiple Direct Sums

Problem: Verify $\mathbb{R}^3 = X \oplus Y \oplus Z$ where $X$, $Y$, $Z$ are the coordinate axes.

Solution:

  • $X = \{(a, 0, 0)\}$, $Y = \{(0, b, 0)\}$, $Z = \{(0, 0, c)\}$
  • Sum: $(a, b, c) = (a, 0, 0) + (0, b, 0) + (0, 0, c)$
  • Pairwise intersections: $X \cap Y = X \cap Z = Y \cap Z = \{0\}$
  • Also: $X \cap (Y + Z) = \{0\}$, and similarly for $Y$ and $Z$

Dimension check: $1 + 1 + 1 = 3$. ✓

Example 3.6: Projection Matrix

Problem: Find the matrix of the projection onto $U = \text{span}\{(1, 1)\}$ along $W = \text{span}\{(1, -1)\}$.

Solution: Any $(a, b) = \alpha(1, 1) + \beta(1, -1)$.

Solving: $\alpha = \frac{a + b}{2}$, $\beta = \frac{a - b}{2}$.

Projection onto $U$: $\pi(a, b) = \alpha(1, 1) = \frac{a + b}{2}(1, 1)$.

$P = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$

Check: $P^2 = P$. ✓
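
The same matrix drops out of a change of basis, a recipe that generalizes to any direct sum of $\mathbb{R}^n$ (our own framing of the computation above, not part of the stated solution): put the $U$-basis and $W$-basis as columns of $B$, keep the $U$-coordinates, and conjugate back.

```python
# P = B diag(1, 0) B^{-1}: keep the U-coordinate, kill the W-coordinate.
import numpy as np

B = np.array([[1., 1.],
              [1., -1.]])        # columns: basis (1,1) of U, (1,-1) of W
D = np.diag([1., 0.])            # identity on U, zero on W
P = B @ D @ np.linalg.inv(B)

print(P)                         # [[0.5 0.5], [0.5 0.5]]
assert np.allclose(P @ P, P)     # idempotent, as Theorem 1.4 promises
```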

Example 3.7: Basis of Quotient Space

Problem: Find a basis for $\mathbb{R}^4/W$ where $W = \text{span}\{(1, 0, 1, 0), (0, 1, 0, 1)\}$.

Solution:

$\dim(\mathbb{R}^4/W) = 4 - 2 = 2$.

Extend $W$'s basis: add $e_1 = (1, 0, 0, 0)$ and $e_2 = (0, 1, 0, 0)$.

Basis of the quotient: $\{e_1 + W, e_2 + W\}$.

Since $e_3 + W = -(e_1 + W)$ and $e_4 + W = -(e_2 + W)$ (because $e_1 + e_3$ and $e_2 + e_4$ lie in $W$), any $v = ae_1 + be_2 + ce_3 + de_4$ has $v + W = (a - c)(e_1 + W) + (b - d)(e_2 + W)$.

Example 3.8: Non-Direct Sum

Problem: Is $\mathbb{R}^2 = U \oplus W$ where $U = W = \{(x, x)\}$?

Solution:

Sum: $U + W = \{(x, x) + (y, y)\} = \{(z, z)\} = U \neq \mathbb{R}^2$.

So $U + W = U$: the sum doesn't even span!

Also $U \cap W = U \neq \{0\}$.

Conclusion: Not a direct sum (both conditions fail).

Example 3.9: Quotient of Matrices

Problem: Describe $M_2(\mathbb{R}) / \text{Sym}_2$.

Solution:

$\dim(M_2) = 4$, $\dim(\text{Sym}_2) = 3$.

$\dim(M_2/\text{Sym}_2) = 4 - 3 = 1$.

Two matrices are equivalent iff they have the same skew-symmetric part.

Isomorphism: $M_2/\text{Sym}_2 \cong \text{Skew}_2 \cong \mathbb{R}$.

Example 3.10: Double Quotient

Problem: Let $W_1 \subseteq W_2 \subseteq V$. What is $(V/W_1)/(W_2/W_1)$?

Solution: By the Third Isomorphism Theorem:

$(V/W_1)/(W_2/W_1) \cong V/W_2$

Example: $V = \mathbb{R}^4$, $W_1 = \text{span}\{e_1\}$, $W_2 = \text{span}\{e_1, e_2\}$.

Identifying $V/W_1 \cong \mathbb{R}^3$ and $W_2/W_1 \cong \mathbb{R}$, the theorem reads $\mathbb{R}^3/\mathbb{R} \cong \mathbb{R}^2$, matching $V/W_2 \cong \mathbb{R}^2$.

Example 3.11: Direct Sum in Function Spaces

Problem: Show that continuous functions decompose as even + odd.

Solution: Define:

  • $E = \{f : f(-x) = f(x)\}$ (even functions)
  • $O = \{f : f(-x) = -f(x)\}$ (odd functions)

Any $f$ decomposes as:

$f(x) = \frac{f(x) + f(-x)}{2} + \frac{f(x) - f(-x)}{2}$

The first part is even, the second is odd.

Intersection: If $f$ is both even and odd, then $f(x) = f(-x) = -f(x)$, so $f = 0$.
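
A numerical spot check, assuming numpy; $f(x) = e^x$ is an arbitrary test function, whose even and odd parts are $\cosh$ and $\sinh$:

```python
# Even/odd decomposition of Example 3.11, sampled at a few points.
import numpy as np

f = lambda x: np.exp(x)
even = lambda x: (f(x) + f(-x)) / 2   # = cosh(x)
odd = lambda x: (f(x) - f(-x)) / 2    # = sinh(x)

x = np.linspace(-1.0, 1.0, 5)
assert np.allclose(even(x), even(-x))        # even part is even
assert np.allclose(odd(x), -odd(-x))         # odd part is odd
assert np.allclose(even(x) + odd(x), f(x))   # they recombine to f
```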

Example 3.12: Using Dimension to Prove Direct Sum

Problem: In $\mathbb{R}^4$, let $U = \text{span}\{(1,0,1,0), (0,1,0,1)\}$ and $W = \text{span}\{(1,1,0,0), (0,0,1,1)\}$. Is $\mathbb{R}^4 = U \oplus W$?

Solution:

$\dim(U) = 2$, $\dim(W) = 2$.

If $U \cap W = \{0\}$, then $\dim(U + W) = 2 + 2 = 4 = \dim(\mathbb{R}^4)$.

Check the intersection: Solve $a(1,0,1,0) + b(0,1,0,1) = c(1,1,0,0) + d(0,0,1,1)$:

  • $a = c$, $b = c$, $a = d$, $b = d$
  • So $a = b = c = d$. Check: $a(1,0,1,0) + a(0,1,0,1) = (a,a,a,a)$ and $a(1,1,0,0) + a(0,0,1,1) = (a,a,a,a)$

$U \cap W = \text{span}\{(1,1,1,1)\} \neq \{0\}$.

Answer: No, $\mathbb{R}^4 \neq U \oplus W$.
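
The intersection computation automates nicely: a vector lies in $U \cap W$ exactly when some combination of the $U$-basis equals a combination of the $W$-basis, i.e. when the stacked matrix below has a nontrivial null space. This sketch assumes SciPy is available for null_space.

```python
# Example 3.12 by machine: solve a*u1 + b*u2 - c*w1 - d*w2 = 0.
import numpy as np
from scipy.linalg import null_space

U = np.array([[1, 0, 1, 0], [0, 1, 0, 1]], dtype=float).T   # columns span U
W = np.array([[1, 1, 0, 0], [0, 0, 1, 1]], dtype=float).T   # columns span W
N = null_space(np.hstack([U, -W]))   # each column is a solution (a, b, c, d)

v = U @ N[:2, 0]      # the corresponding vector in U ∩ W
print(v / v[0])       # [1. 1. 1. 1.], so U ∩ W = span{(1,1,1,1)}
```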

4. Common Mistakes

Mistake 1: Thinking sum equals direct sum

$U + W$ is NOT the same as $U \oplus W$! The sum always exists, but a direct sum requires $U \cap W = \{0\}$.

Mistake 2: Confusing cosets with subspaces

A coset $v + W$ is NOT a subspace unless $v \in W$ (in which case $v + W = W$). It doesn't contain $0$ when $v \notin W$. Cosets are "shifted" subspaces.

Mistake 3: Assuming complement is unique

Complements are NOT unique! In $\mathbb{R}^2$, any line through the origin other than the x-axis is a complement of the x-axis. There are infinitely many choices.

Mistake 4: Wrong quotient dimension

$\dim(V/W) = \dim(V) - \dim(W)$, NOT $\dim(W)$! The quotient "removes" the dimensions of $W$.

Mistake 5: Forgetting well-definedness check

When defining operations on quotients, you must verify the result doesn't depend on the choice of representative. This is essential!

Mistake 6: Confusing V/W with a complement

$V/W$ is NOT a subspace of $V$! It's a different vector space whose elements are cosets. However, $V/W \cong U$ for any complement $U$ of $W$.

Mistake 7: Assuming U ⊕ W implies W ⊕ U

The direct sum IS commutative: $U \oplus W = W \oplus U$ (both equal $V$). But be careful with external direct sum ordering when identifying with $F^n$.

Mistake 8: Checking only pairwise intersections for multiple sums

For $U \oplus V \oplus W$, it's NOT enough that $U \cap V = U \cap W = V \cap W = \{0\}$. You need each summand to intersect the sum of the others trivially: $U \cap (V + W) = \{0\}$, and similarly for $V$ and $W$.

✓ Verification Checklist

For Direct Sum V = U ⊕ W:

  • Check U + W = V (every v = u + w)
  • Check U ∩ W = {0} (only overlap is zero)
  • Use dimension: dim(U) + dim(W) = dim(V)

For Quotient V/W:

  • Identify W (what gets "collapsed")
  • Understand cosets as parallel copies of W
  • Check dim(V/W) = dim(V) - dim(W)

For Well-Definedness:

  • If [v] = [v'], does the result depend only on [v]?
  • Show different representatives give same answer

5. Key Takeaways

Direct Sum = Unique Decomposition

$V = U \oplus W$ means every vector has a unique representation as $u + w$.

Every Subspace Has Complements

But complements are not unique—there are typically infinitely many.

Quotient = Collapsing W

$V/W$ treats vectors differing by an element of $W$ as identical.

First Isomorphism Theorem

$V/\ker(T) \cong \text{im}(T)$: the quotient "measures" the image.

Dimension Formulas

$\dim(U \oplus W) = \dim U + \dim W$, and $\dim(V/W) = \dim V - \dim W$.

Projections from Direct Sums

$V = U \oplus W$ gives projections $\pi_U$, $\pi_W$ with $\pi_U + \pi_W = I$.

Quick Reference

| Concept | Notation | Key Property |
|---|---|---|
| Sum of subspaces | $U + W$ | Smallest subspace containing both |
| Direct sum | $U \oplus W$ | Sum with $U \cap W = \{0\}$ |
| Coset | $v + W$ | Shifted copy of $W$ |
| Quotient space | $V/W$ | Set of all cosets |
| Canonical projection | $\pi: V \to V/W$ | $\pi(v) = v + W$, $\ker(\pi) = W$ |

6. Applications

Eigenspace Decomposition

A diagonalizable operator decomposes $V$ as a direct sum of eigenspaces: $V = E_{\lambda_1} \oplus \cdots \oplus E_{\lambda_k}$.

Rank-Nullity Theorem

The First Isomorphism Theorem gives: $\dim V = \dim(\ker T) + \dim(\text{im } T)$.

Signal Processing

Signals decompose into frequency components: direct sum of eigenspaces of shift operators (Fourier decomposition).

Homological Algebra

Quotients define cohomology groups: $H^n = \ker(d^n)/\text{im}(d^{n-1})$.

Example 6.1: Jordan Decomposition Preview

Every linear operator on a finite-dimensional complex vector space decomposes as:

$T = D + N$

where $D$ is diagonalizable, $N$ is nilpotent, and $DN = ND$.

This requires understanding generalized eigenspaces and their direct sum structure.

Example 6.2: Quotient in Function Spaces

Consider the continuous functions $C[0, 1]$ modulo the constants:

$C[0, 1] / \mathbb{R}$

Two functions are equivalent iff they differ by a constant.

This is useful when we only care about "shape" of functions, not vertical shift.

Remark 6.1: Why These Constructions Matter

Direct sums and quotients are the two fundamental ways to build/decompose vector spaces:

  • Direct sum: Combines independent pieces (like Cartesian product)
  • Quotient: Collapses redundancy (like modular arithmetic)

Together, they form the foundation for structural theorems in linear algebra.

Example 6.3: Primary Decomposition

If $T$ has characteristic polynomial $p(x) = (x - \lambda_1)^{n_1} \cdots (x - \lambda_k)^{n_k}$, then:

$V = \ker(T - \lambda_1 I)^{n_1} \oplus \cdots \oplus \ker(T - \lambda_k I)^{n_k}$

Example 6.4: Quotient in Projective Geometry

Projective space can be viewed as a quotient:

$\mathbb{P}^n \cong (\mathbb{R}^{n+1} \setminus \{0\}) / \text{(scaling)}$

Points in projective space are lines through the origin in $\mathbb{R}^{n+1}$.

Homogeneous coordinates: $[x_0 : x_1 : \cdots : x_n]$ where $[\lambda x_0 : \cdots : \lambda x_n] = [x_0 : \cdots : x_n]$ for $\lambda \neq 0$.

Applications Summary Table

| Area | Construction | Application |
|---|---|---|
| Spectral theory | Direct sum of eigenspaces | Diagonalization |
| Linear maps | $V/\ker(T)$ | Rank-nullity theorem |
| Signal processing | Frequency decomposition | Fourier analysis |
| Matrix theory | Symmetric ⊕ Skew | Matrix classification |
| Topology | Cohomology = ker/im | Invariants of spaces |

7. Chapter Summary

Direct Sums

  • $V = U \oplus W$ ⟺ $V = U + W$ and $U \cap W = \{0\}$
  • Equivalent to unique decomposition: $v = u + w$
  • Dimension adds: $\dim(U \oplus W) = \dim U + \dim W$
  • Every subspace has complements (not unique)
  • Direct sums give projections: $\pi_U + \pi_W = I$

Quotient Spaces

  • $V/W$ = cosets $v + W$, where $v \sim u$ iff $v - u \in W$
  • Operations are well-defined: $[v] + [u] = [v + u]$, $\alpha[v] = [\alpha v]$
  • Dimension subtracts: $\dim(V/W) = \dim V - \dim W$
  • First Isomorphism: $V/\ker T \cong \text{im } T$

Key Connections

  • $V/W \cong U$ for any complement $U$ of $W$
  • Rank-nullity follows from First Isomorphism Theorem
  • Projections ↔ direct sum decompositions
Remark 7.1: Completing Part II: Vector Spaces

This concludes Part II. We've built the foundation: vector spaces, subspaces, linear independence, basis, dimension, direct sums, and quotients. In Part III: Linear Mappings, we'll study functions between vector spaces that preserve the structure—these connect everything together.

Part II: Vector Spaces Complete!

You've learned:

  1. Vector Space Definition: Axioms and fundamental examples
  2. Subspaces: Subspace criterion, span, sum and intersection
  3. Linear Independence: Definition, tests, Steinitz exchange
  4. Basis & Dimension: Coordinates, isomorphism to Fⁿ
  5. Direct Sums & Quotients: Decomposition and collapsing

Next: Part III - Linear Mappings will show how these structures interact!

8. Quick Reference

Checking Direct Sum

  1. Verify $U + W = V$ (the sum spans)
  2. Verify $U \cap W = \{0\}$
  3. OR: check $U + W = V$ and $\dim(U) + \dim(W) = \dim(V)$

Finding a Complement

  1. Take basis of W
  2. Extend to basis of V
  3. Span of new vectors = complement

Quotient Operations

  • $[v] + [u] = [v + u]$
  • $\alpha[v] = [\alpha v]$
  • Zero: $[0] = W$
  • $[v] = [u]$ iff $v - u \in W$

Key Dimension Formulas

  • $\dim(U \oplus W) = \dim U + \dim W$
  • $\dim(V/W) = \dim V - \dim W$
  • $\dim(U + W) = \dim U + \dim W - \dim(U \cap W)$

Algorithm: Computing Projection

Input: Direct sum $V = U \oplus W$, vector $v \in V$

  1. Choose bases $\mathcal{B}_U$ and $\mathcal{B}_W$
  2. Express $v$ in the combined basis: $v = u + w$
  3. Output: $\pi_U(v) = u$, $\pi_W(v) = w$
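
A direct transcription of this algorithm (a sketch; the basis vectors are assumed independent and to jointly span $\mathbb{R}^n$):

```python
# Projection via coordinates in the combined basis, as in steps 1-3 above.
import numpy as np

def project(v, U_basis, W_basis):
    B = np.column_stack(U_basis + W_basis)   # combined basis as columns
    coords = np.linalg.solve(B, v)           # coordinates of v in that basis
    k = len(U_basis)
    u = B[:, :k] @ coords[:k]                # pi_U(v)
    w = B[:, k:] @ coords[k:]                # pi_W(v)
    return u, w

# Example 3.6 again: v = (3, 1), U = span{(1,1)}, W = span{(1,-1)}
u, w = project(np.array([3., 1.]), [np.array([1., 1.])], [np.array([1., -1.])])
print(u, w)   # [2. 2.] [ 1. -1.]
```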

Algorithm: Basis of Quotient Space

Input: Subspace $W$ of $V$

  1. Find a basis $\{w_1, \ldots, w_k\}$ of $W$
  2. Extend it to a basis $\{w_1, \ldots, w_k, v_1, \ldots, v_m\}$ of $V$
  3. Output: $\{v_1 + W, \ldots, v_m + W\}$ is a basis of $V/W$
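
And the quotient-basis algorithm in the same style (an illustrative numpy sketch, extending by standard basis vectors as in the proof of Theorem 2.3):

```python
# Representatives v_i with {v_i + W} a basis of R^n / W.
import numpy as np

def quotient_basis_reps(W_basis, n):
    rows = [np.asarray(w, dtype=float) for w in W_basis]
    reps = []
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        if np.linalg.matrix_rank(np.vstack(rows + [e])) > len(rows):
            rows.append(e)
            reps.append(e)   # e + W joins the quotient basis
    return reps

# Example 3.7: W = span{(1,0,1,0), (0,1,0,1)} in R^4
print(quotient_basis_reps([[1, 0, 1, 0], [0, 1, 0, 1]], 4))   # [e1, e2]
```
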
Remark 8.1: What Comes Next

In Part III, you'll learn:

  • Linear maps: Structure-preserving functions between vector spaces
  • Kernel and image: Measuring injectivity and surjectivity
  • Rank-nullity theorem: The fundamental dimension equation
  • Isomorphisms: When two spaces are "the same"
  • Matrix representation: Coordinates for linear maps

Conceptual Summary: Building vs Collapsing

Think of vector spaces like LEGO structures:

Direct Sum (Building)

  • Combines independent pieces
  • Like stacking LEGO blocks
  • Total = sum of parts
  • Each piece is recoverable (projection)

Quotient (Collapsing)

  • Identifies points that differ by W
  • Like crushing a dimension flat
  • Total = original minus collapsed
  • Loses information (only equivalence classes remain)

Summary of Key Results

| Theorem | Statement |
|---|---|
| Direct Sum Characterization | $V = U \oplus W$ ⟺ unique decomposition |
| Existence of Complements | Every subspace has a complement |
| Direct Sum Dimension | $\dim(U \oplus W) = \dim U + \dim W$ |
| Quotient Dimension | $\dim(V/W) = \dim V - \dim W$ |
| First Isomorphism | $V/\ker T \cong \text{im } T$ |
| Quotient ≅ Complement | $V/W \cong U$ for any complement $U$ |
Direct Sums & Quotients Practice (12 questions)

  1. If $V = U \oplus W$, what is $U \cap W$? (Easy)
  2. If $\dim(V) = 5$ and $\dim(W) = 2$ where $W \subseteq V$, what is $\dim(V/W)$? (Easy)
  3. In $\mathbb{R}^3$, let $W = \text{span}\{(1,0,0)\}$. What does $\mathbb{R}^3/W$ look like? (Medium)
  4. If $U$ and $W$ are subspaces with $V = U + W$ and $\dim(U \cap W) = 0$, then: (Medium)
  5. Does every subspace have a complement? (Medium)
  6. What is $[v] + [u]$ in $V/W$? (Easy)
  7. If $V = U \oplus W$ and $\dim U = 3$, $\dim W = 4$, what is $\dim V$? (Easy)
  8. In $\mathbb{R}^2$, which pairs are complementary subspaces? (Hard)
  9. What is the zero element of $V/W$? (Medium)
  10. If $v - u \in W$, what does this mean in $V/W$? (Medium)
  11. If $V = U_1 \oplus U_2 \oplus U_3$ with $\dim U_i = 2$, what is $\dim V$? (Easy)
  12. What is the projection onto $U$ along $W$ if $V = U \oplus W$? (Hard)

Frequently Asked Questions

What's the intuition for quotient spaces?

V/W 'collapses' W to a point. Elements of V/W are parallel copies of W. If W is a line through the origin, V/W consists of all lines parallel to W, each as a single 'point' in the quotient.

How is V/W different from a complement of W?

A complement U is a subspace of V with V = W ⊕ U. The quotient V/W is a different space altogether. But V/W ≅ U for any complement U.

Why do we need quotient spaces?

They're essential for the First Isomorphism Theorem: if T: V → W is linear, then V/ker(T) ≅ im(T). Quotients also appear in defining cosets in group theory and factor rings in algebra.

Is the complement of a subspace unique?

No! A subspace typically has infinitely many complements. For example, in ℝ², any line not equal to the x-axis is a complement of the x-axis.

What's the external direct sum?

Given spaces V and W (not necessarily subspaces of a common space), V ⊕ W is the set of pairs (v, w) with componentwise operations. Its dimension is dim(V) + dim(W).

How do I verify that V = U ⊕ W?

Check two things: (1) U + W = V (every vector is a sum), and (2) U ∩ W = {0} (only overlap is zero). Equivalently, show every v has a UNIQUE decomposition as u + w.

What is a projection?

If V = U ⊕ W, the projection onto U along W is the map π: V → U defined by π(u + w) = u. It satisfies π² = π (idempotent) and ker(π) = W, im(π) = U.

Why check well-definedness for quotient operations?

Operations on V/W are defined using representatives: [v] + [u] = [v + u]. We must verify this doesn't depend on which representatives we choose—different v', u' with [v'] = [v], [u'] = [u] must give [v' + u'] = [v + u].

How are direct sums and products related?

For finite families of spaces, the direct sum and direct product coincide. For infinite families, they differ: direct sum requires only finitely many non-zero components, while products allow all components to be non-zero.

What's the connection to eigenspace decomposition?

If a linear operator has enough eigenvectors, V decomposes as a direct sum of eigenspaces: V = E_{λ₁} ⊕ E_{λ₂} ⊕ ... This is the foundation of diagonalization.