
Basis & Dimension

A basis is a minimal spanning set and a maximal independent set. The number of elements in any basis is the dimension—the most important invariant of a vector space.

1. Basis

Definition 1.1: Basis

A basis of a vector space $V$ over a field $F$ is a set $\mathcal{B} \subseteq V$ that satisfies:

  1. Linear independence: no vector in $\mathcal{B}$ is a linear combination of the others
  2. Spanning: $\text{span}(\mathcal{B}) = V$
Theorem 1.1: Basis Equivalences

For a finite set $\mathcal{B} = \{v_1, \ldots, v_n\}$ in a vector space $V$, the following are equivalent:

  1. $\mathcal{B}$ is a basis for $V$
  2. Every $v \in V$ can be written uniquely as $v = \alpha_1 v_1 + \cdots + \alpha_n v_n$
  3. $\mathcal{B}$ is a maximal linearly independent set
  4. $\mathcal{B}$ is a minimal spanning set
Example 1.1: Standard Basis of ℝⁿ

The standard basis of $\mathbb{R}^n$ is:

$$\mathcal{E} = \{e_1, e_2, \ldots, e_n\}$$

where $e_i = (0, \ldots, 0, 1, 0, \ldots, 0)$ has a 1 in position $i$ and 0 elsewhere.

Example 1.2: Standard Basis of Polynomials

For $P_n(F)$, the space of polynomials of degree at most $n$, the standard basis is:

$$\{1, x, x^2, \ldots, x^n\}$$

This has $n + 1$ elements.
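
To make the coordinate picture concrete, here is a minimal Python sketch (the polynomial $p(x) = 2 - x + 3x^3$ is an arbitrary illustration): an element of $P_3(\mathbb{R})$ is determined by its $n + 1 = 4$ coefficients with respect to the standard basis.

```python
from numpy.polynomial import Polynomial

# P_3(R) with standard basis {1, x, x^2, x^3}: a polynomial is just its
# coefficient vector. Example (illustrative): p(x) = 2 - x + 3x^3.
p = Polynomial([2.0, -1.0, 0.0, 3.0])  # coefficients, lowest degree first

print(p(1.0))       # 4.0 = 2 - 1 + 0 + 3
print(len(p.coef))  # 4 coordinates, matching dim P_3 = 3 + 1 = 4
```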

Theorem 1.2: Existence of Basis

Every finite-dimensional vector space has a basis. Moreover, any linearly independent set can be extended to a basis, and any spanning set can be reduced to a basis.

Proof:

Start with any spanning set (which exists since the space is finite-dimensional). If it's not independent, remove redundant vectors until it is. The resulting set is a basis.

For extension: start with the independent set, and if it doesn't span, add vectors from a spanning set using the Steinitz Exchange Lemma.

Theorem 1.3: Uniqueness of Coefficients

If $\mathcal{B} = \{v_1, \ldots, v_n\}$ is a basis for $V$, then every vector $v \in V$ has a unique representation:

$$v = \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n$$

The scalars $\alpha_1, \ldots, \alpha_n$ are called the coordinates of $v$ with respect to $\mathcal{B}$.

Proof:

Existence follows from spanning. Uniqueness: if $v = \sum \alpha_i v_i = \sum \beta_i v_i$, then $\sum (\alpha_i - \beta_i) v_i = 0$. By independence, $\alpha_i = \beta_i$ for all $i$.
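
Numerically, finding these coordinates amounts to solving one linear system. A minimal numpy sketch (the basis and vector below are made-up examples):

```python
import numpy as np

# Columns of B form a basis of R^3: b1=(1,0,0), b2=(1,1,0), b3=(1,1,1).
B = np.column_stack([(1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (1.0, 1.0, 1.0)])

v = np.array([2.0, 3.0, 5.0])

# Theorem 1.3: the coordinates are the unique solution of B @ alpha = v.
alpha = np.linalg.solve(B, v)
print(alpha)                      # [-1. -2.  5.]
assert np.allclose(B @ alpha, v)  # v = -1*b1 - 2*b2 + 5*b3
```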

2. Dimension

Theorem 2.1: Dimension Theorem

All bases of a finite-dimensional vector space $V$ have the same number of elements.

Proof of Theorem 2.1:

Let $\mathcal{B}_1 = \{v_1, \ldots, v_m\}$ and $\mathcal{B}_2 = \{w_1, \ldots, w_n\}$ be two bases.

Since $\mathcal{B}_1$ is independent and $\mathcal{B}_2$ spans, the Steinitz exchange lemma (an independent set is never larger than a spanning set) gives $m \leq n$.

Since $\mathcal{B}_2$ is independent and $\mathcal{B}_1$ spans, $n \leq m$.

Therefore $m = n$.

Definition 2.1: Dimension

The dimension of a finite-dimensional vector space $V$, denoted $\dim(V)$, is the number of elements in any basis of $V$.

If $V$ has no finite basis, we say $V$ is infinite-dimensional.

Example 2.1: Dimensions of Common Spaces
  • $\dim(\mathbb{R}^n) = n$
  • $\dim(P_n(F)) = n + 1$
  • $\dim(M_{m \times n}(F)) = mn$
  • $\dim(\{0\}) = 0$ (the empty set is a basis)
  • $\dim(F[x]) = \infty$ (infinite-dimensional)
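
In coordinates, such dimension counts reduce to a rank computation: the dimension of a span equals the rank of the matrix whose rows are the spanning vectors. A small numpy sketch (the vectors are an arbitrary example):

```python
import numpy as np

# dim(span S) = rank of the matrix whose rows are the vectors of S.
S = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # redundant: 2 times the first row
              [0.0, 1.0, 1.0]])

print(np.linalg.matrix_rank(S))  # 2: the span is a plane in R^3
```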
Theorem 2.2: Dimension of Subspaces

If $W$ is a subspace of a finite-dimensional vector space $V$, then:

  1. $\dim(W) \leq \dim(V)$
  2. $\dim(W) = \dim(V)$ if and only if $W = V$
Theorem 2.3: Dimension of Sum of Subspaces

If $U$ and $W$ are subspaces of a finite-dimensional vector space $V$, then:

$$\dim(U + W) = \dim(U) + \dim(W) - \dim(U \cap W)$$
Proof:

Let $\{u_1, \ldots, u_k\}$ be a basis for $U \cap W$. Extend it to bases $\{u_1, \ldots, u_k, v_1, \ldots, v_m\}$ for $U$ and $\{u_1, \ldots, u_k, w_1, \ldots, w_n\}$ for $W$.

Then $\{u_1, \ldots, u_k, v_1, \ldots, v_m, w_1, \ldots, w_n\}$ spans $U + W$. It is also independent: if a combination equals zero, the $w$-part equals a vector of $U$, hence lies in $U \cap W$; expressing it in the $u_i$ and using independence in $W$ forces its coefficients to vanish, and independence in $U$ then kills the rest.

So $\dim(U + W) = k + m + n = (k + m) + (k + n) - k = \dim(U) + \dim(W) - \dim(U \cap W)$.
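
The formula is easy to check numerically: dimensions are ranks, and $U + W$ is spanned by the union of the two spanning sets. A sketch with made-up subspaces of $\mathbb{R}^4$:

```python
import numpy as np

# Rows span two subspaces of R^4: U = span{e1, e2}, W = span{e2, e3}.
U = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])

dim_U = np.linalg.matrix_rank(U)
dim_W = np.linalg.matrix_rank(W)
dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))  # U + W = span of all rows

# Theorem 2.3, rearranged, gives the dimension of the intersection.
dim_cap = dim_U + dim_W - dim_sum
print(dim_U, dim_W, dim_sum, dim_cap)  # 2 2 3 1 (U ∩ W = span{e2})
```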

Corollary 2.1: Dimension of Direct Sum

If $V = U \oplus W$, then $\dim(V) = \dim(U) + \dim(W)$ (since $U \cap W = \{0\}$ has dimension 0).

3. Direct Sums

Definition 3.1: Sum of Subspaces

Let $U, W$ be subspaces of $V$. Their sum is:

$$U + W = \{u + w : u \in U, w \in W\}$$
Definition 3.2: Internal Direct Sum

We say $V$ is the internal direct sum of subspaces $U$ and $W$, written:

$$V = U \oplus W$$

if both conditions hold:

  1. $V = U + W$ (the sum spans $V$)
  2. $U \cap W = \{0\}$ (trivial intersection)
Theorem 3.1: Direct Sum Characterization

$V = U \oplus W$ if and only if every $v \in V$ can be written uniquely as:

$$v = u + w \quad \text{with } u \in U,\ w \in W$$
Example 3.1: Standard Decomposition of ℝ²

In $\mathbb{R}^2$, let:

  • $U = \{(x, 0) : x \in \mathbb{R}\}$ (the x-axis)
  • $W = \{(0, y) : y \in \mathbb{R}\}$ (the y-axis)

Then $\mathbb{R}^2 = U \oplus W$.
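
Numerically, for subspaces of $F^n$ given by bases, $V = U \oplus W$ is equivalent to the combined basis matrix being square and invertible; solving with it recovers the unique decomposition $v = u + w$. A sketch for the example above:

```python
import numpy as np

# Basis vectors of U (x-axis) and W (y-axis), as columns.
U = np.array([[1.0], [0.0]])
W = np.array([[0.0], [1.0]])

M = np.hstack([U, W])  # columns: basis of U followed by basis of W
# R^2 = U (+) W exactly when M is square with full rank.
assert M.shape[0] == M.shape[1] == np.linalg.matrix_rank(M)

v = np.array([3.0, -2.0])
coeffs = np.linalg.solve(M, v)  # unique, since M is invertible
u = U @ coeffs[:1]              # component in U: (3, 0)
w = W @ coeffs[1:]              # component in W: (0, -2)
assert np.allclose(u + w, v)    # the unique decomposition of Theorem 3.1
```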

Theorem 3.2: Dimension of Direct Sum

If $V = U \oplus W$, then:

$$\dim(V) = \dim(U) + \dim(W)$$

4. Quotient Spaces

Definition 4.1: Equivalence Relation

Let $W$ be a subspace of $V$. Define $v \sim u$ if and only if $v - u \in W$.

This is an equivalence relation:

  • Reflexive: $v - v = 0 \in W$
  • Symmetric: $v - u \in W \implies u - v = -(v - u) \in W$
  • Transitive: $v - u \in W$ and $u - w \in W \implies v - w = (v - u) + (u - w) \in W$
Definition 4.2: Quotient Space

The quotient space $V/W$ is the set of all cosets:

$$V/W = \{v + W : v \in V\}$$

with vector space operations:

$$(v + W) + (u + W) = (v + u) + W$$
$$\alpha(v + W) = (\alpha v) + W$$
Theorem 4.1: Well-Definedness

The operations on $V/W$ are well-defined: they do not depend on the choice of coset representatives.

Proof of Theorem 4.1:

Suppose $v + W = v' + W$ and $u + W = u' + W$.

Then $v - v' \in W$ and $u - u' \in W$.

Addition: $(v + u) - (v' + u') = (v - v') + (u - u') \in W$.

So $(v + u) + W = (v' + u') + W$. ✓

Scalar multiplication: $\alpha v - \alpha v' = \alpha(v - v') \in W$.

So $\alpha v + W = \alpha v' + W$. ✓

Theorem 4.2: Dimension of Quotient Space

If $W$ is a subspace of a finite-dimensional vector space $V$, then:

$$\dim(V/W) = \dim(V) - \dim(W)$$
Proof:

Let $\{w_1, \ldots, w_k\}$ be a basis for $W$. Extend it to a basis $\{w_1, \ldots, w_k, v_1, \ldots, v_n\}$ for $V$.

Then $\{[v_1], \ldots, [v_n]\}$ is a basis for $V/W$:

  • Spanning: any $[v] \in V/W$ can be written $v = \sum \alpha_i w_i + \sum \beta_j v_j$, so $[v] = \sum \beta_j [v_j]$.
  • Independence: if $\sum \beta_j [v_j] = [0]$, then $\sum \beta_j v_j \in W$, so $\sum \beta_j v_j = \sum \alpha_i w_i$. By independence of the full basis, all $\beta_j = 0$.

Therefore, $\dim(V/W) = n = (k + n) - k = \dim(V) - \dim(W)$.

Example 4.1: Quotient Space Example

Let $V = \mathbb{R}^3$ and $W = \{(x, y, 0) : x, y \in \mathbb{R}\}$ (the xy-plane).

Then $V/W$ consists of all planes parallel to the xy-plane. Each such plane is a single "point" in $V/W$.

A basis for $V/W$ is $\{[(0, 0, 1)]\}$, so $\dim(V/W) = 1$.
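
A tiny computational illustration of this example (the vectors are chosen arbitrarily): two vectors of $\mathbb{R}^3$ represent the same coset of $W$ exactly when their difference lies in the xy-plane, so a coset is pinned down by the $z$-coordinate alone.

```python
import numpy as np

# W = xy-plane in R^3: v and u lie in the same coset of W
# exactly when v - u is in W, i.e. has zero z-component.
def same_coset(v, u):
    return np.isclose(v[2] - u[2], 0.0)

v = np.array([1.0, 4.0, 7.0])
u = np.array([-2.0, 9.0, 7.0])
print(same_coset(v, u))  # True: both lie on the plane z = 7

# The coordinate of [v] in the basis {[(0,0,1)]} is just z, matching
# dim(V/W) = 3 - 2 = 1.
print(v[2])  # 7.0
```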

Example 4.2: Quotient by Kernel

For a linear map $T: V \to W$, the quotient space $V/\ker(T)$ is isomorphic to $\text{im}(T)$ (this is the First Isomorphism Theorem, to be covered later).

5. Coordinate Systems and Change of Basis

Once we fix a basis, every vector has a unique coordinate representation. Changing the basis changes the coordinates, and understanding this relationship is crucial for many applications.

Definition 5.1: Coordinate Vector

Let $\mathcal{B} = \{v_1, \ldots, v_n\}$ be a basis for $V$. For $v = \sum_{i=1}^n \alpha_i v_i$, the coordinate vector of $v$ with respect to $\mathcal{B}$ is:

$$[v]_{\mathcal{B}} = \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{pmatrix}$$
Example 5.1: Standard Coordinates

In $\mathbb{R}^3$ with standard basis $\mathcal{E} = \{e_1, e_2, e_3\}$, the vector $(2, 3, 5)$ has coordinate vector:

$$[(2, 3, 5)]_{\mathcal{E}} = \begin{pmatrix} 2 \\ 3 \\ 5 \end{pmatrix}$$
Definition 5.2: Change of Basis Matrix

Let $\mathcal{B} = \{v_1, \ldots, v_n\}$ and $\mathcal{C} = \{w_1, \ldots, w_n\}$ be two bases for $V$. The change of basis matrix from $\mathcal{B}$ to $\mathcal{C}$ is:

$$P_{\mathcal{B} \to \mathcal{C}} = \begin{pmatrix} | & | & & | \\ [v_1]_{\mathcal{C}} & [v_2]_{\mathcal{C}} & \cdots & [v_n]_{\mathcal{C}} \\ | & | & & | \end{pmatrix}$$

That is, the columns are the coordinate vectors of the $\mathcal{B}$ basis vectors with respect to $\mathcal{C}$.

Theorem 5.1: Change of Coordinates Formula

For any vector $v \in V$:

$$[v]_{\mathcal{C}} = P_{\mathcal{B} \to \mathcal{C}} [v]_{\mathcal{B}}$$
Proof:

If $v = \sum \alpha_i v_i$, then, since taking $\mathcal{C}$-coordinates is linear:

$$[v]_{\mathcal{C}} = \sum \alpha_i [v_i]_{\mathcal{C}} = P_{\mathcal{B} \to \mathcal{C}} \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{pmatrix} = P_{\mathcal{B} \to \mathcal{C}} [v]_{\mathcal{B}}$$
Theorem 5.2: Properties of Change of Basis Matrix
  1. $P_{\mathcal{B} \to \mathcal{C}}$ is invertible
  2. $P_{\mathcal{C} \to \mathcal{B}} = (P_{\mathcal{B} \to \mathcal{C}})^{-1}$
  3. $P_{\mathcal{B} \to \mathcal{D}} = P_{\mathcal{C} \to \mathcal{D}} \, P_{\mathcal{B} \to \mathcal{C}}$ (composition)
Example 5.2: Change of Basis in ℝ²

Let $\mathcal{B} = \{(1, 0), (0, 1)\}$ (standard) and $\mathcal{C} = \{(1, 1), (1, -1)\}$.

To find $P_{\mathcal{B} \to \mathcal{C}}$, express $(1, 0)$ and $(0, 1)$ in terms of $\mathcal{C}$:

$$(1, 0) = \tfrac{1}{2}(1, 1) + \tfrac{1}{2}(1, -1), \quad (0, 1) = \tfrac{1}{2}(1, 1) - \tfrac{1}{2}(1, -1)$$

So:

$$P_{\mathcal{B} \to \mathcal{C}} = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & -1/2 \end{pmatrix}$$
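
Since $\mathcal{B}$ is the standard basis here, $P_{\mathcal{B} \to \mathcal{C}}$ is just the inverse of the matrix whose columns are the $\mathcal{C}$ vectors (because $[v]_{\mathcal{C}}$ solves $C\,[v]_{\mathcal{C}} = v$). A numpy sketch of this example:

```python
import numpy as np

# Columns of C are the basis C = {(1,1), (1,-1)}.
C = np.column_stack([(1.0, 1.0), (1.0, -1.0)])

# [v]_C solves C @ [v]_C = v, so with B standard, P_{B->C} = C^{-1}.
P = np.linalg.inv(C)
print(P)  # [[ 0.5  0.5]
          #  [ 0.5 -0.5]]

v = np.array([3.0, 1.0])
v_C = P @ v
print(v_C)                     # [2. 1.]: v = 2*(1,1) + 1*(1,-1)
assert np.allclose(C @ v_C, v)
```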

6. Applications of Direct Sums and Quotients

Direct sums and quotient spaces are powerful tools for decomposing vector spaces and understanding their structure. They appear naturally in many areas of mathematics.

Theorem 6.1: Complementary Subspaces

For any subspace $W$ of a finite-dimensional vector space $V$, there exists a subspace $U$ such that $V = U \oplus W$. Such a $U$ is called a complement of $W$.

Proof:

Let $\{w_1, \ldots, w_k\}$ be a basis for $W$. Extend it to a basis $\{w_1, \ldots, w_k, v_1, \ldots, v_n\}$ for $V$.

Then $U = \text{span}\{v_1, \ldots, v_n\}$ is a complement of $W$.
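
This proof is constructive, and for subspaces of $F^n$ it can be run directly: greedily add standard basis vectors that raise the rank until the set is a basis. A sketch (the helper `complement_basis` and the subspace $W$ are illustrative, not a library function):

```python
import numpy as np

def complement_basis(W_basis, n):
    """Extend a basis of W (rows, assumed independent) to a basis of R^n
    by adding standard basis vectors that raise the rank, as in the
    proof of Theorem 6.1. The added vectors span a complement U of W."""
    rows = list(W_basis)
    added = []
    for e in np.eye(n):
        if np.linalg.matrix_rank(np.vstack(rows + [e])) > len(rows):
            rows.append(e)
            added.append(e)
    return np.array(added)

# W = span{(1, 1, 0)} in R^3; this picks the complement span{e1, e3}.
U_basis = complement_basis(np.array([[1.0, 1.0, 0.0]]), 3)
print(U_basis)  # [[1. 0. 0.]
                #  [0. 0. 1.]]
```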

Remark 6.1: Non-Uniqueness of Complements

Complements are not unique. For example, in $\mathbb{R}^2$, any line through the origin (other than the x-axis) is a complement of the x-axis.

Theorem 6.2: Isomorphism via Quotient

If $V = U \oplus W$, then $V/W \cong U$ (isomorphic as vector spaces).

Proof:

Define $\phi: U \to V/W$ by $\phi(u) = [u]$ (the coset of $u$).

This map is linear and bijective: every coset $[v]$ has a unique representative in $U$ (since $v = u + w$ uniquely with $u \in U$, $w \in W$).

Example 6.1: Decomposing Function Spaces

Let $V = C[0, 1]$ (continuous functions) and $W = \{f : f(0) = 0\}$.

Then $U = \{f : f \text{ is constant}\}$ is a complement of $W$, and $V/W \cong U \cong \mathbb{R}$.

Example 6.2: Direct Sum Decomposition of Matrices

In $M_n(F)$ (assuming the characteristic of $F$ is not 2, so that we may divide by 2), we have:

$$M_n(F) = \text{Sym}_n(F) \oplus \text{Skew}_n(F)$$

where $\text{Sym}_n(F)$ is the subspace of symmetric matrices and $\text{Skew}_n(F)$ the subspace of skew-symmetric matrices.

Any matrix $A$ decomposes as:

$$A = \frac{A + A^T}{2} + \frac{A - A^T}{2}$$
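
This decomposition is directly computable. A numpy sketch (the matrix $A$ is an arbitrary example over $\mathbb{R}$):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

S = (A + A.T) / 2  # symmetric part:      S.T == S
K = (A - A.T) / 2  # skew-symmetric part: K.T == -K

assert np.allclose(S, S.T) and np.allclose(K, -K.T)
assert np.allclose(S + K, A)   # A = S + K, uniquely (Example 6.2)
print(S)  # [[1. 3.]
          #  [3. 3.]]
print(K)  # [[ 0. -1.]
          #  [ 1.  0.]]
```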
Theorem 6.3: Quotient and Dimension

The quotient space $V/W$ has dimension equal to the "codimension" of $W$ in $V$, namely $\dim(V) - \dim(W)$. This measures how many independent "directions" of $V$ remain after collapsing $W$ (loosely speaking, the directions transverse to $W$).

Example 6.3: Quotient in Linear Algebra

For solving $Ax = b$, the solution set (if non-empty) is the coset $x_0 + \ker(A)$, where $x_0$ is any particular solution; that is, the solution set is a single element of the quotient $\mathbb{R}^n/\ker(A)$.
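
A numpy sketch of this coset structure (the system below is a made-up example with a 1-dimensional kernel): a particular solution plus anything in $\ker(A)$ is again a solution.

```python
import numpy as np

# Ax = b with b in im(A): the solution set is the coset x0 + ker(A).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 2.0])

# One particular solution x0 (least squares returns one, since b is in im(A)).
x0, *_ = np.linalg.lstsq(A, b, rcond=None)

# A basis of ker(A) from the SVD: right-singular vectors past the rank.
rank = np.linalg.matrix_rank(A)
_, _, Vt = np.linalg.svd(A)
kernel = Vt[rank:]  # here a single row, spanning the 1-dimensional kernel

# Every point of the coset x0 + ker(A) solves the system.
for t in (-1.0, 0.0, 2.5):
    assert np.allclose(A @ (x0 + t * kernel[0]), b)
```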

Frequently Asked Questions

Why is dimension well-defined?

The Steinitz exchange lemma shows that if you have a spanning set of size n and an independent set of size m, then m ≤ n. Applying this both ways to two bases shows they have the same size.

What's the difference between a basis and a spanning set?

A spanning set may have redundant vectors. A basis is a minimal spanning set—remove any vector and it no longer spans. Equivalently, it's a maximal independent set.

How do I find the dimension of a subspace?

Find a basis (e.g., by row reducing the matrix whose rows generate the subspace) and count the basis vectors. Or use the rank of the associated matrix.

What's the intuition for quotient spaces?

V/W "collapses" W to a point. Elements of V/W are the parallel copies of W. If W is a line through the origin, then V/W consists of all lines parallel to W, each counted as a single "point" of the quotient.

How do I verify that V = U ⊕ W?

Check two things: (1) U + W = V (every vector is a sum), and (2) U ∩ W = {0} (only overlap is zero). Equivalently, show every v has a UNIQUE decomposition as u + w.

Basis & Dimension Practice (10 questions)

  1. What is $\dim(\mathbb{R}^5)$? (Easy)
  2. What is $\dim(M_{2 \times 3}(\mathbb{R}))$? (Easy)
  3. What is $\dim(\mathbb{R}_3[x])$ (polynomials of degree $\leq 3$)? (Easy)
  4. If $\dim(V) = n$, how many vectors are in any basis of $V$? (Medium)
  5. If $V = U \oplus W$, what is $U \cap W$? (Easy)
  6. If $\dim(V) = 5$ and $\dim(W) = 2$ where $W \subseteq V$, what is $\dim(V/W)$? (Easy)
  7. Can 3 vectors form a basis for $\mathbb{R}^4$? (Medium)
  8. If $U$ and $W$ are subspaces with $V = U + W$ and $\dim(U \cap W) = 0$, then: (Medium)
  9. What is $[v] + [u]$ in $V/W$? (Easy)
  10. If $V = U \oplus W$ and $\dim U = 3$, $\dim W = 4$, what is $\dim V$? (Easy)