
Subspaces & Linear Independence

Subspaces are vector spaces contained within larger vector spaces. Linear independence captures when vectors are genuinely different—none is redundant.

1. Subspaces

Definition 1.1: Subspace

A subset $W$ of a vector space $V$ over a field $F$ is a subspace if $W$ is itself a vector space over $F$ under the operations inherited from $V$.

Theorem 1.1: Subspace Criterion

A non-empty subset $W$ of a vector space $V$ is a subspace if and only if:

  1. For all $u, v \in W$, $u + v \in W$ (closed under addition)
  2. For all $v \in W$ and $\alpha \in F$, $\alpha v \in W$ (closed under scalar multiplication)
Proof:

(⇒) If $W$ is a subspace, it satisfies all vector space axioms, including closure.

(⇐) If closure holds, we verify:

  • Zero vector: $0 = 0 \cdot v \in W$ for any $v \in W$
  • Additive inverses: $-v = (-1) \cdot v \in W$
  • All other axioms are inherited from $V$
Example 1.1: Subspace Examples
  • $\{(x, y) \in \mathbb{R}^2 : x = 2y\}$ is a subspace of $\mathbb{R}^2$ (spot-checked numerically below)
  • The set of all $n \times n$ symmetric matrices is a subspace of $M_n(\mathbb{R})$
  • The set of polynomials of degree at most $n$ is a subspace of $\mathbb{R}[x]$
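The subspace criterion lends itself to a quick numerical spot-check. The following minimal sketch (our own illustration, assuming NumPy is available; `in_W` is a name we introduce) samples points from the line $x = 2y$ and confirms both closure conditions:

```python
import numpy as np

rng = np.random.default_rng(0)

def in_W(p, tol=1e-9):
    """Membership test for W = {(x, y) : x = 2y}."""
    x, y = p
    return abs(x - 2 * y) < tol

# Points of W have the form (2t, t).
for _ in range(1000):
    u = np.array([2.0, 1.0]) * rng.normal()
    v = np.array([2.0, 1.0]) * rng.normal()
    alpha = rng.normal()
    assert in_W(u + v)        # closed under addition
    assert in_W(alpha * u)    # closed under scalar multiplication
print("closure checks passed")
```

A passing check is of course not a proof; it only corroborates the two-line algebraic argument.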
Example 1.2: More Subspace Examples
  • The set of all $n \times n$ matrices with trace zero is a subspace of $M_n(F)$ (the trace is defined only for square matrices)
  • The set of continuous functions $C[a, b]$ is a subspace of the vector space of all functions from $[a, b]$ to $\mathbb{R}$
  • The set of solutions to a homogeneous linear system $Ax = 0$ is a subspace of $F^n$ (see the sketch after this list)
  • The set of all upper triangular $n \times n$ matrices is a subspace of $M_n(F)$
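The solution-set example can be probed the same way. Here is a sketch (our own, assuming NumPy) that samples the null space of a concrete matrix via the SVD and verifies closure of the solution set of $Ax = 0$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # rank 2, so its null space is a line in R^3

# Rows of Vt beyond the rank span the null space of A.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]             # here: a single basis vector

u = null_basis.T @ rng.normal(size=null_basis.shape[0])
v = null_basis.T @ rng.normal(size=null_basis.shape[0])
alpha = rng.normal()

assert np.allclose(A @ (u + v), 0)      # a sum of solutions solves Ax = 0
assert np.allclose(A @ (alpha * u), 0)  # so does any scalar multiple
print("solution set of Ax = 0 is closed under both operations")
```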
Theorem 1.2: One-Step Subspace Test

A non-empty subset $W$ of a vector space $V$ is a subspace if and only if for all $u, v \in W$ and $\alpha \in F$, we have:

$\alpha u + v \in W$
Proof:

(⇒) If $W$ is a subspace, it is closed under addition and scalar multiplication, so $\alpha u + v \in W$.

(⇐) Suppose $\alpha u + v \in W$ for all $u, v \in W$ and $\alpha \in F$. Since $W$ is non-empty, pick any $u \in W$:

  • Set $\alpha = -1$ and $v = u$ to get $-u + u = 0 \in W$ (zero vector)
  • With $0 \in W$ established, set $v = 0$ to get $\alpha u + 0 = \alpha u \in W$ (closure under scalar multiplication); in particular, $\alpha = -1$ gives $-u \in W$ (additive inverses)
  • Set $\alpha = 1$ to get $1 \cdot u + v = u + v \in W$ (closure under addition)
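The one-step test also gives the cleanest computational check: a single predicate on $\alpha u + v$. Below is a sketch of such a randomized tester (our own helper, assuming NumPy), applied to the symmetric matrices from Example 1.1:

```python
import numpy as np

rng = np.random.default_rng(2)

def one_step_test(sample_W, in_W, trials=1000):
    """Randomized one-step subspace check: is alpha*u + v always in W?"""
    for _ in range(trials):
        u, v = sample_W(), sample_W()
        alpha = rng.normal()
        if not in_W(alpha * u + v):
            return False
    return True

# W = symmetric 2x2 matrices, a subspace of M_2(R).
def sample_W():
    M = rng.normal(size=(2, 2))
    return M + M.T                     # M + M^T is always symmetric

def in_W(M):
    return np.allclose(M, M.T)

print(one_step_test(sample_W, in_W))   # expected: True
```

Random sampling can only falsify, never prove, but it is a handy sanity check before writing the algebraic argument.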
Definition 1.2: Linear Span

The linear span (or just span) of a set $S = \{v_1, \ldots, v_k\}$ is the set of all linear combinations:

$\text{span}(S) = \left\{\sum_{i=1}^k \alpha_i v_i : \alpha_i \in F\right\}$
Theorem 1.3: Span is a Subspace

For any set $S$ in a vector space $V$, $\text{span}(S)$ is a subspace of $V$.

Example 1.3: Span Examples
  • $\text{span}\{(1, 0), (0, 1)\} = \mathbb{R}^2$
  • $\text{span}\{1, x, x^2\}$ is the space of polynomials of degree at most 2
  • $\text{span}\{0\} = \{0\}$
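Deciding whether a given vector lies in a span is a rank computation: $v \in \text{span}(S)$ exactly when appending $v$ as a column leaves the rank unchanged. A minimal NumPy sketch (the function name is ours):

```python
import numpy as np

def in_span(S, v, tol=1e-10):
    """True iff v lies in the column span of the matrix S."""
    return (np.linalg.matrix_rank(np.column_stack([S, v]), tol)
            == np.linalg.matrix_rank(S, tol))

S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])                     # columns span the xy-plane in R^3
print(in_span(S, np.array([2.0, -3.0, 0.0])))  # True: lies in the plane
print(in_span(S, np.array([0.0, 0.0, 1.0])))   # False: leaves the plane
```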
Theorem 1.4: Smallest Subspace Containing a Set

For any set $S$ in a vector space $V$, $\text{span}(S)$ is the smallest subspace of $V$ containing $S$. That is, if $W$ is any subspace containing $S$, then $\text{span}(S) \subseteq W$.

Proof:

Since $W$ is a subspace containing $S$, it is closed under addition and scalar multiplication, so it contains every linear combination of vectors in $S$. Therefore, $\text{span}(S) \subseteq W$.

2. Linear Independence

Definition 2.1: Linear Dependence and Independence

A set of vectors $\{v_1, \ldots, v_k\}$ is linearly dependent if there exist scalars $\alpha_1, \ldots, \alpha_k$, not all zero, such that:

$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_k v_k = 0$

A set is linearly independent if it is not linearly dependent, i.e., the only solution to the above equation is $\alpha_1 = \alpha_2 = \cdots = \alpha_k = 0$.

Example 2.1: Independent Vectors

The vectors $(1, 0)$ and $(0, 1)$ in $\mathbb{R}^2$ are linearly independent:

If $a(1, 0) + b(0, 1) = (0, 0)$, then $(a, b) = (0, 0)$, so $a = b = 0$.

Example 2.2: Dependent Vectors

The vectors $(1, 2, 3)$, $(4, 5, 6)$, $(7, 8, 9)$ are linearly dependent:

Note that $(7, 8, 9) = 2(4, 5, 6) - (1, 2, 3)$, so:

$-1 \cdot (1, 2, 3) + 2 \cdot (4, 5, 6) - 1 \cdot (7, 8, 9) = 0$
Theorem 2.1: Properties of Linear Independence

Let $\{v_1, \ldots, v_k\}$ be a set of vectors.

  1. If the set contains the zero vector, it is linearly dependent
  2. If one vector is a scalar multiple of another, the set is linearly dependent
  3. Any subset of a linearly independent set is linearly independent
  4. If $v \in \text{span}\{v_1, \ldots, v_k\}$ and the $v_i$ are independent, then $\{v_1, \ldots, v_k, v\}$ is dependent
Proof of Theorem 2.1 (Property 1):

If $0 \in \{v_1, \ldots, v_k\}$, say $v_1 = 0$, then:

$1 \cdot v_1 + 0 \cdot v_2 + \cdots + 0 \cdot v_k = 1 \cdot 0 = 0$

This is a non-trivial combination (the coefficient of $v_1$ is $1 \neq 0$), so the set is dependent.

Theorem 2.2: Computational Test for Linear Independence

Vectors $v_1, \ldots, v_k$ in $F^n$ are linearly independent if and only if the matrix $A = [v_1 \mid v_2 \mid \cdots \mid v_k]$ (with these vectors as columns) has full column rank, i.e., $\text{rank}(A) = k$.

Proof:

The vectors are independent iff the homogeneous system $Ax = 0$ has only the trivial solution, which occurs iff every column has a pivot (no free variables), i.e., $\text{rank}(A) = k$.

Example 2.3: Testing Independence via Row Reduction

Test whether $v_1 = (1, 2, 3)$, $v_2 = (4, 5, 6)$, $v_3 = (7, 8, 9)$ are independent:

Form the matrix and row reduce:

$\begin{pmatrix} 1 & 4 & 7 \\ 2 & 5 & 8 \\ 3 & 6 & 9 \end{pmatrix} \to \begin{pmatrix} 1 & 4 & 7 \\ 0 & -3 & -6 \\ 0 & 0 & 0 \end{pmatrix}$

Only 2 pivots, so $\text{rank} = 2 < 3$. The vectors are dependent.
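This rank criterion is exactly what `numpy.linalg.matrix_rank` computes, so the hand row-reduction can be confirmed in a few lines (a sketch assuming NumPy; `are_independent` is our own name):

```python
import numpy as np

def are_independent(*vectors):
    """Independent iff the matrix with these columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([4.0, 5.0, 6.0])
v3 = np.array([7.0, 8.0, 9.0])
print(are_independent(v1, v2, v3))  # False: rank 2 < 3, as in Example 2.3
print(are_independent(v1, v2))      # True: any two of these are independent
```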

Remark 2.1: Characterization of Dependence

A set $\{v_1, \ldots, v_k\}$ is linearly dependent if and only if at least one vector can be written as a linear combination of the others. This provides an alternative characterization of dependence.

3. Span and Independence

Theorem 3.1: Fundamental Bound

In any vector space, if $\{v_1, \ldots, v_m\}$ is linearly independent and $\{w_1, \ldots, w_n\}$ spans the space, then $m \leq n$.

Remark 3.1: Maximal Independent Sets

This theorem shows that linearly independent sets cannot exceed the size of spanning sets. A basis is a set that is both linearly independent and spanning—it achieves the maximum size for an independent set.

Theorem 3.2: Uniqueness of Representation in Span

If $\{v_1, \ldots, v_k\}$ is linearly independent, then every vector in $\text{span}\{v_1, \ldots, v_k\}$ can be written as a linear combination of the $v_i$ in exactly one way.

Proof:

Suppose $v = \sum \alpha_i v_i = \sum \beta_i v_i$. Then:

$\sum (\alpha_i - \beta_i) v_i = 0$

By independence, $\alpha_i - \beta_i = 0$ for all $i$, so $\alpha_i = \beta_i$.
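When the $v_i$ are independent columns of a matrix $B$, these unique coefficients are the solution of a linear system. A short sketch (our own example data, assuming NumPy); `lstsq` returns the exact coordinates here because the columns are independent and $v$ lies in their span:

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])        # independent columns b1, b2 in R^3
v = np.array([3.0, 5.0, 2.0])     # v = 3*b1 + 2*b2, so v is in span(B)

coeffs = np.linalg.lstsq(B, v, rcond=None)[0]
print(coeffs)                     # [3. 2.] -- the only possible coordinates
assert np.allclose(B @ coeffs, v)
```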

Corollary 3.1: Dependence and Redundancy

A set $\{v_1, \ldots, v_k\}$ is linearly dependent if and only if at least one vector is in the span of the others, i.e., there exists $i$ such that $v_i \in \text{span}\{v_j : j \neq i\}$.

4. Steinitz Exchange Lemma

The Steinitz Exchange Lemma is a fundamental result that allows us to replace vectors in a spanning set with vectors from an independent set, maintaining the spanning property. This lemma is crucial for proving that all bases of a finite-dimensional vector space have the same size.

Theorem 4.1: Steinitz Exchange Lemma

Let $V$ be a vector space. If $\{u_1, \ldots, u_m\}$ spans $V$ and $\{v_1, \ldots, v_n\}$ is linearly independent, then:

  1. $n \leq m$
  2. After reordering the $u_i$ if necessary, the set $\{v_1, \ldots, v_n, u_{n+1}, \ldots, u_m\}$ spans $V$.
Proof:

We proceed by induction on $n$.

Base case ($n = 0$): Trivial (the empty set is independent, and $\{u_1, \ldots, u_m\}$ spans $V$).

Inductive step: Assume the result holds for $n - 1$: after reordering the $u_i$, the set $\{v_1, \ldots, v_{n-1}, u_n, \ldots, u_m\}$ spans $V$ (in particular, $n - 1 \leq m$).

Expand $v_n$ in this spanning set:

$v_n = \sum_{i=1}^{n-1} \beta_i v_i + \sum_{j=n}^{m} \gamma_j u_j$

Since $\{v_1, \ldots, v_n\}$ is independent, $v_n$ is not in the span of $\{v_1, \ldots, v_{n-1}\}$, so at least one $\gamma_j \neq 0$. In particular, some $u_j$ must still be present, which forces $n \leq m$; this proves (1). After reordering, assume $\gamma_n \neq 0$.

Solving for $u_n$ gives $u_n = \frac{1}{\gamma_n}\left(v_n - \sum_{i=1}^{n-1} \beta_i v_i - \sum_{j=n+1}^{m} \gamma_j u_j\right)$, so $u_n \in \text{span}\{v_1, \ldots, v_n, u_{n+1}, \ldots, u_m\}$.

Every vector of the old spanning set $\{v_1, \ldots, v_{n-1}, u_n, \ldots, u_m\}$ therefore lies in $\text{span}\{v_1, \ldots, v_n, u_{n+1}, \ldots, u_m\}$, so the latter set spans $V$. This proves (2).

Corollary 4.1: All Bases Have Same Size

In a finite-dimensional vector space, all bases have the same number of elements. This common number is called the dimension of the vector space.

Proof:

Let $\mathcal{B}_1$ and $\mathcal{B}_2$ be two bases. Since $\mathcal{B}_1$ spans and $\mathcal{B}_2$ is independent, Steinitz gives $|\mathcal{B}_2| \leq |\mathcal{B}_1|$. Reversing the roles gives $|\mathcal{B}_1| \leq |\mathcal{B}_2|$. Therefore, $|\mathcal{B}_1| = |\mathcal{B}_2|$.

Example 4.1: Exchange Process

Let $V = \mathbb{R}^3$, let $\{u_1 = (1,0,0),\ u_2 = (0,1,0),\ u_3 = (0,0,1)\}$ span $V$, and let $\{v_1 = (1,1,0),\ v_2 = (0,1,1)\}$ be independent.

We can write $v_1 = 1 \cdot u_1 + 1 \cdot u_2 + 0 \cdot u_3$. Exchange $u_1$ for $v_1$:

$\{v_1, u_2, u_3\}$ spans $V$.

Now $v_2 = 0 \cdot v_1 + 1 \cdot u_2 + 1 \cdot u_3$. Exchange $u_2$ for $v_2$:

$\{v_1, v_2, u_3\}$ spans $V$.
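The example's exchange steps are mechanical enough to automate. The sketch below (our own code, assuming NumPy) replays the lemma: for each independent vector it expands it over the current spanning list, picks a not-yet-exchanged vector with a nonzero coefficient (independence guarantees one exists), and swaps it out, asserting after each step that the list still spans:

```python
import numpy as np

def steinitz_exchange(U, Vs):
    """Exchange members of the spanning list U for the independent list Vs.

    U, Vs: lists of 1-D arrays in R^n, with U spanning R^n.
    Returns a spanning list whose first len(Vs) entries are Vs.
    """
    n = len(U[0])
    span = list(U)
    for k, v in enumerate(Vs):
        A = np.column_stack(span)
        coeffs = np.linalg.lstsq(A, v, rcond=None)[0]  # expand v over span
        # Some position >= k must carry a nonzero coefficient; otherwise v
        # would lie in span{v_1, ..., v_{k-1}}, contradicting independence.
        j = next(i for i in range(k, len(span)) if abs(coeffs[i]) > 1e-10)
        span[j] = v                              # the exchange
        span[k], span[j] = span[j], span[k]      # keep the v's at the front
        assert np.linalg.matrix_rank(np.column_stack(span)) == n  # still spans
    return span

U = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
Vs = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]
print(steinitz_exchange(U, Vs))   # [v1, v2, e3], matching the example above
```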

5. Maximal Independent Sets and Minimal Spanning Sets

A basis can be characterized in two equivalent ways: as a maximal linearly independent set, or as a minimal spanning set. These characterizations are fundamental to understanding the structure of vector spaces.

Definition 5.1: Maximal Independent Set

A linearly independent set $S$ in a vector space $V$ is maximal if adding any vector from $V$ to $S$ makes it linearly dependent.

Definition 5.2: Minimal Spanning Set

A spanning set $S$ for a vector space $V$ is minimal if removing any vector from $S$ makes it no longer span $V$.

Theorem 5.1: Basis as Maximal Independent Set

A set $\mathcal{B}$ is a basis for $V$ if and only if it is a maximal linearly independent set.

Proof:

(⇒) If $\mathcal{B}$ is a basis, it spans $V$. Adding any $v \in V$ gives $v \in \text{span}(\mathcal{B})$, so $\mathcal{B} \cup \{v\}$ is dependent. Thus $\mathcal{B}$ is maximal.

(⇐) If $\mathcal{B}$ is maximal independent, then for any $v \in V$, $\mathcal{B} \cup \{v\}$ is dependent. Since $\mathcal{B}$ itself is independent, any dependence relation must give $v$ a nonzero coefficient, so $v \in \text{span}(\mathcal{B})$. Hence $\mathcal{B}$ spans $V$ and is a basis.

Theorem 5.2: Basis as Minimal Spanning Set

A set $\mathcal{B}$ is a basis for $V$ if and only if it is a minimal spanning set.

Proof:

(⇒) If $\mathcal{B}$ is a basis, it is independent. Removing any $v \in \mathcal{B}$ gives a set that does not span, since by independence $v$ is not in the span of the others. Thus $\mathcal{B}$ is minimal.

(⇐) If $\mathcal{B}$ is a minimal spanning set, then removing any vector makes it fail to span. This means no vector is redundant, i.e., no vector is in the span of the others. By Corollary 3.1, $\mathcal{B}$ is independent, hence a basis.

Example 5.1: Maximal Independent Set

In $\mathbb{R}^3$, the set $\{(1,0,0), (0,1,0)\}$ is independent but not maximal (we can add $(0,0,1)$).

The set $\{(1,0,0), (0,1,0), (0,0,1)\}$ is maximal independent (and is a basis).

Example 5.2: Minimal Spanning Set

In $\mathbb{R}^3$, the set $\{(1,0,0), (0,1,0), (0,0,1), (1,1,1)\}$ spans but is not minimal (we can remove $(1,1,1)$).

The set $\{(1,0,0), (0,1,0), (0,0,1)\}$ is a minimal spanning set (and is a basis).

Theorem 5.3: Extending Independent Sets to Bases

In a finite-dimensional vector space, any linearly independent set can be extended to a basis. That is, if $\{v_1, \ldots, v_k\}$ is independent, there exist vectors $v_{k+1}, \ldots, v_n$ such that $\{v_1, \ldots, v_n\}$ is a basis.

Proof:

Start with a basis $\{u_1, \ldots, u_n\}$. By the Steinitz Exchange Lemma, we can exchange $k$ of the $u_i$ for $v_1, \ldots, v_k$ to obtain a spanning set $\{v_1, \ldots, v_k, u_{k+1}, \ldots, u_n\}$ (after reordering). This spanning set has $n = \dim V$ elements, so reducing it to a basis (Theorem 5.4) cannot drop any vector, since every basis has exactly $n$ elements. Hence it is already a basis.
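In coordinates, this extension becomes a greedy loop: offer candidate vectors (here the standard basis, one convenient choice) and keep each one only if it raises the rank. A sketch (assuming NumPy; the function name is ours):

```python
import numpy as np

def extend_to_basis(independent, dim):
    """Extend a list of independent vectors in R^dim to a basis of R^dim."""
    basis = list(independent)
    for e in np.eye(dim):                  # candidates: e_1, ..., e_dim
        if len(basis) == dim:
            break                          # already a basis
        trial = np.column_stack(basis + [e])
        if np.linalg.matrix_rank(trial) == len(basis) + 1:
            basis.append(e)                # e adds a genuinely new direction
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]
print(extend_to_basis(vs, 3))   # vs plus one standard basis vector (here e1)
```

This always terminates with a basis: the standard basis spans, so while fewer than `dim` vectors have been kept, some candidate must raise the rank.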

Theorem 5.4: Reducing Spanning Sets to Bases

In a finite-dimensional vector space, any spanning set can be reduced to a basis. That is, if $\{v_1, \ldots, v_m\}$ spans $V$, there exists a subset that is a basis.

Proof:

If the set is independent, it is already a basis. Otherwise, some vector is a linear combination of the others; remove it. The smaller set still spans, since only a redundant vector was removed. Repeat until the set is independent; the process terminates because the set is finite. The resulting subset spans and is independent, hence is a basis. (In coordinates this becomes a one-pass rank filter; see the sketch below.)
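The following sketch keeps a vector only when it enlarges the span of what has been kept so far (a forward variant of the removal argument above; assuming NumPy, names ours):

```python
import numpy as np

def reduce_to_basis(spanning):
    """Extract an independent sublist with the same span as the input list."""
    basis = []
    for v in spanning:
        trial = np.column_stack(basis + [v])
        if np.linalg.matrix_rank(trial) == len(basis) + 1:
            basis.append(v)                # v is not redundant; keep it
    return basis

S = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),           # redundant: the sum of the first two
     np.array([0.0, 0.0, 1.0])]
print(len(reduce_to_basis(S)))            # 3: the redundant vector is dropped
```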

Frequently Asked Questions

What's the quickest way to check if a set is a subspace?

Use the one-step subspace test: W is a subspace iff it is non-empty and for all u, v ∈ W and α ∈ F, we have αu + v ∈ W. This single condition combines closure under addition and scalar multiplication, and it yields the zero vector automatically (set α = −1 and v = u).

What's the difference between span and subspace?

Every span is a subspace, but not every subspace is initially described as a span. span(S) is the smallest subspace containing S: the set of all linear combinations of vectors in S.

What's the intuition for linear independence?

Vectors are independent if none of them is 'redundant'—none can be expressed as a combination of the others. Each adds a genuinely new direction. Dependent vectors have overlap in the directions they describe.

How do I test for linear independence computationally?

Form a matrix with the vectors as columns and row reduce. The vectors are independent iff every column has a pivot (no free variables), equivalently, iff the matrix has full column rank.

Can the zero vector be part of an independent set?

No. If 0 is in the set, then 1·0 = 0 is a non-trivial combination equaling zero, making the set dependent.

Subspaces & Linear Independence Practice
  1. Is the set $\{(x, y) \in \mathbb{R}^2 : x = 2y\}$ a subspace of $\mathbb{R}^2$? (Easy)
  2. Is $\{(x, y) \in \mathbb{R}^2 : x + y = 1\}$ a subspace of $\mathbb{R}^2$? (Easy)
  3. What is $\text{span}\{(1, 0), (0, 1)\}$ in $\mathbb{R}^2$? (Easy)
  4. Are the vectors $(1, 0)$ and $(0, 1)$ linearly independent in $\mathbb{R}^2$? (Easy)
  5. Are $(1, 2, 3)$, $(4, 5, 6)$, $(7, 8, 9)$ linearly independent in $\mathbb{R}^3$? (Medium)
  6. What is the maximum number of linearly independent vectors in $\mathbb{R}^4$? (Easy)
  7. If $\{v_1, v_2, v_3\}$ is linearly independent, is $\{v_1, v_2\}$ linearly independent? (Medium)
  8. If $U$ and $W$ are subspaces of $V$, is $U \cup W$ always a subspace? (Medium)
  9. If a set contains the zero vector, is it linearly independent? (Easy)
  10. What is $\text{span}\{0\}$? (Easy)