Subspaces are vector spaces contained within larger vector spaces. Linear independence captures when vectors are genuinely different—none is redundant.
A subset $W$ of a vector space $V$ over a field $F$ is a subspace if $W$ is itself a vector space over $F$ under the operations inherited from $V$.
A non-empty subset $W$ of a vector space $V$ is a subspace if and only if:
1. $u + v \in W$ for all $u, v \in W$ (closure under addition);
2. $\alpha u \in W$ for all $\alpha \in F$ and $u \in W$ (closure under scalar multiplication).
(⇒) If $W$ is a subspace, it satisfies all vector space axioms, including closure.
(⇐) If closure holds, we verify the remaining axioms: associativity, commutativity, and the distributive laws hold in $W$ because they hold in $V$. Since $W$ is non-empty, pick any $u \in W$; then $0 \cdot u = 0 \in W$ and $(-1) \cdot u = -u \in W$, so $W$ contains the zero vector and additive inverses. Hence $W$ is a vector space.
A non-empty subset $W$ of a vector space $V$ is a subspace if and only if for all $u, v \in W$ and $\alpha \in F$, we have $\alpha u + v \in W$.
(⇒) If $W$ is a subspace, it's closed under addition and scalar multiplication, so $\alpha u + v \in W$.
(⇐) Suppose $\alpha u + v \in W$ for all $u, v \in W$ and $\alpha \in F$. Taking $\alpha = -1$ and $v = u$ gives $0 \in W$; then taking $v = 0$ gives $\alpha u \in W$ (closure under scalar multiplication), and taking $\alpha = 1$ gives $u + v \in W$ (closure under addition). By the previous theorem, $W$ is a subspace.
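To make the one-step test concrete, here is a minimal numerical sketch in Python (using NumPy; the candidate set $W$, a line through the origin in $\mathbb{R}^2$, and the helper name `in_W` are illustrative choices, not from the text). It samples random $u, v \in W$ and scalars $\alpha$ and checks that $\alpha u + v$ stays in $W$:

```python
import numpy as np

# Illustrative candidate subspace: W = {(x, 2x) : x in R}, a line through
# the origin in R^2.  (This example is an assumption, not from the text.)
def in_W(u, tol=1e-9):
    """Membership test for W: the second coordinate is twice the first."""
    return abs(u[1] - 2 * u[0]) < tol

rng = np.random.default_rng(0)
for _ in range(1000):
    x1, x2 = rng.standard_normal(2)
    u = np.array([x1, 2 * x1])        # arbitrary element of W
    v = np.array([x2, 2 * x2])        # arbitrary element of W
    alpha = rng.standard_normal()
    # the one-step test: alpha*u + v must land back in W
    assert in_W(alpha * u + v)

print("sampled one-step closure checks passed")
```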
The linear span (or just span) of a set $S = \{v_1, \dots, v_n\}$ is the set of all linear combinations:
$$\operatorname{span}(S) = \{\alpha_1 v_1 + \cdots + \alpha_n v_n : \alpha_1, \dots, \alpha_n \in F\}.$$
For any set $S$ in a vector space $V$, $\operatorname{span}(S)$ is a subspace of $V$.
For any set $S$ in a vector space $V$, $\operatorname{span}(S)$ is the smallest subspace of $V$ containing $S$. That is, if $W$ is any subspace containing $S$, then $\operatorname{span}(S) \subseteq W$.
Since $W$ is a subspace containing $S$, it must contain all linear combinations of vectors in $S$ (by closure). Therefore, $\operatorname{span}(S) \subseteq W$.
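Because $\operatorname{span}(S)$ is exactly the set of linear combinations, membership can be tested numerically: $v \in \operatorname{span}(S)$ iff appending $v$ as an extra column does not increase the matrix rank. A short sketch (NumPy; the function name `in_span` and the example vectors are illustrative):

```python
import numpy as np

def in_span(S, v, tol=1e-10):
    """True iff v is in the column span of S: appending v as an extra
    column must not increase the rank."""
    return (np.linalg.matrix_rank(np.column_stack([S, v]), tol)
            == np.linalg.matrix_rank(S, tol))

# Illustrative example: the columns of S span the xy-plane inside R^3.
S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
print(in_span(S, np.array([2.0, -3.0, 0.0])))   # True: a combination
print(in_span(S, np.array([0.0, 0.0, 1.0])))    # False: leaves the plane
```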
A set of vectors $\{v_1, \dots, v_n\}$ is linearly dependent if there exist scalars $\alpha_1, \dots, \alpha_n$, not all zero, such that:
$$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0.$$
A set is linearly independent if it is not linearly dependent, i.e., the only solution to the above equation is $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$.
The vectors $(1, 0)$ and $(0, 1)$ in $\mathbb{R}^2$ are linearly independent:
If $\alpha(1, 0) + \beta(0, 1) = (0, 0)$, then $(\alpha, \beta) = (0, 0)$, so $\alpha = \beta = 0$.
The vectors $v_1 = (1, 0)$, $v_2 = (0, 1)$, $v_3 = (1, 1)$ are linearly dependent:
Note that $v_3 = v_1 + v_2$, so:
$$v_1 + v_2 - v_3 = 0,$$
a non-trivial combination equaling zero.
Let $S = \{v_1, \dots, v_n\}$ be a set of vectors.
If $0 \in S$, say $v_k = 0$, then:
$$0 \cdot v_1 + \cdots + 1 \cdot v_k + \cdots + 0 \cdot v_n = 0.$$
This is a non-trivial combination (the coefficient of $v_k$ is $1 \neq 0$), so the set is dependent.
Vectors $v_1, \dots, v_n$ in $F^m$ are linearly independent if and only if the matrix $A$ (with these vectors as columns) has full column rank, i.e., $\operatorname{rank}(A) = n$.
The vectors are independent iff the homogeneous system $Ax = 0$ has only the trivial solution, which occurs iff every column has a pivot (no free variables), i.e., $\operatorname{rank}(A) = n$.
Test if $v_1 = (1, 2, 3)$, $v_2 = (4, 5, 6)$, $v_3 = (7, 8, 9)$ are independent:
Form the matrix and row reduce:
$$A = \begin{pmatrix} 1 & 4 & 7 \\ 2 & 5 & 8 \\ 3 & 6 & 9 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{pmatrix}$$
Only 2 pivots, so $\operatorname{rank}(A) = 2 < 3$. The vectors are dependent.
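The same conclusion can be checked numerically; a quick sketch using NumPy's `matrix_rank` on the matrix above:

```python
import numpy as np

# Columns of A are the vectors v1, v2, v3 from the example above.
A = np.array([[1.0, 4.0, 7.0],
              [2.0, 5.0, 8.0],
              [3.0, 6.0, 9.0]])

r, n = np.linalg.matrix_rank(A), A.shape[1]
print(r, n)                                       # 2 3
print("independent" if r == n else "dependent")   # dependent
```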
A set is linearly dependent if and only if at least one vector can be written as a linear combination of the others. This provides an alternative characterization of dependence.
In any vector space, if $L$ is linearly independent and $S$ spans the space, then $|L| \leq |S|$.
This theorem shows that linearly independent sets cannot exceed the size of spanning sets. A basis is a set that is both linearly independent and spanning—it achieves the maximum size for an independent set.
If $\{v_1, \dots, v_n\}$ is linearly independent, then every vector in $\operatorname{span}(v_1, \dots, v_n)$ can be written as a linear combination of the $v_i$ in exactly one way.
Suppose $\alpha_1 v_1 + \cdots + \alpha_n v_n = \beta_1 v_1 + \cdots + \beta_n v_n$. Then:
$$(\alpha_1 - \beta_1) v_1 + \cdots + (\alpha_n - \beta_n) v_n = 0.$$
By independence, $\alpha_i - \beta_i = 0$ for all $i$, so $\alpha_i = \beta_i$.
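In coordinates, this uniqueness is what makes solving for the coefficients well posed: if independent basis vectors form the columns of an invertible matrix $B$, the coordinate vector is the unique solution of $Bc = x$. A small NumPy sketch (the basis and target vector are illustrative choices):

```python
import numpy as np

# Illustrative basis of R^2 as columns of B (invertible since the
# columns are independent), and a target vector x.
B = np.array([[1.0, 1.0],
              [1.0, -1.0]])
x = np.array([3.0, 1.0])

c = np.linalg.solve(B, x)      # the unique coordinate vector: B @ c == x
print(c)                       # [2. 1.]  ->  x = 2*(1,1) + 1*(1,-1)
print(np.allclose(B @ c, x))   # True
```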
A set $S$ is linearly dependent if and only if at least one vector is in the span of the others, i.e., there exists $v \in S$ such that $v \in \operatorname{span}(S \setminus \{v\})$.
The Steinitz Exchange Lemma is a fundamental result that allows us to replace vectors in a spanning set with vectors from an independent set, maintaining the spanning property. This lemma is crucial for proving that all bases of a finite-dimensional vector space have the same size.
Let $V$ be a vector space. If $\{w_1, \dots, w_m\}$ spans $V$ and $\{v_1, \dots, v_n\}$ is linearly independent, then:
1. $n \leq m$;
2. after reordering the $w_i$, the set $\{v_1, \dots, v_n, w_{n+1}, \dots, w_m\}$ spans $V$.
We proceed by induction on $n$.
Base case ($n = 0$): Trivial (the empty set is independent, and $\{w_1, \dots, w_m\}$ spans $V$).
Inductive step: Assume the result holds for $n - 1$. Applying the inductive hypothesis to the independent set $\{v_1, \dots, v_{n-1}\}$ and the spanning set $\{w_1, \dots, w_m\}$, we may reorder the $w_i$ so that $\{v_1, \dots, v_{n-1}, w_n, \dots, w_m\}$ spans $V$.
Since this set spans $V$, we can write:
$$v_n = \alpha_1 v_1 + \cdots + \alpha_{n-1} v_{n-1} + \beta_n w_n + \cdots + \beta_m w_m.$$
Since $\{v_1, \dots, v_n\}$ is independent, $v_n$ is not in the span of $\{v_1, \dots, v_{n-1}\}$, so at least one $\beta_j \neq 0$. Without loss of generality, assume $\beta_n \neq 0$.
Then $w_n = \beta_n^{-1}\bigl(v_n - \alpha_1 v_1 - \cdots - \alpha_{n-1} v_{n-1} - \beta_{n+1} w_{n+1} - \cdots - \beta_m w_m\bigr)$, so $w_n$ is in the span of $\{v_1, \dots, v_n, w_{n+1}, \dots, w_m\}$.
Since $w_n$ is in this span, we can replace $w_n$ with $v_n$ in the spanning set, giving that $\{v_1, \dots, v_n, w_{n+1}, \dots, w_m\}$ spans $V$.
This proves (2). For (1), note that if $n > m$, the exchange process would exhaust all of the $w_i$ after $m$ steps, leaving $v_{m+1}$ in the span of $\{v_1, \dots, v_m\}$, contradicting independence.
In a finite-dimensional vector space, all bases have the same number of elements. This common number is called the dimension of the vector space.
Let $B_1$ and $B_2$ be two bases. Since $B_1$ spans and $B_2$ is independent, by Steinitz, $|B_2| \leq |B_1|$. Reversing roles, $|B_1| \leq |B_2|$. Therefore, $|B_1| = |B_2|$.
Let $V$ be a vector space, where $\{w_1, w_2\}$ spans $V$ and $\{v_1, v_2\}$ is independent.
We can write $v_1 = \alpha_1 w_1 + \alpha_2 w_2$ with some $\alpha_i \neq 0$; say $\alpha_1 \neq 0$. Exchange $v_1$ with $w_1$:
$\{v_1, w_2\}$ spans $V$.
Now $v_2 = \beta_1 v_1 + \beta_2 w_2$, and $\beta_2 \neq 0$ since $\{v_1, v_2\}$ is independent. Exchange $v_2$ with $w_2$:
$\{v_1, v_2\}$ spans $V$.
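The exchange procedure can be carried out mechanically. Below is a sketch in Python (NumPy; the function name `steinitz_exchange` and the concrete vectors are my own illustrative choices): at each step it expresses the next $v_i$ in the current spanning list, finds a slot still holding an original $w_j$ with a nonzero coefficient, and swaps it out.

```python
import numpy as np

def steinitz_exchange(W, V, tol=1e-10):
    """Exchange vectors of the spanning list W for the independent list V,
    keeping the list spanning after every swap."""
    current = [np.asarray(w, dtype=float) for w in W]
    original = [True] * len(current)       # slots that still hold a w_i
    for v in V:
        v = np.asarray(v, dtype=float)
        A = np.column_stack(current)
        c, *_ = np.linalg.lstsq(A, v, rcond=None)   # v = A @ c
        # independence of V guarantees a nonzero coefficient on some
        # slot that still holds an original w_i
        j = next(j for j in range(len(current))
                 if original[j] and abs(c[j]) > tol)
        current[j], original[j] = v, False          # the exchange step
    return current

# The two exchanges from the example above, with concrete (illustrative)
# vectors: W spans R^2, V is independent.
W = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
V = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
print(steinitz_exchange(W, V))   # [array([1., 1.]), array([ 1., -1.])]
```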
A basis can be characterized in two equivalent ways: as a maximal linearly independent set, or as a minimal spanning set. These characterizations are fundamental to understanding the structure of vector spaces.
A linearly independent set $L$ in a vector space $V$ is maximal if adding any vector from $V$ to $L$ makes it linearly dependent.
A spanning set $S$ for a vector space $V$ is minimal if removing any vector from $S$ makes it no longer span $V$.
A set $B$ is a basis for $V$ if and only if it is a maximal linearly independent set.
(⇒) If $B$ is a basis, it spans $V$. Adding any $v \in V$ gives $v \in \operatorname{span}(B)$, so $B \cup \{v\}$ is dependent. Thus $B$ is maximal.
(⇐) If $B$ is maximal independent, then for any $v \in V \setminus B$, $B \cup \{v\}$ is dependent; since $B$ itself is independent, the non-trivial relation must involve $v$, so $v \in \operatorname{span}(B)$. Hence $B$ spans $V$. Therefore, $B$ is a basis.
A set $B$ is a basis for $V$ if and only if it is a minimal spanning set.
(⇒) If $B$ is a basis, it's independent. Removing any $v \in B$ gives a set that doesn't span (since $v$ is not in the span of the others, by independence). Thus $B$ is minimal.
(⇐) If $B$ is minimal spanning, then removing any vector makes it not span. This means no vector is redundant, i.e., no vector is in the span of the others. Therefore, $B$ is independent, hence a basis.
In $\mathbb{R}^3$, the set $\{(1,0,0), (0,1,0)\}$ is independent but not maximal (we can add $(0,0,1)$).
The set $\{(1,0,0), (0,1,0), (0,0,1)\}$ is maximal independent (and is a basis).
In $\mathbb{R}^2$, the set $\{(1,0), (0,1), (1,1)\}$ spans but is not minimal (we can remove $(1,1)$).
The set $\{(1,0), (0,1)\}$ is minimal spanning (and is a basis).
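These checks are easy to automate: independence is full column rank, and spanning $\mathbb{R}^d$ is rank $d$. A short NumPy sketch reproducing the examples above (the helper names are illustrative):

```python
import numpy as np

def is_independent(vecs, tol=1e-10):
    """Full column rank <=> linear independence."""
    return np.linalg.matrix_rank(np.column_stack(vecs), tol) == len(vecs)

def spans(vecs, dim, tol=1e-10):
    """Rank equal to dim <=> the vectors span R^dim."""
    return np.linalg.matrix_rank(np.column_stack(vecs), tol) == dim

e1, e2, e3 = np.eye(3)
print(is_independent([e1, e2]))       # True: independent, but not maximal...
print(is_independent([e1, e2, e3]))   # True: ...adding e3 keeps independence

f1, f2, f3 = np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])
print(spans([f1, f2, f3], 2))         # True: spans R^2, but not minimal...
print(spans([f1, f2], 2))             # True: ...removing (1,1) still spans
```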
In a finite-dimensional vector space, any linearly independent set can be extended to a basis. That is, if $\{v_1, \dots, v_k\}$ is independent, there exist vectors $v_{k+1}, \dots, v_n$ such that $\{v_1, \dots, v_n\}$ is a basis.
Start with a basis $\{w_1, \dots, w_n\}$. By the Steinitz Exchange Lemma, we can replace $k$ of the $w_i$ with $v_1, \dots, v_k$ to get a spanning set $\{v_1, \dots, v_k, w_{k+1}, \dots, w_n\}$. This set has $n = \dim V$ elements and spans, so it is a basis (were it dependent, it could be reduced to a smaller spanning set, contradicting Steinitz); the remaining vectors $w_{k+1}, \dots, w_n$ complete the basis.
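One way to realize this computationally (a sketch under the assumption $V = \mathbb{R}^n$; the function name `extend_to_basis` is mine): greedily append standard basis vectors whenever they increase the rank, stopping at $n$ vectors.

```python
import numpy as np

def extend_to_basis(vectors, dim, tol=1e-10):
    """Extend an independent list to a basis of R^dim by greedily
    appending standard basis vectors that increase the rank."""
    basis = [np.asarray(v, dtype=float) for v in vectors]
    for e in np.eye(dim):
        if len(basis) == dim:
            break
        if np.linalg.matrix_rank(np.column_stack(basis + [e]), tol) > len(basis):
            basis.append(e)
    return basis

# Extend the independent set {(1, 1, 0)} to a basis of R^3:
for b in extend_to_basis([[1.0, 1.0, 0.0]], 3):
    print(b)   # (1,1,0), then e1 = (1,0,0), then e3 = (0,0,1)
```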
In a finite-dimensional vector space, any spanning set can be reduced to a basis. That is, if $S$ spans $V$, there exists a subset $B \subseteq S$ that is a basis.
If the set is independent, it's already a basis. Otherwise, some vector is a linear combination of the others. Remove it. Repeat until the set is independent. The resulting set still spans (since we only removed redundant vectors) and is independent, hence a basis.
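The same removal idea, implemented greedily in NumPy (assuming $V = \mathbb{R}^d$; the function name is mine): a vector is kept only if it enlarges the span of those already kept, which is equivalent to discarding redundant ones.

```python
import numpy as np

def reduce_to_basis(vectors, tol=1e-10):
    """Keep each vector only if it enlarges the span of those kept so far;
    redundant vectors (combinations of earlier ones) are discarded."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        if np.linalg.matrix_rank(np.column_stack(basis + [v]), tol) > len(basis):
            basis.append(v)
    return basis

# A spanning set for R^2 with one redundant vector:
S = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(reduce_to_basis(S))   # [(1,0), (0,1)]; (1,1) = (1,0) + (0,1) is dropped
```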
Use the one-step subspace test: W is a subspace iff for all u, v ∈ W and α ∈ F, we have αu + v ∈ W. This combines closure under addition and scalar multiplication, and automatically puts 0 in W (take α = −1 and v = u).
Every span is a subspace, but not every subspace is described as a span initially. Span{S} is the smallest subspace containing S—it's the set of all linear combinations of vectors in S.
Vectors are independent if none of them is 'redundant'—none can be expressed as a combination of the others. Each adds a genuinely new direction. Dependent vectors have overlap in the directions they describe.
Form a matrix with the vectors as columns and row reduce. The vectors are independent iff every column has a pivot (no free variables), equivalently, iff the matrix has full column rank.
No. If 0 is in the set, then 1·0 = 0 is a non-trivial combination equaling zero, making the set dependent.