A basis is a minimal spanning set and a maximal independent set. The number of elements in any basis is the dimension—the most important invariant of a vector space.
The concept of dimension evolved gradually through the 19th century. While geometric intuition suggested that lines are 1-dimensional, planes are 2-dimensional, and space is 3-dimensional, the abstract notion of dimension for arbitrary vector spaces was formalized by Hermann Grassmann (1844) and Giuseppe Peano (1888). The proof that dimension is well-defined—that all bases have the same size—relies on the Steinitz Exchange Lemma (1913), which Ernst Steinitz proved in his foundational work on field theory. This result is sometimes called the "Replacement Theorem" and is one of the most important theorems in linear algebra.
A basis is the "perfect" set of vectors for a vector space: it has exactly the right number of vectors to span the entire space without any redundancy. Think of it as a minimal set of building blocks from which every vector in the space can be uniquely constructed.
A basis of a vector space $V$ over a field $F$ is a set $B \subseteq V$ that satisfies:
1. Linear independence: no nontrivial linear combination of vectors in $B$ equals the zero vector.
2. Spanning: every vector in $V$ is a linear combination of vectors in $B$.
A basis gives you existence and uniqueness: every vector can be built from the basis (spanning), and in exactly one way (independence).
For a finite set $B = \{v_1, \dots, v_n\}$ in a vector space $V$, the following are equivalent:
1. $B$ is a basis (linearly independent and spanning).
2. Every $v \in V$ has a unique representation $v = a_1 v_1 + \cdots + a_n v_n$.
3. $B$ is a maximal linearly independent set.
4. $B$ is a minimal spanning set.
(1) ⇒ (2): Since $B$ spans, every $v \in V$ can be written as $v = \sum_i a_i v_i$. If also $v = \sum_i b_i v_i$, then $\sum_i (a_i - b_i) v_i = 0$. By independence, $a_i = b_i$ for all $i$.
(2) ⇒ (3): Unique representation implies independence (the zero vector has only the trivial representation). Adding any $v \notin B$ creates dependence, since $v$ already has a representation in terms of $B$.
(3) ⇒ (4): If $B$ is maximal independent but doesn't span, there exists $v \notin \operatorname{span}(B)$. Then $B \cup \{v\}$ is independent, contradicting maximality. So $B$ spans; and since $B$ is independent, no proper subset of $B$ spans, making $B$ a minimal spanning set.
(4) ⇒ (1): Minimal spanning implies independence: if $B$ were dependent, some redundant vector could be removed while preserving the span, contradicting minimality.
The standard basis of $F^n$ is:
$$\{e_1, e_2, \dots, e_n\}$$
where $e_i$ has a 1 in position $i$ and 0 elsewhere.
For example, in $\mathbb{R}^3$:
$$e_1 = (1, 0, 0), \quad e_2 = (0, 1, 0), \quad e_3 = (0, 0, 1)$$
Verification: any $(a, b, c)$ equals $a e_1 + b e_2 + c e_3$ (spanning), and this combination is zero only when $a = b = c = 0$ (independence).
For $P_n(F)$ = polynomials of degree ≤ $n$, the standard basis is:
$$\{1, x, x^2, \dots, x^n\}$$
This has $n + 1$ elements.
Example: in $P_2(\mathbb{R})$, the standard basis is $\{1, x, x^2\}$, so $\dim P_2(\mathbb{R}) = 3$.
For $M_{m \times n}(F)$, the standard basis consists of the matrices $E_{ij}$ with a 1 in position $(i, j)$ and 0 elsewhere.
For $M_{2 \times 2}$:
$$E_{11} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E_{12} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad E_{21} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad E_{22} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$$
Any matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ can be written as $a E_{11} + b E_{12} + c E_{21} + d E_{22}$.
In $\mathbb{R}^2$, besides the standard basis $\{e_1, e_2\}$, other bases include, for example, $\{(1, 1), (1, -1)\}$ and $\{(2, 1), (1, 1)\}$.
Key insight: any two linearly independent vectors in $\mathbb{R}^2$ form a basis.
Every linearly independent set in a finite-dimensional vector space $V$ can be extended to a basis of $V$.
Let $\{v_1, \dots, v_k\}$ be linearly independent in $V$. If $\operatorname{span}(v_1, \dots, v_k) = V$, then it is already a basis.
Otherwise, there exists $v_{k+1} \notin \operatorname{span}(v_1, \dots, v_k)$. Then $\{v_1, \dots, v_k, v_{k+1}\}$ is still independent (by the extension criterion).
Repeat this process. Since $V$ is finite-dimensional, the size of any independent set is bounded (by the size of any finite spanning set). So the process terminates with a basis.
Every spanning set of a finite-dimensional vector space contains a basis.
Let $S = \{v_1, \dots, v_m\}$ span $V$. If $S$ is independent, it's a basis.
Otherwise, some $v_j$ is a linear combination of the others: $v_j = \sum_{i \neq j} c_i v_i$.
Then $S \setminus \{v_j\}$ still spans $V$. Repeat until the remaining set is independent.
Every finite-dimensional vector space has a basis.
The zero space has the empty set as its (unique) basis. This is consistent: the empty set is vacuously independent, and its span is $\{0\}$ (by convention, the empty sum equals the zero vector).
For 2×2 upper triangular matrices over $F$:
Standard basis:
$$\left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \right\}$$
The dimension is 3.
For 2×2 matrices with trace 0:
General element:
$$\begin{pmatrix} a & b \\ c & -a \end{pmatrix} = a \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} + b \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + c \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}$$
so the dimension is again 3. (This is the Lie algebra $\mathfrak{sl}_2$.)
If $v_1, \dots, v_m$ are vectors in $F^n$, form the matrix with these as rows and row reduce. The nonzero rows of the echelon form give a basis for $\operatorname{span}(v_1, \dots, v_m)$. (To pick a basis from among the original vectors instead, place them as columns and keep the vectors in the pivot columns.)
Problem: Find a basis for $\operatorname{span}(v_1, v_2, v_3) \subseteq \mathbb{R}^3$, where $v_1 = (1, 2, 3)$, $v_2 = (2, 4, 7)$, $v_3 = (3, 6, 10)$.
Solution: Row reduce the matrix with $v_1, v_2, v_3$ as rows: $R_2 - 2R_1 = (0, 0, 1)$ and $R_3 - 3R_1 = (0, 0, 1)$, and subtracting one from the other leaves a zero row.
Pivots in rows 1 and 2. Basis: $\{(1, 2, 3), (0, 0, 1)\}$. Dimension = 2.
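The recipe runs mechanically in SymPy; below is a minimal sketch reproducing the worked example above with `rref`.

```python
# Basis of a span via row reduction: put the vectors in as rows, row reduce,
# and keep the nonzero rows of the reduced matrix.
from sympy import Matrix

v1, v2, v3 = [1, 2, 3], [2, 4, 7], [3, 6, 10]
A = Matrix([v1, v2, v3])                # vectors as rows

R, pivot_cols = A.rref()                # reduced row echelon form
basis = [R.row(i) for i in range(len(pivot_cols))]   # the nonzero rows

print(basis)        # [Matrix([[1, 2, 0]]), Matrix([[0, 0, 1]])]
print(len(basis))   # dimension of the span = 2
```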
The dimension of a vector space is the number of vectors in any basis. The remarkable fact is that this number is the same for all bases—a consequence of the Steinitz Exchange Lemma. Dimension is the most important invariant of a vector space.
Any two bases of a finite-dimensional vector space have the same number of elements.
Let $B_1$ and $B_2$ be two bases with $m$ and $n$ elements respectively.
$B_1$ is independent and $B_2$ spans, so by the Fundamental Inequality: $m \leq n$.
$B_2$ is independent and $B_1$ spans, so: $n \leq m$.
Therefore $m = n$.
The dimension of a finite-dimensional vector space $V$, denoted $\dim V$ or $\dim_F V$, is the number of elements in any basis of $V$.
By convention: $\dim \{0\} = 0$.
| Vector Space | Standard Basis | Dimension |
|---|---|---|
| $F^n$ | $e_1, \dots, e_n$ | $n$ |
| $P_n(F)$ | $1, x, \dots, x^n$ | $n + 1$ |
| $M_{m \times n}(F)$ | $E_{ij}$ | $mn$ |
| $\mathbb{C}$ over $\mathbb{R}$ | $1, i$ | 2 |
| $\{0\}$ | $\emptyset$ | 0 |
Let $\dim V = n$. Then:
1. Any $n$ linearly independent vectors in $V$ form a basis.
2. Any $n$ vectors that span $V$ form a basis.
3. Any linearly independent set has at most $n$ vectors.
4. Any spanning set has at least $n$ vectors.
(1): If the $n$ independent vectors didn't span, they would extend to a basis with more than $n$ elements. Contradiction.
(2): If the $n$ spanning vectors were dependent, they would reduce to a basis with fewer than $n$ elements. Contradiction.
(3): By the Fundamental Inequality: |independent| ≤ |spanning| = $n$ (compare against a basis).
(4): By the Fundamental Inequality: |spanning| ≥ |independent| = $n$ (compare against a basis).
If $W$ is a subspace of a finite-dimensional space $V$, then:
1. $W$ is finite-dimensional.
2. $\dim W \leq \dim V$.
3. $\dim W = \dim V$ if and only if $W = V$.
(1), (2): Any independent set in $W$ is independent in $V$, so it has size ≤ $\dim V$. Thus $W$ has a (finite) basis of size at most $\dim V$.
(3): If $\dim W = \dim V = n$, a basis of $W$ consists of $n$ independent vectors in $V$, hence is a basis of $V$. So $W = \operatorname{span}(\text{basis}) = V$.
For subspaces $U$ and $W$ of a finite-dimensional space $V$:
$$\dim(U + W) = \dim U + \dim W - \dim(U \cap W)$$
Let $\{v_1, \dots, v_k\}$ be a basis of $U \cap W$.
Extend to bases: $\{v_1, \dots, v_k, u_1, \dots, u_p\}$ of $U$ and $\{v_1, \dots, v_k, w_1, \dots, w_q\}$ of $W$.
One can show $\{v_1, \dots, v_k, u_1, \dots, u_p, w_1, \dots, w_q\}$ is a basis for $U + W$.
Count: $\dim(U + W) = k + p + q = (k + p) + (k + q) - k = \dim U + \dim W - \dim(U \cap W)$.
In $\mathbb{R}^3$, let $U = \{(x, y, 0)\}$ (xy-plane) and $W = \{(0, y, z)\}$ (yz-plane).
Then $U + W = \mathbb{R}^3$ and $U \cap W = \{(0, y, 0)\}$, the y-axis.
Indeed, $\dim(U + W) = 2 + 2 - 1 = 3$.
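The formula can also be checked numerically with ranks: stacking bases of $U$ and $W$ as rows gives a spanning set of $U + W$, so the rank of the stacked matrix is $\dim(U + W)$. A minimal sketch for the two coordinate planes above:

```python
# Checking dim(U ∩ W) = dim U + dim W - dim(U + W) for the xy- and yz-planes.
import numpy as np

U = np.array([[1, 0, 0], [0, 1, 0]])   # rows: a basis of the xy-plane
W = np.array([[0, 1, 0], [0, 0, 1]])   # rows: a basis of the yz-plane

dim_U = np.linalg.matrix_rank(U)                     # 2
dim_W = np.linalg.matrix_rank(W)                     # 2
dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))   # dim(U + W) = 3

print(dim_U + dim_W - dim_sum)   # dim(U ∩ W) = 1, the y-axis
```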
A vector space is:
- finite-dimensional if some finite set spans it (equivalently, it has a finite basis);
- infinite-dimensional otherwise.
Examples of infinite-dimensional spaces: $F[x]$ (all polynomials) and $C[0, 1]$ (continuous functions on $[0, 1]$).
For a linear map $T: V \to W$, the nullity is $\dim(\ker T)$.
If $T$ is represented by a matrix $A$ with $n$ columns, then:
$$\operatorname{rank}(A) + \dim(\ker A) = n$$
This is the rank-nullity theorem (covered in Part III).
For an $m \times n$ matrix $A$, the rank is the dimension of its column space (equivalently, of its row space).
Key relationship: row rank = column rank, so $\operatorname{rank}(A) = \operatorname{rank}(A^T) \leq \min(m, n)$.
Two finite-dimensional vector spaces over the same field are isomorphic if and only if they have the same dimension.
(⇒) If $T: V \to W$ is an isomorphism and $\{v_1, \dots, v_n\}$ is a basis of $V$, then $\{T(v_1), \dots, T(v_n)\}$ is a basis of $W$ (a bijection preserves size).
(⇐) If $\dim V = \dim W = n$, choose bases $\{v_1, \dots, v_n\}$ of $V$ and $\{w_1, \dots, w_n\}$ of $W$. Define $T(v_i) = w_i$ and extend linearly. This is an isomorphism.
Every $n$-dimensional vector space over $F$ is isomorphic to $F^n$. Thus, up to isomorphism, there is exactly one $n$-dimensional space over $F$ for each $n$.
For finite-dimensional spaces, dimension completely characterizes the isomorphism class. Two spaces are "the same" (structurally) iff they have the same dimension over the same field.
Once we fix a basis, every vector gets a unique set of coordinates—the coefficients in its representation as a linear combination of basis vectors. This gives us a powerful correspondence between abstract vectors and concrete column vectors in $F^n$.
An ordered basis is a basis together with a specific ordering of its elements:
$$B = (v_1, v_2, \dots, v_n)$$
The ordering matters for defining coordinates.
Let $B = (v_1, \dots, v_n)$ be an ordered basis of $V$. For any $v \in V$, write:
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$$
The coordinate vector of $v$ with respect to $B$ is:
$$[v]_B = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} \in F^n$$
Since $B$ is a basis, the representation is unique, so the coordinate vector is well-defined.
In $\mathbb{R}^n$ with the standard basis $E = (e_1, \dots, e_n)$:
$$[v]_E = v$$
For the standard basis, coordinates equal components.
In $\mathbb{R}^2$, let $B = (b_1, b_2)$ with $b_1 = (1, 1)$, $b_2 = (1, -1)$.
Find $[(3, 1)]_B$.
Solution: Solve $c_1 (1, 1) + c_2 (1, -1) = (3, 1)$: $c_1 + c_2 = 3$ and $c_1 - c_2 = 1$, so $c_1 = 2$, $c_2 = 1$.
So $[(3, 1)]_B = (2, 1)^T$.
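Finding coordinates is just solving a linear system: if a matrix (here called $M$, our notation) has the basis vectors as columns, then $[v]_B$ solves $M c = v$. A sketch with the basis from this example:

```python
# Coordinates [v]_B as the solution of M c = v, where the columns of M
# are the basis vectors b1 = (1, 1) and b2 = (1, -1).
import numpy as np

M = np.column_stack([(1, 1), (1, -1)]).astype(float)
v = np.array([3.0, 1.0])

coords = np.linalg.solve(M, v)
print(coords)   # [2. 1.]  ->  [v]_B = (2, 1)^T
```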
Let $B = (v_1, \dots, v_n)$ be an ordered basis of an $n$-dimensional space $V$. The map:
$$V \to F^n, \quad v \mapsto [v]_B$$
is a linear isomorphism.
Linear: If $[u]_B = (a_1, \dots, a_n)^T$ and $[w]_B = (b_1, \dots, b_n)^T$, then:
$$u + w = \sum_i (a_i + b_i) v_i, \quad \text{so} \quad [u + w]_B = [u]_B + [w]_B$$
Similarly for scalar multiplication.
Bijective: every $(c_1, \dots, c_n)^T \in F^n$ is $[v]_B$ for exactly one $v = \sum_i c_i v_i$, by uniqueness of representation.
Every $n$-dimensional vector space over $F$ is isomorphic to $F^n$.
The coordinate isomorphism lets us transfer computations from the abstract space $V$ to the concrete space $F^n$: questions about independence, spanning, and dimension become matrix computations.
In $P_2(\mathbb{R})$ with basis $B = (1, x, x^2)$:
$$[a_0 + a_1 x + a_2 x^2]_B = (a_0, a_1, a_2)^T$$
The coefficients of the polynomial become coordinates.
In $M_{2 \times 2}$ with standard basis $B = (E_{11}, E_{12}, E_{21}, E_{22})$:
$$\left[ \begin{pmatrix} a & b \\ c & d \end{pmatrix} \right]_B = (a, b, c, d)^T$$
Let $B = (v_1, \dots, v_n)$ and $C = (w_1, \dots, w_n)$ be two ordered bases of $V$. There exists an invertible matrix $P$ (the change of basis matrix) such that:
$$[v]_C = P [v]_B \quad \text{for all } v \in V$$
The columns of $P$ are $[v_1]_C, \dots, [v_n]_C$.
Let $B = ((1, 1), (1, -1))$ and let $E$ be the standard basis in $\mathbb{R}^2$.
Change of basis matrix from $B$ to $E$: its columns are the $B$-vectors written in standard coordinates:
$$P = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$$
Example: For $[v]_B = (2, 1)^T$: $[v]_E = P [v]_B = (3, 1)^T$, matching $v = 2(1, 1) + 1(1, -1) = (3, 1)$.
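In code, converting coordinates is a matrix-vector product one way and a linear solve the other; a minimal sketch with the same basis:

```python
# Change of basis in R^2: columns of P are the B-vectors in standard coordinates.
import numpy as np

P = np.column_stack([(1, 1), (1, -1)]).astype(float)

v_B = np.array([2.0, 1.0])       # coordinates with respect to B
v_E = P @ v_B                    # standard coordinates
print(v_E)                       # [3. 1.]

print(np.linalg.solve(P, v_E))   # back again: [v]_B = P^{-1} [v]_E = [2. 1.]
```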
For any ordered basis $B$ and vectors $u, v \in V$, scalar $c \in F$:
$$[u + v]_B = [u]_B + [v]_B, \qquad [c v]_B = c [v]_B$$
This is why coordinates give an isomorphism—they preserve all vector space operations.
In $P_2(\mathbb{R})$ with $B = (1, x, x^2)$:
Let $p = 1 + 2x$ and $q = 3 - x + x^2$.
Find coordinates: $[p]_B = (1, 2, 0)^T$ and $[q]_B = (3, -1, 1)^T$.
Add in coordinates: $[p]_B + [q]_B = (4, 1, 1)^T$.
Verify: $p + q = 4 + x + x^2$. ✓
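In coordinate form the whole computation is ordinary vector arithmetic, as this short sketch of the example shows:

```python
# Polynomial addition in P_2 becomes vector addition of coordinates
# with respect to the basis (1, x, x^2).
import numpy as np

p = np.array([1, 2, 0])    # [p]_B for p = 1 + 2x
q = np.array([3, -1, 1])   # [q]_B for q = 3 - x + x^2

print(p + q)               # [4 1 1]  ->  p + q = 4 + x + x^2
```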
If $T: V \to W$ is linear and we fix bases $B$ of $V$ and $C$ of $W$, then $T$ can be represented by a matrix $A$ such that:
$$[T(v)]_C = A [v]_B$$
This is covered in detail in Part III: Linear Mappings.
In $P_2(\mathbb{R})$ with the Lagrange interpolation basis $(L_0, L_1, L_2)$ at points $x_0, x_1, x_2$ (defined by $L_i(x_j) = \delta_{ij}$):
For any polynomial $p$ of degree ≤ 2:
$$[p]_B = (p(x_0), p(x_1), p(x_2))^T$$
The coordinates are simply the function values! This is a key idea in interpolation.
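A quick numerical illustration (the nodes and the polynomial below are illustrative choices): evaluating at the nodes gives the Lagrange coordinates, and interpolating those values recovers the polynomial.

```python
# In the Lagrange basis at nodes xs, the coordinates of p are its values p(xs).
import numpy as np

xs = np.array([0.0, 1.0, 2.0])      # interpolation nodes (illustrative)
p = lambda x: 3 - 2*x + x**2        # a degree-2 polynomial

coords = p(xs)                      # Lagrange coordinates = function values
print(coords)                       # [3. 2. 3.]

# Interpolating those values recovers the monomial coefficients of p:
print(np.polyfit(xs, coords, deg=2))   # [ 1. -2.  3.]  i.e. x^2 - 2x + 3
```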
Problem: Find a basis for $W = \{(x, y, z) \in \mathbb{R}^3 : x + y + z = 0\}$.
Solution: Solve $x = -y - z$, so:
$$(x, y, z) = y(-1, 1, 0) + z(-1, 0, 1)$$
Basis: $\{(-1, 1, 0), (-1, 0, 1)\}$.
Dimension: 2.
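SymPy computes the same basis directly via `nullspace`; a minimal check of this worked example:

```python
# Basis of {(x, y, z) : x + y + z = 0} as the nullspace of the 1x3 matrix [1 1 1].
from sympy import Matrix

A = Matrix([[1, 1, 1]])
basis = A.nullspace()
print(basis)        # [Matrix([[-1], [1], [0]]), Matrix([[-1], [0], [1]])]
print(len(basis))   # dimension = 2
```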
Problem: Extend $\{(1, 2, 0)\}$ to a basis of $\mathbb{R}^3$.
Solution: Need 2 more independent vectors.
Try $(1, 0, 0)$: Independent from $(1, 2, 0)$? Yes (not a scalar multiple).
Try $(0, 0, 1)$: Independent from both? Yes (third component non-zero).
Basis: $\{(1, 2, 0), (1, 0, 0), (0, 0, 1)\}$.
Problem: Find $\dim(\ker A)$ for:
$$A = \begin{pmatrix} 1 & 2 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 1 & 3 & 2 & 1 \end{pmatrix}$$
Solution: Row reduce: $R_3 \to R_3 - R_1 - R_2$ leaves a zero row.
Rank = 2 (two pivots). By rank-nullity: $\dim(\ker A) = 4 - 2 = 2$.
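A one-line check with SymPy confirms that rank and nullity add up to the number of columns:

```python
# Rank-nullity check for the matrix A from the problem above.
from sympy import Matrix

A = Matrix([[1, 2, 1, 0],
            [0, 1, 1, 1],
            [1, 3, 2, 1]])

print(A.rank())             # 2
print(len(A.nullspace()))   # 2   (rank + nullity = 4 = number of columns)
```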
Problem: If $\{v_1, v_2, v_3\}$ spans $\mathbb{R}^3$, prove it's a basis.
Solution: We know $\dim \mathbb{R}^3 = 3$.
Any 3 spanning vectors in a 3-dimensional space must be a basis (by the dimension criteria).
Alternatively: if the set were dependent, we could reduce it to a smaller spanning set, contradicting the fact that every spanning set of $\mathbb{R}^3$ has at least 3 vectors.
Problem: In $\mathbb{R}^5$, let $U$ and $W$ be 3-dimensional subspaces. What are the possible values of $\dim(U \cap W)$?
Solution: By the dimension formula:
$$\dim(U \cap W) = \dim U + \dim W - \dim(U + W) = 6 - \dim(U + W)$$
Since $U + W \subseteq \mathbb{R}^5$, we have $\dim(U + W) \leq 5$.
So $\dim(U \cap W) \geq 6 - 5$, giving $\dim(U \cap W) \geq 1$.
Also $\dim(U \cap W) \leq 3$ ($U \cap W$ is a subspace of $U$).
Answer: $\dim(U \cap W) \in \{1, 2, 3\}$.
Problem: Find a basis for the symmetric 2×2 matrices.
Solution: Symmetric matrices have the form $\begin{pmatrix} a & b \\ b & c \end{pmatrix}$.
Basis:
$$\left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \right\}$$
Dimension: 3.
Problem: Find $[p]_B$ where $p = 2 + 3x$ and $B = (1 + x, 1 - x)$ in $P_1(\mathbb{R})$.
Solution: Find $c_1, c_2$ such that:
$$c_1 (1 + x) + c_2 (1 - x) = 2 + 3x$$
Comparing coefficients: $c_1 + c_2 = 2$ and $c_1 - c_2 = 3$.
So $c_1 = 5/2$, $c_2 = -1/2$.
Answer: $[p]_B = (5/2, -1/2)^T$.
Problem: Find a basis for the row space of:
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 1 & 1 & 1 \end{pmatrix}$$
Solution: Row reduce: $R_2 \to R_2 - 2R_1$ gives a zero row; $R_3 \to R_1 - R_3 = (0, 1, 2)$.
Basis: $\{(1, 2, 3), (0, 1, 2)\}$. Dimension = 2.
Problem: Find $\dim W$ for $W = \{(x, y, z, w) \in \mathbb{R}^4 : x + y = 0, \ z - w = 0\}$.
Solution: From the equations: $x = -y$, $z = w$.
Parametrize with free variables $y, w$:
$$(x, y, z, w) = y(-1, 1, 0, 0) + w(0, 0, 1, 1)$$
Basis: $\{(-1, 1, 0, 0), (0, 0, 1, 1)\}$. $\dim W = 2$.
Problem: Let $U = \operatorname{span}((1, 0, 1), (0, 1, 1))$ and $W = \{(x, y, z) : x + y - z = 0\}$. Show $U = W$.
Solution:
Step 1: Check $U \subseteq W$: both $(1, 0, 1)$ and $(0, 1, 1)$ satisfy $x + y - z = 0$, hence so does every linear combination.
Step 2: Dimensions: $\dim U = 2$ (the two spanning vectors are independent) and $\dim W = 2$ (one linear equation in $\mathbb{R}^3$ leaves two free variables).
Since $U \subseteq W$ and $\dim U = \dim W$, we have $U = W$.
Problem: If $V = U \oplus W$ (direct sum), prove $\dim V = \dim U + \dim W$.
Solution: Direct sum means $V = U + W$ and $U \cap W = \{0\}$.
By the dimension formula:
$$\dim V = \dim U + \dim W - \dim(U \cap W) = \dim U + \dim W - 0 = \dim U + \dim W$$
Problem: Find the change of basis matrix from $B = (e_1, e_2)$ to $C = ((1, 1), (1, -1))$ in $\mathbb{R}^2$.
Solution: Express the vectors of $B$ in terms of $C$:
$$e_1 = \tfrac{1}{2}(1, 1) + \tfrac{1}{2}(1, -1), \qquad e_2 = \tfrac{1}{2}(1, 1) - \tfrac{1}{2}(1, -1)$$
so
$$P = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$$
To convert coordinates: $[v]_C = P [v]_B$.
A spanning set may be larger than a basis. Example: $\{(1, 0), (0, 1), (1, 1)\}$ spans $\mathbb{R}^2$ but is NOT a basis (dependent: 3 vectors in a 2-dimensional space).
$P_n(F)$ has dimension $n + 1$, not $n$! The degree ≤ $n$ polynomials include the constants, so the basis $\{1, x, \dots, x^n\}$ has $n + 1$ elements.
Coordinates are only unique when the representing set is a basis. For a dependent spanning set, the same vector can have multiple representations.
Coordinates depend on the ORDER of basis vectors. If $B = (v_1, v_2)$ and $B' = (v_2, v_1)$, then $[v]_{B'}$ has the entries of $[v]_B$ swapped. Swap the order and the coordinates change!
NO! $\dim(U + W)$ is NOT $\dim U + \dim W$ in general. Use the dimension formula: $\dim(U + W) = \dim U + \dim W - \dim(U \cap W)$.
A basis must be linearly independent. Any set containing the zero vector is dependent, so the zero vector can never be part of a basis.
Change of basis does NOT change dimension! The same space can be described with different bases, but the number of basis vectors is always the same (the dimension).
$\mathbb{C}$ over $\mathbb{C}$ is 1-dimensional, but $\mathbb{C}$ over $\mathbb{R}$ is 2-dimensional. Always specify the field!
To verify $\{v_1, \dots, v_k\}$ is a basis for $V$: check linear independence and check that the set spans $V$.
Shortcut for $n$ vectors in an $n$-dimensional space: checking either condition alone suffices. In $F^n$, form the matrix with the vectors as columns and check that its determinant is nonzero, as the sketch below shows.
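The shortcut is one `det` call in code; the three vectors below are an illustrative choice:

```python
# Determinant test: n vectors in R^n form a basis iff the matrix with
# them as columns has nonzero determinant.
import numpy as np

vectors = [(1, 1, 0), (0, 1, 1), (1, 0, 1)]
M = np.column_stack(vectors).astype(float)

print(np.linalg.det(M))   # 2.0, nonzero -> the vectors form a basis of R^3
```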
A basis is the "perfect size": big enough to span, small enough to be independent.
All bases have the same size—this is not obvious but follows from the Steinitz Exchange Lemma.
In an $n$-dimensional space: $n$ independent vectors = basis = $n$ spanning vectors.
With an ordered basis, every vector has unique coordinates—this is the key to computation.
Every $n$-dimensional space over $F$ is isomorphic to $F^n$—abstract spaces become concrete!
| Vector Space | Dimension | Standard Basis |
|---|---|---|
| ℝⁿ | n | e₁, ..., eₙ |
| ℂ over ℝ | 2 | 1, i |
| Pₙ(F) | n + 1 | 1, x, x², ..., xⁿ |
| Mₘₓₙ(F) | mn | Eᵢⱼ |
| Symmetric n×n | n(n+1)/2 | Eᵢᵢ, Eᵢⱼ + Eⱼᵢ |
| Skew-symmetric n×n | n(n-1)/2 | Eᵢⱼ - Eⱼᵢ (i < j) |
3D coordinates use basis vectors. Change of basis rotates/transforms scenes. Homogeneous coordinates add a dimension for projective transformations.
SVD/PCA find low-dimensional subspaces capturing most variance. Dimension reduction = finding a good low-rank basis.
Solution spaces of linear ODEs are vector spaces. Dimension = order of equation. Finding basis solutions gives the general solution.
Quantum states live in Hilbert spaces. Observables have eigenbases. Measurements correspond to basis projections.
For a data matrix $X$ ($m$ samples, $n$ features): the rank of $X$ is the dimension of the subspace the samples actually span, and PCA chooses a basis for a low-dimensional subspace capturing most of the variance. A numerical sketch follows.
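A minimal sketch of this idea: build a synthetic data matrix of known low rank and recover its effective dimension from the singular values (the construction and threshold are illustrative).

```python
# Numerical rank of a data matrix via SVD -- the computation behind PCA.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 10))   # rank-3 by construction

s = np.linalg.svd(X, compute_uv=False)      # singular values, descending
rank = int(np.sum(s > 1e-10 * s[0]))        # count the non-negligible ones
print(rank)   # 3 -> the 10-feature data lies in a 3-dimensional subspace
```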
In a network with $v$ nodes and $e$ edges: the cycle space is a vector space whose dimension is $e - v + c$, where $c$ is the number of connected components; a basis consists of independent cycles.
Dimension is a complete invariant for finite-dimensional vector spaces: two spaces are isomorphic if and only if they have the same dimension. This is why dimension is often the first thing we compute about a vector space.
In machine learning, consider a dataset with 100 features: each sample is a vector in $\mathbb{R}^{100}$, yet the data often lies near a subspace of much smaller dimension.
Dimension reduction = finding a low-dimensional basis that captures most information.
A linear code is a subspace $C \subseteq \mathbb{F}_2^n$: its dimension $k$ counts the information bits, and a basis of $C$ (the rows of a generator matrix) defines the encoding.
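Since the field is $\mathbb{F}_2$, ordinary floating-point rank does not apply; here is a small sketch with a hand-rolled mod-2 Gaussian elimination (the function and the generator rows are our illustrative choices).

```python
# Dimension of a binary linear code = rank of its generator matrix over GF(2).
import numpy as np

def rank_gf2(rows):
    """Rank of a 0/1 matrix over GF(2), by Gaussian elimination mod 2."""
    M = np.array(rows, dtype=int) % 2
    rank = 0
    for col in range(M.shape[1]):
        pivot = next((r for r in range(rank, M.shape[0]) if M[r, col]), None)
        if pivot is None:
            continue                                # no pivot in this column
        M[[rank, pivot]] = M[[pivot, rank]]         # swap pivot row into place
        for r in range(M.shape[0]):
            if r != rank and M[r, col]:
                M[r] = (M[r] + M[rank]) % 2         # clear the rest of the column
        rank += 1
    return rank

G = [[1, 0, 0, 1, 1],
     [0, 1, 0, 1, 0],
     [1, 1, 1, 0, 1]]
print(rank_gf2(G))   # 3 -> a [5, 3] code with 2^3 = 8 codewords
```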
In structural analysis, forces and displacements live in vector spaces: the dimension counts the independent degrees of freedom of the structure, and a basis gives a coordinate system for describing its state.
With basis and dimension established, we can now represent linear maps as matrices. The choice of bases for domain and codomain determines the matrix representation. This connection between abstract linear maps and concrete matrices is one of the most powerful ideas in linear algebra.
Algorithm (basis from a spanning set). Input: Spanning set $\{v_1, \dots, v_m\}$ in $F^n$. Form the matrix with the $v_i$ as rows, row reduce, and keep the nonzero rows (or, placing the vectors as columns, keep the original vectors in pivot columns).
Complexity: O(mn²) for m vectors in n dimensions.
Algorithm (extend to a basis). Input: Independent set in $F^n$. Append the standard basis vectors one at a time, keeping each that remains independent of the current set; see the sketch after this list.
Result: Basis containing the original vectors plus some standard vectors.
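A sketch of the extension algorithm (the function name and the starting vector are our illustrative choices): append standard vectors one at a time and keep those that raise the rank.

```python
# Extend an independent set in R^n to a basis of R^n by appending standard
# basis vectors whenever they are independent of what we already have.
import numpy as np

def extend_to_basis(vectors, n):
    basis = [np.asarray(v, dtype=float) for v in vectors]
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        candidate = basis + [e]
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            basis = candidate      # e_i adds a new direction; keep it
        if len(basis) == n:
            break
    return basis

print(extend_to_basis([(1, 2, 0)], 3))
# [array([1., 2., 0.]), array([1., 0., 0.]), array([0., 0., 1.])]
```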
✓ To find the dimension of a subspace: find a basis (e.g., row reduce a matrix of generators) and count its elements.
✓ To show dim(U) = dim(W): exhibit bases of the same size, or an isomorphism between the spaces.
✓ To show U = W (subspaces): show one inclusion, say $U \subseteq W$, and that $\dim U = \dim W$.
ℝⁿ: n | Pₙ(F): n+1 | Mₘₓₙ(F): mn | Symmetric n×n: n(n+1)/2
With basis and dimension established, we're ready for Part III: Linear Mappings, where linear maps become matrices.
The dimension of domain and codomain determines the size of representing matrices!
The Steinitz exchange lemma shows that if you have a spanning set of size n and an independent set of size m, then m ≤ n. Applying this both ways to two bases shows they have the same size.
A spanning set may have redundant vectors. A basis is a minimal spanning set—remove any vector and it no longer spans. Equivalently, it's a maximal independent set.
Find a basis (e.g., by row reducing the matrix whose rows generate the subspace) and count the basis vectors. Or use the rank of the associated matrix.
Yes! There are infinitely many bases for any space of dimension ≥ 1. All bases have the same size but contain different vectors. Coordinates depend on basis choice.
Spaces like F[x] (all polynomials) or C[0,1] (continuous functions) are infinite-dimensional—no finite set can span them. They still have bases (by Zorn's lemma) but of infinite cardinality.
If you switch from basis B to basis B', there's a change of basis matrix P such that [v]_B' = P^{-1}[v]_B. The matrix P has columns that are the B-coordinates of the B' basis vectors.
The rank of a matrix equals the dimension of its column space (or row space). For a linear map T: V → W, rank(T) = dim(im(T)) and the rank-nullity theorem says dim(V) = dim(ker T) + dim(im T).
The zero space {0} has no basis vectors (the empty set is its basis), so dim({0}) = 0. This is consistent: any n vectors containing 0 are dependent, so you can't have a non-empty basis.
Yes, in finite-dimensional spaces. Start with an independent set, and if it doesn't span, add vectors one by one that aren't in the current span. This process terminates with a basis.
For subspaces U and W: dim(U + W) = dim(U) + dim(W) - dim(U ∩ W). This is the dimension formula for sums, analogous to inclusion-exclusion in counting.