A basis is a minimal spanning set and a maximal independent set. The number of elements in any basis is the dimension—the most important invariant of a vector space.
A basis of a vector space $V$ over a field $F$ is a set $B \subseteq V$ that satisfies:
1. $B$ is linearly independent;
2. $B$ spans $V$.
For a finite set $B$ in a vector space $V$, the following are equivalent:
1. $B$ is a basis of $V$;
2. $B$ is a minimal spanning set;
3. $B$ is a maximal linearly independent set.
The standard basis of $F^n$ is:
$$\{e_1, e_2, \dots, e_n\},$$
where $e_i$ has 1 in position $i$ and 0 elsewhere.
For $P_n$ = polynomials of degree $\le n$, the standard basis is:
$$\{1, x, x^2, \dots, x^n\}.$$
This has $n + 1$ elements.
Every finite-dimensional vector space has a basis. Moreover, any linearly independent set can be extended to a basis, and any spanning set can be reduced to a basis.
Start with any spanning set (which exists since the space is finite-dimensional). If it's not independent, remove redundant vectors until it is. The resulting set is a basis.
For extension: start with the independent set, and if it doesn't span, add vectors from a spanning set using the Steinitz Exchange Lemma.
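Both procedures are easy to mimic numerically. Below is a minimal NumPy sketch of the reduction step (the helper name `reduce_to_basis` and the sample vectors are illustrative, not from the text): a vector is kept only when it raises the rank of the vectors kept so far, i.e., only when it is not redundant.

```python
import numpy as np

def reduce_to_basis(vectors):
    """Keep a vector only if it raises the rank of the kept set,
    i.e. only if it is not a combination of vectors already kept."""
    kept = []
    for v in vectors:
        if np.linalg.matrix_rank(np.array(kept + [v])) == len(kept) + 1:
            kept.append(v)
    return kept

# A redundant spanning set of R^2: the third vector is the sum of the first two.
spanning = [np.array([1.0, 0.0]),
            np.array([0.0, 1.0]),
            np.array([1.0, 1.0])]
print(len(reduce_to_basis(spanning)))  # 2: the redundant vector was removed
```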
If $B = \{v_1, \dots, v_n\}$ is a basis for $V$, then every vector $v \in V$ has a unique representation:
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.$$
The scalars $c_1, \dots, c_n$ are called the coordinates of $v$ with respect to $B$.
Existence follows from spanning. Uniqueness: if $c_1 v_1 + \cdots + c_n v_n = d_1 v_1 + \cdots + d_n v_n$, then $(c_1 - d_1) v_1 + \cdots + (c_n - d_n) v_n = 0$. By independence, $c_i = d_i$ for all $i$.
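Concretely, finding the coordinates of $v$ means solving the linear system whose coefficient matrix has the basis vectors as columns; uniqueness corresponds to that matrix being invertible. A minimal NumPy sketch with an illustrative basis of $\mathbb{R}^2$:

```python
import numpy as np

# Basis vectors of R^2 as the columns of B (an illustrative basis).
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])
v = np.array([3.0, 1.0])

# The coordinates c satisfy B @ c = v; uniqueness = invertibility of B.
c = np.linalg.solve(B, v)
print(c)  # [2. 1.], i.e. v = 2*(1,1) + 1*(1,-1)
```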
All bases of a finite-dimensional vector space have the same number of elements.
Let $B = \{v_1, \dots, v_m\}$ and $C = \{w_1, \dots, w_n\}$ be two bases.
Since $B$ is independent and $C$ spans, by the fundamental bound, $m \le n$.
Since $C$ is independent and $B$ spans, $n \le m$.
Therefore $m = n$.
The dimension of a finite-dimensional vector space $V$, denoted $\dim V$, is the number of elements in any basis of $V$.
If $V$ has no finite basis, we say $V$ is infinite-dimensional.
If $W$ is a subspace of a finite-dimensional vector space $V$, then:
$$\dim W \le \dim V.$$
If $U$ and $W$ are subspaces of a finite-dimensional vector space $V$, then:
$$\dim(U + W) = \dim U + \dim W - \dim(U \cap W).$$
Let $\{v_1, \dots, v_k\}$ be a basis for $U \cap W$. Extend it to bases $\{v_1, \dots, v_k, u_1, \dots, u_p\}$ for $U$ and $\{v_1, \dots, v_k, w_1, \dots, w_q\}$ for $W$.
Then $\{v_1, \dots, v_k, u_1, \dots, u_p, w_1, \dots, w_q\}$ spans $U + W$ and is independent (since vectors from $U$ and $W$ can only combine to zero if they lie in $U \cap W$).
So $\dim(U + W) = k + p + q = (k + p) + (k + q) - k = \dim U + \dim W - \dim(U \cap W)$.
If $U \cap W = \{0\}$, then $\dim(U + W) = \dim U + \dim W$ (since $U \cap W$ has dimension 0).
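The formula can be sanity-checked numerically: $\dim(U + W)$ is the rank of the concatenated basis matrices, and $\dim(U \cap W)$ is the null space dimension of $[A \mid -B]$ when $A$ and $B$ have independent columns. A NumPy sketch (the helper `null_dim` and the matrices are illustrative, not from the text):

```python
import numpy as np

def null_dim(M, tol=1e-10):
    """Dimension of the null space of M, via its singular values."""
    s = np.linalg.svd(M, compute_uv=False)
    return M.shape[1] - int((s > tol).sum())

# U = column span of A (the xy-plane), W = column span of B (a line not in U).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0],
              [1.0]])

dim_sum = np.linalg.matrix_rank(np.hstack([A, B]))   # dim(U + W)
dim_cap = null_dim(np.hstack([A, -B]))               # dim(U ∩ W): pairs with A x = B y
print(dim_sum == 2 + 1 - dim_cap)                    # True: 3 == 2 + 1 - 0
```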
Let $U, W$ be subspaces of $V$. The sum is:
$$U + W = \{u + w : u \in U,\ w \in W\}.$$
We say $V$ is the internal direct sum of subspaces $U$ and $W$, written:
$$V = U \oplus W,$$
if both conditions hold:
1. $V = U + W$;
2. $U \cap W = \{0\}$.
Equivalently, $V = U \oplus W$ if and only if every $v \in V$ can be written uniquely as:
$$v = u + w, \qquad u \in U,\ w \in W.$$
In $\mathbb{R}^2$, let:
$$U = \{(x, 0) : x \in \mathbb{R}\} \ \text{(the $x$-axis)}, \qquad W = \{(0, y) : y \in \mathbb{R}\} \ \text{(the $y$-axis)}.$$
Then $\mathbb{R}^2 = U \oplus W$.
If $V = U \oplus W$, then:
$$\dim V = \dim U + \dim W.$$
Define $v \sim v'$ if and only if $v - v' \in W$.
This is an equivalence relation:
1. Reflexive: $v - v = 0 \in W$.
2. Symmetric: if $v - v' \in W$, then $v' - v = -(v - v') \in W$.
3. Transitive: if $v - v' \in W$ and $v' - v'' \in W$, then $v - v'' = (v - v') + (v' - v'') \in W$.
The quotient space $V/W$ is the set of all cosets:
$$V/W = \{v + W : v \in V\}, \qquad v + W = \{v + w : w \in W\},$$
with vector space operations:
$$(u + W) + (v + W) = (u + v) + W, \qquad c(v + W) = cv + W.$$
The operations on $V/W$ are well-defined: they don't depend on the choice of coset representatives.
Suppose $u + W = u' + W$ and $v + W = v' + W$.
Then $u - u' \in W$ and $v - v' \in W$.
Addition: $(u + v) - (u' + v') = (u - u') + (v - v') \in W$.
So $(u + v) + W = (u' + v') + W$. ✓
Scalar multiplication: $cv - cv' = c(v - v') \in W$.
So $cv + W = cv' + W$. ✓
If $W$ is a subspace of a finite-dimensional vector space $V$, then:
$$\dim(V/W) = \dim V - \dim W.$$
Let $\{w_1, \dots, w_k\}$ be a basis for $W$. Extend it to a basis $\{w_1, \dots, w_k, v_1, \dots, v_m\}$ for $V$.
Then $\{v_1 + W, \dots, v_m + W\}$ is a basis for $V/W$: it spans because each $w_i + W$ is the zero coset, so every $v + W$ is a combination of the $v_j + W$; it is independent because $c_1 v_1 + \cdots + c_m v_m \in W$ forces every $c_j = 0$ (the extended set is independent).
Therefore, $\dim(V/W) = m = \dim V - \dim W$.
Let $V = \mathbb{R}^3$ and $W = \{(x, y, 0) : x, y \in \mathbb{R}\}$ (the xy-plane).
Then $V/W$ consists of all planes parallel to the xy-plane. Each such plane is a single "point" in $V/W$.
A basis for $V/W$ is $\{e_3 + W\}$, so $\dim(V/W) = 1$.
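For this example the quotient is easy to simulate: subtracting the projection onto $W$ leaves a canonical representative of each coset, so two vectors lie in the same coset exactly when their residuals agree. A minimal NumPy sketch (the name `coset_rep` is illustrative; the projection is computed by least squares):

```python
import numpy as np

# Basis of W (the xy-plane) as columns.
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

def coset_rep(v):
    """Canonical representative of v + W: v minus its projection onto W."""
    coeffs, *_ = np.linalg.lstsq(W, v, rcond=None)
    return v - W @ coeffs

u = np.array([1.0, 2.0, 5.0])
v = np.array([-3.0, 7.0, 5.0])        # u - v lies in W
print(coset_rep(u), coset_rep(v))     # both [0. 0. 5.]: the same coset
```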
For a linear map $T: V \to U$, the quotient space $V/\ker T$ is isomorphic to $\operatorname{im} T$ (this is the First Isomorphism Theorem, to be covered later).
Once we fix a basis, every vector has a unique coordinate representation. Changing the basis changes the coordinates, and understanding this relationship is crucial for many applications.
Let $B = \{v_1, \dots, v_n\}$ be a basis for $V$. For $v = c_1 v_1 + \cdots + c_n v_n$, the coordinate vector of $v$ with respect to $B$ is:
$$[v]_B = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix}.$$
In $\mathbb{R}^n$ with standard basis $E = \{e_1, \dots, e_n\}$, a vector $v = (a_1, \dots, a_n)$ has coordinate vector:
$$[v]_E = (a_1, \dots, a_n)^T,$$
i.e., in the standard basis the coordinates are just the entries of $v$.
Let $B = \{v_1, \dots, v_n\}$ and $C = \{w_1, \dots, w_n\}$ be two bases for $V$. The change of basis matrix from $B$ to $C$ is:
$$P_{C \leftarrow B} = \big[\, [v_1]_C \;\; [v_2]_C \;\; \cdots \;\; [v_n]_C \,\big].$$
That is, the columns are the coordinate vectors of the $B$-basis vectors with respect to $C$.
For any vector $v \in V$:
$$[v]_C = P_{C \leftarrow B}\, [v]_B.$$
If $P = P_{C \leftarrow B}$, then $P$ is invertible and:
$$P^{-1} = P_{B \leftarrow C}.$$
Let $E = \{e_1, e_2\}$ (standard) and $C = \{(1, 1), (1, -1)\}$ in $\mathbb{R}^2$.
To find $P_{C \leftarrow E}$, express $e_1$ and $e_2$ in terms of $C$:
$$e_1 = \tfrac{1}{2}(1, 1) + \tfrac{1}{2}(1, -1), \qquad e_2 = \tfrac{1}{2}(1, 1) - \tfrac{1}{2}(1, -1).$$
So:
$$P_{C \leftarrow E} = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & -1/2 \end{pmatrix}.$$
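The computation can be verified numerically: solving $CX = I$ gives $C^{-1}$, whose columns are the coordinates of the standard basis vectors with respect to $C$, which are exactly the columns of $P_{C \leftarrow E}$. A NumPy sketch of this example:

```python
import numpy as np

C = np.array([[1.0,  1.0],
              [1.0, -1.0]])     # columns are the C-basis vectors

# Solving C X = I gives X = C^{-1}, whose columns are [e1]_C and [e2]_C.
P = np.linalg.solve(C, np.eye(2))
print(P)                        # [[ 0.5  0.5] [ 0.5 -0.5]]

v = np.array([3.0, 1.0])        # = [v]_E, coordinates in the standard basis
print(P @ v)                    # [2. 1.] = [v]_C, i.e. v = 2*(1,1) + 1*(1,-1)
```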
Direct sums and quotient spaces are powerful tools for decomposing vector spaces and understanding their structure. They appear naturally in many areas of mathematics.
For any subspace $W$ of a finite-dimensional vector space $V$, there exists a subspace $U$ such that $V = W \oplus U$. Such a $U$ is called a complement of $W$.
Let $\{w_1, \dots, w_k\}$ be a basis for $W$. Extend it to a basis $\{w_1, \dots, w_k, u_1, \dots, u_m\}$ for $V$.
Then $U = \operatorname{span}\{u_1, \dots, u_m\}$ is a complement of $W$.
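This proof is constructive and easy to mimic: greedily append standard basis vectors that raise the rank; the appended vectors span a complement. A minimal NumPy sketch (the helper name `complement_of` is illustrative, not from the text):

```python
import numpy as np

def complement_of(W_basis, n):
    """Extend a basis of W (a list of vectors in R^n) to a basis of R^n
    by greedily adding standard basis vectors; the added ones span U."""
    current = list(W_basis)
    added = []
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        if np.linalg.matrix_rank(np.array(current + [e])) > len(current):
            current.append(e)
            added.append(e)
    return added

U_basis = complement_of([np.array([1.0, 1.0, 0.0])], 3)
print(U_basis)   # [e1, e3] here: one complement among many
```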
Complements are not unique. For example, in $\mathbb{R}^2$, any line through the origin (other than the x-axis) is a complement of the x-axis.
If $V = W \oplus U$, then $U \cong V/W$ (isomorphic as vector spaces).
Define $\varphi: U \to V/W$ by $\varphi(u) = u + W$ (the coset of $u$).
This is linear and bijective: every coset has a unique representative in $U$ (since $v = w + u$ uniquely with $w \in W$, $u \in U$).
Let $V = C[0, 1]$ (continuous functions) and $W = \{f \in V : f(0) = 0\}$.
Then the subspace $U$ of constant functions is a complement of $W$ (write $f = (f - f(0)) + f(0)$), and $V/W \cong U \cong \mathbb{R}$.
In $M_n(\mathbb{R})$, we have:
$$M_n(\mathbb{R}) = S_n \oplus K_n,$$
where $S_n$ = symmetric matrices and $K_n$ = skew-symmetric matrices.
Any matrix $A$ decomposes as:
$$A = \underbrace{\tfrac{1}{2}(A + A^T)}_{\text{symmetric}} + \underbrace{\tfrac{1}{2}(A - A^T)}_{\text{skew-symmetric}}.$$
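The decomposition takes one line to verify numerically. A minimal NumPy sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

S = 0.5 * (A + A.T)   # symmetric part
K = 0.5 * (A - A.T)   # skew-symmetric part

print(np.allclose(A, S + K))                        # True
print(np.allclose(S, S.T), np.allclose(K, -K.T))    # True True
# The sum is direct: a matrix that is both symmetric and skew-symmetric
# satisfies M = M^T = -M, so M = 0.
```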
The quotient space $V/W$ has dimension equal to the "codimension" of $W$ in $V$, which is $\dim V - \dim W$. This measures how many "directions" are orthogonal (in a loose sense) to $W$.
For solving $Ax = b$, the solution set (if non-empty) is a coset $x_0 + \ker A$ in $F^n$, where $x_0$ is any particular solution.
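This coset structure can be exhibited numerically by combining one particular solution with a basis of $\ker A$ read off the SVD. A NumPy sketch with an illustrative system:

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([2.0, 3.0])

x0, *_ = np.linalg.lstsq(A, b, rcond=None)   # a particular solution

# Basis of ker A: right-singular vectors beyond the rank.
_, s, Vt = np.linalg.svd(A)
rank = int((s > 1e-10).sum())
null_basis = Vt[rank:]                       # here one vector, spanning ker A

# Every point of the coset x0 + ker A solves the system.
for t in (-1.0, 0.0, 2.5):
    print(np.allclose(A @ (x0 + t * null_basis[0]), b))   # True each time
```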
The Steinitz exchange lemma shows that if you have a spanning set of size n and an independent set of size m, then m ≤ n. Applying this both ways to two bases shows they have the same size.
A spanning set may have redundant vectors. A basis is a minimal spanning set—remove any vector and it no longer spans. Equivalently, it's a maximal independent set.
Find a basis (e.g., by row reducing the matrix whose rows generate the subspace) and count the basis vectors. Or use the rank of the associated matrix.
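A short SymPy sketch of this recipe (the matrix is illustrative): row reduce, and the number of pivots is the dimension.

```python
from sympy import Matrix

# Subspace of R^3 generated by the rows (third row = first + second).
M = Matrix([[1, 2, 0],
            [0, 1, 1],
            [1, 3, 1]])

rref_form, pivots = M.rref()
print(rref_form)      # nonzero rows form a basis of the row space
print(len(pivots))    # 2 = dimension (equivalently, M.rank())
```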
V/W 'collapses' W to a point. Elements of V/W are parallel copies of W. If W is a line through origin, V/W contains all lines parallel to W, each as a single 'point' in the quotient.
Check two things: (1) U + W = V (every vector is a sum), and (2) U ∩ W = {0} (only overlap is zero). Equivalently, show every v has a UNIQUE decomposition as u + w.
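With bases of $U$ and $W$ as matrix columns, both checks reduce to rank computations: the stacked matrix has rank $\dim V$ iff $U + W = V$, and rank equal to the total number of columns iff $U \cap W = \{0\}$ (by the dimension formula). A NumPy sketch with illustrative data:

```python
import numpy as np

# Bases of U (the xy-plane) and W (the z-axis) as columns.
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
W = np.array([[0.0],
              [0.0],
              [1.0]])

r = np.linalg.matrix_rank(np.hstack([U, W]))
print(r == 3)                                    # U + W = R^3
print(r == U.shape[1] + W.shape[1])              # U ∩ W = {0}
print(r == 3 and r == U.shape[1] + W.shape[1])   # True: R^3 = U ⊕ W
```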