In chapter 10, the notions of a linearly independent set of vectors in a vector space $V$, and of a set of vectors that span $V$, were established: Any set of vectors that spans $V$ can be reduced to some minimal collection of linearly independent vectors; such a set is called a \emph{basis} of the subspace $V$.
Definitions
Let $V$ be a vector space.
- Then a set $S$ is a \emph{basis} for $V$ if $S$ is linearly independent and $V = \mathrm{span}\, S$.
- If $S$ is a basis of $V$ and $S$ has only finitely many elements, then we say that $V$ is \emph{finite-dimensional}.
- The number of vectors in $S$ is the \emph{dimension} of $V$.
Suppose $V$ is a finite-dimensional vector space, and $S$ and $T$ are two different bases for $V$. One might worry that $S$ and $T$ have a different number of vectors; then we would have to talk about the dimension of $V$ in terms of the basis $S$ or in terms of the basis $T$. Luckily this isn't what happens. Later in this chapter, we will show that $S$ and $T$ must have the same number of vectors. This means that the dimension of a vector space is basis-independent. In fact, dimension is a very important characteristic of a vector space.
Example
$P_n(t)$ (polynomials in $t$ of degree $n$ or less) has a basis $\{1, t, \ldots, t^n\}$, since every vector in this space is a sum
\[ a_0 \cdot 1 + a_1 t + \cdots + a_n t^n, \qquad a_i \in \mathbb{R}, \]
so $P_n(t) = \mathrm{span}\{1, t, \ldots, t^n\}$. This set of vectors is linearly independent: If the polynomial $p(t) = c_0 \cdot 1 + c_1 t + \cdots + c_n t^n = 0$, then $c_0 = c_1 = \cdots = c_n = 0$, so $p(t)$ is the zero polynomial. Thus $P_n(t)$ is finite-dimensional, and $\dim P_n(t) = n + 1$.
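To see the independence claim concretely, here is a small numerical sketch (the sample points and the use of numpy are illustrative choices, not from the text): sampling the basis polynomials $1, t, t^2$ of $P_2(t)$ at three distinct points produces an invertible Vandermonde matrix, so the only combination vanishing at all three points is the zero combination.

```python
import numpy as np

# Sketch: sample 1, t, t^2 at three distinct points. If the resulting
# Vandermonde matrix has full rank, then c0*1 + c1*t + c2*t^2 = 0 at all
# three points forces c0 = c1 = c2 = 0, i.e. the set is independent.
samples = np.array([0.0, 1.0, 2.0])
M = np.vander(samples, N=3, increasing=True)  # columns: 1, t, t^2
print(np.linalg.matrix_rank(M))               # 3 -> linearly independent
```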
Theorem
Let $S = \{v_1, \ldots, v_n\}$ be a basis for a vector space $V$. Then every vector $w \in V$ can be written \emph{uniquely} as a linear combination of vectors in the basis $S$:
\[ w = c_1 v_1 + \cdots + c_n v_n . \]
Proof
Since $S$ is a basis for $V$, then $\mathrm{span}\, S = V$, and so there exist constants $c_i$ such that $w = c_1 v_1 + \cdots + c_n v_n$.
Suppose there exists a second set of constants $d_i$ such that
\[ w = d_1 v_1 + \cdots + d_n v_n . \]
Then:
\[ 0_V = w - w = (c_1 - d_1) v_1 + \cdots + (c_n - d_n) v_n . \]
If it occurs exactly once that $c_i \neq d_i$, then the equation reduces to $0_V = (c_i - d_i) v_i$, which is a contradiction since the vectors $v_i$ are assumed to be non-zero (a linearly independent set can never contain the zero vector).
If we have more than one $i$ for which $c_i \neq d_i$, we can use this last equation to write one of the vectors in $S$ as a linear combination of the other vectors in $S$, which contradicts the assumption that $S$ is linearly independent. Then for every $i$, $c_i = d_i$.
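As a concrete sanity check (an illustrative numerical example, not from the text): in $\mathbb{R}^3$, finding the coefficients of $w$ in a basis amounts to solving a linear system, and the solution is unique precisely because the basis vectors are independent.

```python
import numpy as np

# Hypothetical example in R^3: write w in the basis {v1, v2, v3}.
# With the basis vectors as the columns of B, the coefficients c
# solve B c = w; B is invertible, so the solution is unique.
B = np.column_stack(([1.0, 0, 0], [1, 1, 0], [1, 1, 1]))
w = np.array([2.0, 3.0, 5.0])
c = np.linalg.solve(B, w)       # unique coefficients: [-1, -2, 5]
assert np.allclose(B @ c, w)
```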
Remark
This theorem is the one that makes bases so useful: they allow us to convert abstract vectors into column vectors. By ordering the set $S$ we obtain $B = (v_1, \ldots, v_n)$ and can write
\[ w = (v_1, \ldots, v_n) \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix}_B . \]
Remember that in general it makes no sense to drop the subscript $B$ on the column vector on the right; most vector spaces are not made from columns of numbers!
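For instance (a sketch with an encoding chosen purely for illustration): fixing the ordered basis $B = (1, t, t^2)$ of $P_2(t)$ turns the abstract vector $p(t) = 4 + t - 7t^2$ into a column of numbers, and that column is only meaningful relative to $B$.

```python
import numpy as np

# The ordered basis B = (1, t, t^2) of P_2(t) converts the polynomial
# p(t) = 4 + t - 7t^2 into the coordinate column (4, 1, -7)^T_B.
c_B = np.array([4.0, 1.0, -7.0])

def eval_in_B(c, t):
    # Reconstruct the abstract vector from its coordinate column.
    return c[0] * 1 + c[1] * t + c[2] * t**2

assert eval_in_B(c_B, 2.0) == 4 + 2 - 7 * 2**2   # p(2) = -22
# Reordering the basis to (t^2, t, 1) would permute the entries, which
# is why the subscript B cannot be dropped.
```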
Next, we would like to establish a method for determining whether a collection of vectors forms a basis for $\mathbb{R}^n$. But first, we need to show that any two bases for a finite-dimensional vector space have the same number of vectors.
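(As a preview of where this is headed, here is one standard numerical check, sketched in numpy rather than developed properly: $n$ vectors form a basis of $\mathbb{R}^n$ exactly when the matrix with those vectors as its columns has rank $n$.)

```python
import numpy as np

# Sketch: n vectors form a basis of R^n iff the matrix having them as
# columns is square with full rank.
def is_basis_of_Rn(vectors):
    A = np.column_stack(vectors).astype(float)
    return A.shape[0] == A.shape[1] and np.linalg.matrix_rank(A) == A.shape[0]

print(is_basis_of_Rn([[1, 0], [1, 1]]))   # True
print(is_basis_of_Rn([[1, 2], [2, 4]]))   # False: second vector is a multiple
```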
Lemma
If $S = \{v_1, \ldots, v_n\}$ is a basis for a vector space $V$ and $T = \{w_1, \ldots, w_m\}$ is a linearly independent set of vectors in $V$, then $m \leq n$.

The idea of the proof is to start with the set $S$ and replace vectors in $S$ one at a time with vectors from $T$, such that after each replacement we still have a basis for $V$.
Proof
Since $S$ spans $V$, the set $\{w_1, v_1, \ldots, v_n\}$ is linearly dependent. Then we can write $w_1$ as a linear combination of the $v_i$; since $T$ is linearly independent, $w_1 \neq 0_V$, so at least one coefficient in that combination, say the one on $v_i$, is non-zero. Using that equation, we can express $v_i$ in terms of $w_1$ and the remaining $v_j$ with $j \neq i$. Then we can discard $v_i$ from this set to obtain
\[ S_1 = \{w_1, v_1, \ldots, v_{i-1}, v_{i+1}, \ldots, v_n\}. \]
Now we need to prove that $S_1$ is a basis; we must show that $S_1$ is linearly independent and that $S_1$ spans $V$.
The set $S_1$ is linearly independent: By the previous theorem, there was a unique way to express $w_1$ in terms of the set $S$. Now, to obtain a contradiction, suppose there is some $j \neq i$ and constants $c_k$ such that
\[ v_j = c_0 w_1 + \sum_{k \neq i,\, j} c_k v_k . \]
Then replacing $w_1$ with its expression in terms of the collection $S$ gives a way to express the vector $v_j$ as a linear combination of the vectors in $S$, which contradicts the linear independence of $S$. On the other hand, we cannot express $w_1$ as a linear combination of the vectors in $\{v_k \mid k \neq i\}$, since the expression of $w_1$ in terms of $S$ was unique, and had a non-zero coefficient on the vector $v_i$. Then no vector in $S_1$ can be expressed as a combination of other vectors in $S_1$, which demonstrates that $S_1$ is linearly independent.
The set $S_1$ spans $V$: For any $u \in V$, we can express $u$ as a linear combination of vectors in $S$. But we can express $v_i$ as a linear combination of vectors in the collection $S_1$; rewriting $v_i$ as such allows us to express $u$ as a linear combination of the vectors in $S_1$. Thus $S_1$ is a basis of $V$ with $n$ vectors.
We can now iterate this process, replacing one of the $v_i$ in $S_1$ with $w_2$, and so on. If $m \leq n$, this process ends with the set
\[ S_m = \{w_1, \ldots, w_m, v_{i_1}, \ldots, v_{i_{n-m}}\}, \]
which is fine.
Otherwise, we have $m > n$, and the set $S_n = \{w_1, \ldots, w_n\}$ is a basis for $V$. But we still have some vector $w_{n+1}$ in $T$ that is not in $S_n$. Since $S_n$ is a basis, we can write $w_{n+1}$ as a combination of the vectors in $S_n$, which contradicts the linear independence of the set $T$. Then it must be the case that $m \leq n$, as desired.
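The replacement process can also be watched numerically. Below is a small sketch in $\mathbb{R}^3$ (the vectors and helper function are illustrative, not part of the proof): each step writes the next $w$ in terms of the current basis, swaps it in for an original $v_i$ carrying a non-zero coefficient, and checks that the result is still a basis.

```python
import numpy as np

# Sketch of the replacement argument in R^3 with hypothetical data.
def exchange_step(S, used, w, tol=1e-10):
    c = np.linalg.solve(np.column_stack(S), w)   # w as a combination of S
    # Linear independence of T guarantees some original v_i (not an
    # already-inserted w) carries a non-zero coefficient; take the largest.
    i = max((k for k in range(len(S)) if k not in used),
            key=lambda k: abs(c[k]))
    assert abs(c[i]) > tol
    S[i] = w                                     # discard v_i, insert w
    used.add(i)

S = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])]
T = [np.array([1.0, 1, 0]), np.array([0, 1.0, 1])]
used = set()
for w in T:
    exchange_step(S, used, w)
    assert np.linalg.matrix_rank(np.column_stack(S)) == 3  # still a basis
```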
Corollary
For a finite-dimensional vector space $V$, any two bases for $V$ have the same number of vectors.
Proof
Let $S$ and $T$ be two bases for $V$. Then both are linearly independent sets that span $V$. Suppose $S$ has $n$ vectors and $T$ has $m$ vectors. Then by the previous lemma, we have that $m \leq n$. But (exchanging the roles of $S$ and $T$ in the application of the lemma) we also see that $n \leq m$. Then $n = m$, as desired.