
# 5.4  Dimension

We now come to the important definition of the dimension of a finite-dimensional vector space. Intuitively, we know that $$\mathbb{R}^2$$ has dimension 2, that $$\mathbb{R}^3$$ has dimension 3, and, more generally, that $$\mathbb{R}^n$$ has dimension $$n$$. This is precisely the length of every basis for each of these vector spaces, which prompts the following definition.

Definition 5.4.1. We call the length of any basis for $$V$$ (which is well-defined by Theorem 5.4.2 below) the dimension of $$V$$, and we denote this by $$\dim(V)$$.

Note that Definition 5.4.1 only makes sense if, in fact, every basis for a given finite-dimensional vector space has the same length. This is true by the following theorem.

Theorem 5.4.2.  Let $$V$$ be a finite-dimensional vector space. Then any two bases of $$V$$ have the same length.

Proof.

Let $$(v_1,\ldots,v_m)$$ and $$(w_1,\ldots,w_n)$$ be two bases of $$V$$. Both span $$V$$.

By Theorem 5.2.9, we have $$m\le n$$ since $$(v_1,\ldots,v_m)$$ is linearly independent. By the same theorem, we also have $$n\le m$$ since $$(w_1,\ldots,w_n)$$ is linearly independent. Hence $$n=m$$, as asserted.

Example 5.4.3.  $$\dim(\mathbb{F}^n)=n$$ and $$\dim(\mathbb{F}_m[z]) = m + 1$$. Note that $$\dim(\mathbb{C}^n)=n$$ as a complex vector space, whereas $$\dim(\mathbb{C}^n)=2n$$ as a real vector space. This comes from the fact that we can view $$\mathbb{C}$$ itself as a real vector space of dimension 2 with basis $$(1,i)$$.
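The dependence of dimension on the base field can be checked numerically. The following sketch (an illustration of ours, not taken from the text) views $$\mathbb{C}^2$$ as a real vector space by stacking real and imaginary parts, and verifies that the four vectors $$e_1, ie_1, e_2, ie_2$$ are linearly independent over $$\mathbb{R}$$:

```python
import numpy as np

# View a vector in C^2 as a vector in R^4 by stacking real and imaginary parts.
def as_real(v):
    return np.concatenate([v.real, v.imag])

e1 = np.array([1, 0], dtype=complex)
e2 = np.array([0, 1], dtype=complex)

# Over C, (e1, e2) is a basis of C^2, so dim_C(C^2) = 2.
# Over R, we need the longer list (e1, i*e1, e2, i*e2).
vectors = [e1, 1j * e1, e2, 1j * e2]
M = np.column_stack([as_real(v) for v in vectors])

# Rank 4 means the four vectors are linearly independent over R,
# consistent with dim_R(C^2) = 2 * 2 = 4.
print(np.linalg.matrix_rank(M))  # 4
```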

Theorem 5.4.4.  Let $$V$$ be a finite-dimensional vector space with $$\dim(V)=n$$. Then:

1. If $$U\subset V$$ is a subspace of $$V$$, then $$\dim(U) \le \dim(V)$$.
2. If $$V=\Span(v_1,\ldots,v_n)$$, then $$(v_1,\ldots,v_n)$$ is a basis of $$V$$.
3. If $$(v_1,\ldots,v_n)$$ is linearly independent in $$V$$, then $$(v_1,\ldots,v_n)$$ is a basis of $$V$$.

Point 1 implies, in particular, that every subspace of a finite-dimensional vector space is finite-dimensional. Points 2 and 3 show that if the dimension of a vector space is known to be $$n$$, then, to check that a list of $$n$$ vectors is a basis, it is enough to check whether it spans $$V$$ (resp. is linearly independent).
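Points 2 and 3 give a practical test: in $$\mathbb{F}^n$$, a list of $$n$$ vectors is a basis exactly when the matrix having those vectors as columns has rank $$n$$, since full rank means the list is linearly independent (and hence, by Point 3, a basis). A small numerical sketch, using an example of our own choosing:

```python
import numpy as np

# Three vectors in R^3, placed as the columns of a matrix.
A = np.column_stack([[1, 0, 1], [0, 1, 1], [1, 1, 0]])

# Since dim(R^3) = 3, a list of 3 vectors is a basis iff it is linearly
# independent, i.e. iff the matrix has rank 3 (Theorem 5.4.4, Point 3).
print(np.linalg.matrix_rank(A) == 3)  # True
```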

Proof.

To prove Point~1, first note that $$U$$ is necessarily finite-dimensional (otherwise we could find a list of linearly independent vectors longer than $$\dim(V)$$). Therefore, by Corollary 5.3.6, $$U$$ has a basis, say $$(u_1,\ldots,u_m)$$. This list is linearly independent in both $$U$$ and $$V$$. By the Basis Extension Theorem 5.3.7, we can extend $$(u_1,\ldots,u_m)$$ to a basis for $$V$$, which is of length $$n$$ since $$\dim(V)=n$$. This implies that $$m\le n$$, as desired.

To prove Point~2, suppose that $$(v_1,\ldots,v_n)$$ spans $$V$$. Then, by the Basis Reduction Theorem 5.3.4, this list can be reduced to a basis. However, every basis of $$V$$ has length $$n$$; hence, no vector needs to be removed from $$(v_1,\ldots,v_n)$$. It follows that $$(v_1,\ldots,v_n)$$ is already a basis of $$V$$.

Point~3 is proven in a very similar fashion. Suppose $$(v_1,\ldots,v_n)$$ is linearly independent. By the Basis Extension Theorem 5.3.7, this list can be extended to a basis. However, every basis has length $$n$$; hence, no vector needs to be added to $$(v_1,\ldots,v_n)$$. It follows that $$(v_1,\ldots,v_n)$$ is already a basis of $$V$$.

We conclude this chapter with some additional interesting results on bases and dimensions. The first one combines the concepts of basis and direct sum.

Theorem 5.4.5.  Let $$U\subset V$$ be a subspace of a finite-dimensional vector space $$V$$. Then there exists a subspace $$W\subset V$$ such that $$V=U\oplus W$$.

Proof.

Let $$(u_1,\ldots,u_m)$$ be a basis of $$U$$. By Theorem 5.4.4(1), we know that $$m\le \dim(V)$$. Hence, by the Basis Extension Theorem 5.3.7, $$(u_1,\ldots,u_m)$$ can be extended to a basis $$(u_1,\ldots,u_m,w_1,\ldots,w_n)$$ of $$V$$. Let $$W=\Span(w_1,\ldots,w_n)$$.

To show that $$V=U\oplus W$$, we need to show that $$V=U+W$$ and $$U\cap W=\{0\}$$. Since $$V=\Span(u_1,\ldots,u_m,w_1,\ldots,w_n)$$ where $$(u_1,\ldots,u_m)$$ spans $$U$$ and $$(w_1,\ldots,w_n)$$ spans $$W$$, it is clear that $$V=U+W$$.

To show that $$U\cap W=\{0\}$$, let $$v\in U\cap W$$. Then there exist scalars $$a_1,\ldots,a_m, b_1,\ldots,b_n\in\mathbb{F}$$ such that

$v=a_1 u_1+\cdots+ a_m u_m = b_1 w_1 + \cdots + b_n w_n,$

or equivalently that

$a_1 u_1+\cdots+ a_m u_m -b_1 w_1 - \cdots - b_n w_n =0.$

Since $$(u_1,\ldots,u_m,w_1,\ldots,w_n)$$ forms a basis of $$V$$ and hence is linearly independent, the only solution to this equation is $$a_1=\cdots=a_m=b_1=\cdots=b_n=0$$. Hence $$v=0$$, proving that indeed $$U\cap W=\{0\}$$.
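The proof's construction can be carried out concretely in $$\mathbb{R}^n$$: extend a basis of $$U$$ by greedily adding standard basis vectors that preserve linear independence; the added vectors then span a complement $$W$$. A sketch (the function name and example are ours):

```python
import numpy as np

def complement_basis(U_basis, n):
    """Extend a basis of U in R^n to a basis of R^n using standard basis
    vectors; return the added vectors, which span a complement W."""
    basis = [np.asarray(u, dtype=float) for u in U_basis]
    W_basis = []
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0
        M = np.column_stack(basis + [e])
        # Keep e_j only if it is independent of the list built so far.
        if np.linalg.matrix_rank(M) == len(basis) + 1:
            basis.append(e)
            W_basis.append(e)
    return W_basis

# U = span((1, 1, 0)) in R^3; the loop adds e1 and e2, so W = span(e1, e2)
# and R^3 = U ⊕ W.
W = complement_basis([[1, 1, 0]], 3)
print(len(W))  # 2
```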

Theorem 5.4.6.  If $$U,W\subset V$$ are subspaces of a finite-dimensional vector space, then

$\dim(U+W) = \dim(U) + \dim(W) - \dim(U\cap W).$

Proof.

Let $$(v_1,\ldots,v_n)$$ be a basis of $$U\cap W$$. By the Basis Extension Theorem 5.3.7, there exist $$(u_1,\ldots,u_k)$$ and $$(w_1,\ldots,w_\ell)$$ such that $$(v_1,\ldots,v_n,u_1,\ldots,u_k)$$ is a basis of $$U$$ and $$(v_1,\ldots,v_n,w_1,\ldots,w_\ell)$$ is a basis of $$W$$. It suffices to show that

$\mathcal{B} = (v_1,\ldots,v_n,u_1,\ldots,u_k,w_1,\ldots,w_\ell)$

is a basis of $$U+W$$ since then

$\dim(U+W) = n+k+\ell= (n+k) + (n+\ell)-n=\dim(U) + \dim(W) -\dim(U\cap W).$

Clearly $$\Span(v_1,\ldots,v_n,u_1,\ldots,u_k,w_1,\ldots,w_\ell)$$ contains $$U$$ and $$W$$, and hence $$U+W$$. To show that $$\mathcal{B}$$ is a basis, it remains to show that $$\mathcal{B}$$ is linearly independent. Suppose

$a_1v_1+\cdots+a_n v_n + b_1u_1+\cdots +b_k u_k + c_1w_1+\cdots+c_\ell w_\ell =0, \tag{5.4.1}$

and let $$u=a_1v_1+\cdots+a_n v_n + b_1u_1+\cdots +b_k u_k\in U$$. Then, by Equation (5.4.1), we also have that $$u=-c_1 w_1-\cdots - c_\ell w_\ell\in W$$, which implies that $$u\in U\cap W$$. Hence, there exist scalars $$a_1',\ldots,a_n'\in\mathbb{F}$$ such that $$u=a_1'v_1+\cdots+a_n'v_n$$.

Since there is a unique linear combination of the linearly independent vectors $$(v_1,\ldots,v_n,u_1,\ldots,u_k)$$ that describes $$u$$, we must have $$b_1=\cdots=b_k=0$$ and $$a_1=a_1',\ldots,a_n=a_n'$$. Since $$(v_1,\ldots,v_n,w_1,\ldots,w_\ell)$$ is also linearly independent, it further follows that $$a_1=\cdots=a_n=c_1=\cdots=c_\ell=0$$. Hence, Equation (5.4.1) only has the trivial solution, which implies that $$\mathcal{B}$$ is a basis.
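The dimension formula is easy to check numerically on a concrete example of our own: in $$\mathbb{R}^4$$ take $$U=\Span(e_1,e_2)$$ and $$W=\Span(e_2,e_3)$$, so that $$U\cap W=\Span(e_2)$$ by construction. Dimensions of column spans are matrix ranks:

```python
import numpy as np

U = np.eye(4)[:, [0, 1]]  # columns e1, e2, so dim(U) = 2
W = np.eye(4)[:, [1, 2]]  # columns e2, e3, so dim(W) = 2

dim_U = np.linalg.matrix_rank(U)
dim_W = np.linalg.matrix_rank(W)
# U + W is spanned by all four columns together.
dim_sum = np.linalg.matrix_rank(np.hstack([U, W]))  # 3

# By Theorem 5.4.6: dim(U ∩ W) = dim(U) + dim(W) - dim(U + W).
print(dim_U + dim_W - dim_sum)  # 1, matching U ∩ W = span(e2)
```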

Both hardbound and softbound versions of this textbook are available online at WorldScientific.com.