3.5: Vector Spaces of a Matrix
3.5.1 Null Space
The null space of a matrix \(\text{A}\) is the vector space spanned by all vectors \(\text{x}\) that satisfy the matrix equation
\[\text{Ax}=0.\nonumber \]
If the matrix \(\text{A}\) is \(m\)-by-\(n\), then the column vector \(\text{x}\) is \(n\)-by-one and the null space of \(\text{A}\) is a subspace of \(\mathbb{R}^n\). If \(\text{A}\) is a square invertible matrix, then the null space consists of just the zero vector.
To find a basis for the null space of a noninvertible matrix, we bring \(\text{A}\) to reduced row echelon form. We demonstrate by example. Consider the three-by-five matrix given by
\[\text{A}=\left(\begin{array}{rrrrr}-3&6&-1&1&-7\\1&-2&2&3&-1\\2&-4&5&8&-4\end{array}\right).\nonumber \]
By judiciously permuting rows to simplify the arithmetic, one pathway to construct \(\text{rref}(\text{A})\) is
\[\begin{array}{l}\left(\begin{array}{rrrrr}-3&6&-1&1&-7\\1&-2&2&3&-1\\2&-4&5&8&-4\end{array}\right)\to\left(\begin{array}{rrrrr}1&-2&2&3&-1\\-3&6&-1&1&-7\\2&-4&5&8&-4\end{array}\right)\to \\ \left(\begin{array}{rrrrr}1&-2&2&3&-1\\0&0&5&10&-10\\0&0&1&2&-2\end{array}\right)\to\left(\begin{array}{rrrrr}1&-2&2&3&-1\\0&0&1&2&-2\\0&0&5&10&-10\end{array}\right)\to \\ \left(\begin{array}{rrrrr}1&-2&0&-1&3\\0&0&1&2&-2\\0&0&0&0&0\end{array}\right).\end{array}\nonumber \]
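As a quick sanity check of the hand computation, a computer algebra system can reproduce this reduced row echelon form. A minimal sketch using SymPy (assuming it is available) might look like:

```python
from sympy import Matrix

# The 3-by-5 matrix A from the example above.
A = Matrix([[-3, 6, -1, 1, -7],
            [ 1, -2,  2, 3, -1],
            [ 2, -4,  5, 8, -4]])

# rref() returns the reduced row echelon form together with
# the indices of the pivot columns.
R, pivots = A.rref()
print(R)       # matches the rref computed by hand
print(pivots)  # (0, 2): the first and third columns are pivot columns
```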
We can now write the matrix equation \(\text{Ax} = 0\) for the null space using \(\text{rref}(\text{A})\). Writing the variables associated with the pivot columns on the left-hand side of the equations, we have from the first and second rows
\[\begin{aligned}x_1&=2x_2+x_4-3x_5 \\ x_3&=-2x_4+2x_5.\end{aligned} \nonumber \]
Eliminating \(x_1\) and \(x_3\), we now write the general solution for vectors in the null space as
\[\left(\begin{array}{c}2x_2+x_4-3x_5 \\ x_2\\-2x_4+2x_5\\x_4\\x_5\end{array}\right)=x_2\left(\begin{array}{c}2\\1\\0\\0\\0\end{array}\right)+x_4\left(\begin{array}{r}1\\0\\-2\\1\\0\end{array}\right)+x_5\left(\begin{array}{r}-3\\0\\2\\0\\1\end{array}\right),\nonumber \]
where \(x_2\), \(x_4\), and \(x_5\) are called free variables, and can take any values.
The vector multiplying the free variable \(x_2\) has a one in the second row and all the other vectors have a zero in this row. Similarly, the vector multiplying the free variable \(x_4\) has a one in the fourth row and all the other vectors have a zero in this row, and so on. Therefore, these three vectors must be linearly independent and form a basis for the null space. The basis is given by
\[\left\{\left(\begin{array}{c}2\\1\\0\\0\\0\end{array}\right),\quad\left(\begin{array}{r}1\\0\\-2\\1\\0\end{array}\right),\quad\left(\begin{array}{r}-3\\0\\2\\0\\1\end{array}\right)\right\}.\nonumber \]
The null space is seen to be a three-dimensional subspace of \(\mathbb{R}^5\), and its dimension is equal to the number of free variables of \(\text{rref}(\text{A})\). The number of free variables is, of course, equal to the number of columns minus the number of pivot columns.
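The basis found above can be verified numerically. The sketch below (again assuming SymPy is available) checks that each basis vector is annihilated by \(\text{A}\) and that the null space is three-dimensional:

```python
from sympy import Matrix

A = Matrix([[-3, 6, -1, 1, -7],
            [ 1, -2,  2, 3, -1],
            [ 2, -4,  5, 8, -4]])

# The three basis vectors found from the free variables x2, x4, x5.
basis = [Matrix([2, 1, 0, 0, 0]),
         Matrix([1, 0, -2, 1, 0]),
         Matrix([-3, 0, 2, 0, 1])]

# Each basis vector satisfies Ax = 0.
for v in basis:
    assert A * v == Matrix([0, 0, 0])

# SymPy's nullspace() returns a basis of the same 3-dimensional space.
assert len(A.nullspace()) == 3
```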
3.5.2 Application of the Null Space
An underdetermined system of linear equations \(\text{Ax} = \text{b}\) with more unknowns than equations may not have a unique solution. If \(\text{u}\) is the general form of a vector in the null space of \(\text{A}\), and \(\text{v}\) is any vector that satisfies \(\text{Av} = \text{b}\), then \(\text{x} = \text{u} + \text{v}\) satisfies \(\text{Ax} = \text{A}(\text{u} + \text{v}) = \text{Au} + \text{Av} = 0 + \text{b} = \text{b}\). The general solution of \(\text{Ax} = \text{b}\) can therefore be written as the sum of a general vector in \(\text{Null}(\text{A})\) and a particular vector that satisfies the underdetermined system.
As an example, suppose we want to find the general solution to the linear system of two equations and three unknowns given by
\[\begin{aligned}2x_1+2x_2+x_3&=0,\\2x_1-2x_2-x_3&=1,\end{aligned} \nonumber \]
which in matrix form is given by
\[\left(\begin{array}{rrr}2&2&1\\2&-2&-1\end{array}\right)\left(\begin{array}{c}x_1\\x_2\\x_3\end{array}\right)=\left(\begin{array}{c}0\\1\end{array}\right).\nonumber \]
We first bring the augmented matrix to reduced row echelon form:
\[\left(\begin{array}{rrrr}2&2&1&0\\2&-2&-1&1\end{array}\right)\to\left(\begin{array}{rrrr}1&0&0&1/4\\0&1&1/2&-1/4\end{array}\right).\nonumber \]
The null space is determined from \(x_1 = 0\) and \(x_2 = −x_3/2\), and we can write
\[\text{Null}(\text{A})=\text{span}\left\{\left(\begin{array}{r}0\\-1\\2\end{array}\right)\right\}.\nonumber \]
A particular solution for the inhomogeneous system is found by solving \(x_1 = 1/4\) and \(x_2 + x_3/2 = −1/4\). Here, we simply take the free variable \(x_3\) to be zero, and we find \(x_1 = 1/4\) and \(x_2 = −1/4\). The general solution to the original underdetermined linear system is the sum of the null space and the particular solution and is given by
\[\left(\begin{array}{c}x_1\\x_2\\x_3\end{array}\right)=a\left(\begin{array}{r}0\\-1\\2\end{array}\right)+\frac{1}{4}\left(\begin{array}{r}1\\-1\\0\end{array}\right).\nonumber \]
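This general solution can be checked symbolically: for every value of the free parameter \(a\), the vector should satisfy the original system. A short SymPy sketch (assuming the library is available):

```python
from sympy import Matrix, Rational, symbols

A = Matrix([[2,  2,  1],
            [2, -2, -1]])
b = Matrix([0, 1])

a = symbols('a')
# General solution: null-space vector scaled by a, plus the particular solution.
x = a * Matrix([0, -1, 2]) + Rational(1, 4) * Matrix([1, -1, 0])

# A x simplifies to b for every value of a.
assert (A * x).expand() == b
```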
3.5.3 Column Space
The column space of a matrix is the vector space spanned by the columns of the matrix. When a matrix is multiplied by a column vector, the resulting vector is in the column space of the matrix, as can be seen from the example
\[\left(\begin{array}{cc}a&b\\c&d\end{array}\right)\left(\begin{array}{c}x\\y\end{array}\right)=\left(\begin{array}{c}ax+by\\cx+dy\end{array}\right)=x\left(\begin{array}{c}a\\c\end{array}\right)+y\left(\begin{array}{c}b\\d\end{array}\right).\nonumber \]
In general, \(\text{Ax}\) is a linear combination of the columns of \(\text{A}\), and the equation \(\text{Ax} = 0\) expresses the linear dependence of the columns of \(\text{A}\).
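The displayed identity can be confirmed with symbolic entries. A minimal SymPy sketch, using the same symbols \(a, b, c, d, x, y\) as the equation above:

```python
from sympy import symbols, Matrix

a, b, c, d, x, y = symbols('a b c d x y')
A = Matrix([[a, b],
            [c, d]])
v = Matrix([x, y])

# A v equals x times the first column of A plus y times the second column.
assert (A * v).expand() == (x * A.col(0) + y * A.col(1)).expand()
```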
Given an \(m\)-by-\(n\) matrix \(\text{A}\), what is the dimension of the column space of \(\text{A}\), and how do we find a basis? Note that since \(\text{A}\) has \(m\) rows, the column space of \(\text{A}\) is a subspace of \(\mathbb{R}^m\).
Fortunately, a basis for the column space of \(\text{A}\) can be found from \(\text{rref}(\text{A})\). Consider the example of §3.5.1, where
\[\text{A}=\left(\begin{array}{rrrrr}-3&6&-1&1&-7\\1&-2&2&3&-1\\2&-4&5&8&-4\end{array}\right),\nonumber \]
and
\[\text{rref}(\text{A})=\left(\begin{array}{rrrrr}1&-2&0&-1&3\\0&0&1&2&-2\\0&0&0&0&0\end{array}\right).\nonumber \]
The matrix equation \(\text{Ax} = 0\) is equivalent to \(\text{rref}(\text{A})\text{x} = 0\), and the latter equation can be expressed as
\[x_1\left(\begin{array}{c}1\\0\\0\end{array}\right)+x_2\left(\begin{array}{r}-2\\0\\0\end{array}\right)+x_3\left(\begin{array}{c}0\\1\\0\end{array}\right)+x_4\left(\begin{array}{r}-1\\2\\0\end{array}\right)+x_5\left(\begin{array}{r}3\\-2\\0\end{array}\right)=\left(\begin{array}{c}0\\0\\0\end{array}\right).\nonumber \]
Only the pivot columns of \(\text{rref}(\text{A})\), here the first and third columns, are linearly independent. For example, the second column is \(-2\) times the first column; and whatever linear dependence relations hold for \(\text{rref}(\text{A})\) hold true for the original matrix \(\text{A}\), since the two matrix equations \(\text{Ax} = 0\) and \(\text{rref}(\text{A})\text{x} = 0\) have the same solutions. (You can check this.) The dimension of the column space of \(\text{A}\) is therefore equal to the number of pivot columns of \(\text{A}\), and here it is two. A basis for the column space is given by the first and third columns of \(\text{A}\) (not \(\text{rref}(\text{A})\)), and is
\[\left\{\left(\begin{array}{r}-3\\1\\2\end{array}\right),\quad\left(\begin{array}{r}-1\\2\\5\end{array}\right)\right\}.\nonumber \]
Recall that the dimension of the null space is the number of non-pivot columns, so that the sum of the dimensions of the null space and the column space is equal to the total number of columns. A statement of this theorem is as follows. Let \(\text{A}\) be an \(m\)-by-\(n\) matrix. Then
\[\text{dim}(\text{Col}(\text{A}))+\text{dim}(\text{Null}(\text{A}))=n.\nonumber \]
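For the example matrix of §3.5.1, this relation between the dimensions can be verified directly. A sketch using SymPy (assuming it is available):

```python
from sympy import Matrix

A = Matrix([[-3, 6, -1, 1, -7],
            [ 1, -2,  2, 3, -1],
            [ 2, -4,  5, 8, -4]])

n = A.cols                      # number of columns (5)
dim_col = len(A.columnspace())  # dimension of the column space
dim_null = len(A.nullspace())   # dimension of the null space

assert dim_col == 2
assert dim_null == 3
assert dim_col + dim_null == n  # dim Col(A) + dim Null(A) = n
```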
3.5.4 Row Space, Left Null Space and Rank
In addition to the column space and the null space, a matrix \(\text{A}\) has two more vector spaces associated with it, namely the column space and null space of \(\text{A}^{\text{T}}\), which are called the row space and the left null space of \(\text{A}\).
If \(\text{A}\) is an \(m\)-by-\(n\) matrix, then the row space and the null space are subspaces of \(\mathbb{R}^n\), and the column space and the left null space are subspaces of \(\mathbb{R}^m\).
The null space consists of all vectors \(\text{x}\) such that \(\text{Ax} = 0\). Since \(\text{Ax} = 0\) means that every row of \(\text{A}\) has zero dot product with \(\text{x}\), the null space is the set of all vectors that are orthogonal to the row space of \(\text{A}\). We say that these two vector spaces are orthogonal.
A basis for the row space of a matrix can be found by computing \(\text{rref}(\text{A})\): the nonzero rows of \(\text{rref}(\text{A})\) (written as column vectors), that is, the rows containing pivots, form a basis. The dimension of the row space of \(\text{A}\) is therefore equal to the number of pivot columns, while the dimension of the null space of \(\text{A}\) is equal to the number of nonpivot columns. Together, these two subspaces span the vector space of all \(n\)-by-one matrices, and we say that these subspaces are orthogonal complements of each other.
Furthermore, the dimension of the column space of \(\text{A}\) is also equal to the number of pivot columns, so that the dimensions of the column space and the row space of a matrix are equal. We have
\[\text{dim}(\text{Col}(\text{A}))=\text{dim}(\text{Row}(\text{A})).\nonumber \]
We call this dimension the rank of the matrix \(\text{A}\). This is an amazing result since the column space and row space are subspaces of two different vector spaces. In general, we must have \(\text{rank}(\text{A}) \leq \min(m, n)\). When the equality holds, we say that the matrix is of full rank. When \(\text{A}\) is a square matrix of full rank, the dimension of the null space is zero and \(\text{A}\) is invertible.
We summarize our results in the table below. The null space of \(\text{A}^{\text{T}}\) is also called the left null space of \(\text{A}\) and the column space of \(\text{A}^{\text{T}}\) is also called the row space of \(\text{A}\). The null space of \(\text{A}\) and the row space of \(\text{A}\) are orthogonal complements as is the left null space of \(\text{A}\) and the column space of \(\text{A}\). The dimension of the column space of \(\text{A}\) is equal to the dimension of the row space of \(\text{A}\) and this dimension is called the rank of \(\text{A}\).
Table \(\PageIndex{1}\): The four fundamental subspaces of an \(m\)-by-\(n\) matrix
| vector space | subspace of | dimension |
|---|---|---|
| \(\text{Null}(\text{A})\) | \(\mathbb{R}^n\) | \(n\) - # of pivot columns |
| \(\text{Col}(\text{A})\) | \(\mathbb{R}^m\) | # of pivot columns |
| \(\text{Null}(\text{A}^{\text{T}})\) | \(\mathbb{R}^m\) | \(m\) - # of pivot columns |
| \(\text{Col}(\text{A}^{\text{T}})\) | \(\mathbb{R}^n\) | # of pivot columns |
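The dimension counts in the table can be checked for the example matrix of §3.5.1. A SymPy sketch (assuming the library is available) that computes all four dimensions:

```python
from sympy import Matrix

A = Matrix([[-3, 6, -1, 1, -7],
            [ 1, -2,  2, 3, -1],
            [ 2, -4,  5, 8, -4]])
m, n = A.shape   # A is 3-by-5
r = A.rank()     # rank = number of pivot columns

# Dimensions of the four fundamental subspaces, as in the table.
assert len(A.nullspace()) == n - r      # Null(A), a subspace of R^n
assert len(A.columnspace()) == r        # Col(A), a subspace of R^m
assert len(A.T.nullspace()) == m - r    # left null space, a subspace of R^m
assert len(A.T.columnspace()) == r      # row space, a subspace of R^n
```

For this matrix \(r = 2\), so the null space has dimension \(3\), the left null space has dimension \(1\), and the column and row spaces each have dimension \(2\).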