$$\newcommand{\id}{\mathrm{id}}$$ $$\newcommand{\Span}{\mathrm{span}}$$ $$\newcommand{\kernel}{\mathrm{null}\,}$$ $$\newcommand{\range}{\mathrm{range}\,}$$ $$\newcommand{\RealPart}{\mathrm{Re}}$$ $$\newcommand{\ImaginaryPart}{\mathrm{Im}}$$ $$\newcommand{\Argument}{\mathrm{Arg}}$$ $$\newcommand{\norm}[1]{\| #1 \|}$$ $$\newcommand{\inner}[2]{\langle #1, #2 \rangle}$$ $$\newcommand{\Span}{\mathrm{span}}$$

# 5.2: The Matrix of a Linear Transformation I


Learning Objectives

1. Find the matrix of a linear transformation with respect to the standard basis.
2. Determine the action of a linear transformation on a vector in $$\mathbb{R}^n$$.

In the above examples, the action of the linear transformations was to multiply by a matrix. It turns out that this is always the case for linear transformations. If $$T$$ is any linear transformation which maps $$\mathbb{R}^{n}$$ to $$\mathbb{R}^{m},$$ there is always an $$m\times n$$ matrix $$A$$ with the property that $T\left(\vec{x}\right) = A\vec{x} \label{matrixoftransf}$ for all $$\vec{x} \in \mathbb{R}^{n}$$.

Theorem $$\PageIndex{1}$$: Matrix of a Linear Transformation

Let $$T:\mathbb{R}^{n}\mapsto \mathbb{R}^{m}$$ be a linear transformation. Then we can find a matrix $$A$$ such that $$T(\vec{x}) = A\vec{x}$$. In this case, we say that $$T$$ is determined or induced by the matrix $$A$$.

Here is why. Suppose $$T:\mathbb{R}^{n}\mapsto \mathbb{R}^{m}$$ is a linear transformation and you want to find the matrix defined by this linear transformation as described in [matrixoftransf]. Note that $\vec{x} =\bigg( \begin{array}{c} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{array} \bigg) = x_{1}\bigg( \begin{array}{c} 1 \\ 0 \\ \vdots \\ 0 \end{array} \bigg) + x_{2}\bigg( \begin{array}{c} 0 \\ 1 \\ \vdots \\ 0 \end{array} \bigg) +\cdots + x_{n}\bigg( \begin{array}{c} 0 \\ 0 \\ \vdots \\ 1 \end{array} \bigg) = \sum_{i=1}^{n}x_{i}\vec{e}_{i}$ where $$\vec{e}_{i}$$ is the $$i^{th}$$ column of $$I_n$$, that is the $$n \times 1$$ vector which has zeros in every slot but the $$i^{th}$$ and a 1 in this slot.

Then since $$T$$ is linear, \begin{aligned} T\left( \vec{x} \right)&=\sum_{i=1}^{n}x_{i}T\left( \vec{e}_{i}\right) \\ &=\bigg( \begin{array}{ccc} | & & | \\ T\left( \vec{e}_{1}\right) & \cdots & T\left( \vec{e}_{n}\right) \\ | & & | \end{array} \bigg) \bigg( \begin{array}{c} x_{1} \\ \vdots \\ x_{n} \end{array} \bigg) \\ &= A\bigg( \begin{array}{c} x_{1} \\ \vdots \\ x_{n} \end{array} \bigg)\end{aligned} The desired matrix is obtained by constructing the $$i^{th}$$ column as $$T\left( \vec{e}_{i}\right) .$$ Recall that the set $$\left\{ \vec{e}_1, \vec{e}_2, \cdots, \vec{e}_n \right\}$$ is called the standard basis of $$\mathbb{R}^n$$. Therefore the matrix of $$T$$ is found by applying $$T$$ to the standard basis. We state this formally as the following theorem.

Theorem $$\PageIndex{2}$$: Matrix of a Linear Transformation

Let $$T: \mathbb{R}^{n} \mapsto \mathbb{R}^{m}$$ be a linear transformation. Then the matrix $$A$$ satisfying $$T\left(\vec{x}\right)=A\vec{x}$$ is given by $A= \bigg( \begin{array}{ccc} | & & | \\ T\left( \vec{e}_{1}\right) & \cdots & T\left( \vec{e}_{n}\right) \\ | & & | \end{array} \bigg)$ where $$\vec{e}_{i}$$ is the $$i^{th}$$ column of $$I_n$$, and then $$T\left( \vec{e}_{i} \right)$$ is the $$i^{th}$$ column of $$A$$.
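This theorem translates directly into a computation: apply $$T$$ to each standard basis vector and use the results as the columns of $$A$$. A minimal sketch in Python, assuming NumPy is available; the map `T` here is a hypothetical example, not one from the text:

```python
import numpy as np

# A hypothetical linear map T : R^3 -> R^2 (not from the text),
# given only as a function.
def T(x):
    x1, x2, x3 = x
    return np.array([x1 + 2 * x2, 3 * x3 - x2])

# Column i of A is T(e_i), where e_i is the i-th column of I_3
# (the rows of np.eye(3) are exactly the standard basis vectors).
A = np.column_stack([T(e) for e in np.eye(3)])
# A is [[1, 2, 0], [0, -1, 3]]
```

For any $$\vec{x}$$, `A @ x` then agrees with `T(x)`, as the theorem asserts.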

The following Corollary is an essential result.

Corollary $$\PageIndex{1}$$: Matrix and Linear Transformation

A transformation $$T:\mathbb{R}^n\rightarrow \mathbb{R}^m$$ is a linear transformation if and only if it is a matrix transformation.

Consider the following example.

Example $$\PageIndex{1}$$: The Matrix of a Linear Transformation

Suppose $$T$$ is a linear transformation, $$T:\mathbb{R}^{3}\rightarrow \mathbb{ R}^{2}$$ where $T\bigg( \begin{array}{r} 1 \\ 0 \\ 0 \end{array} \bigg) =\bigg( \begin{array}{r} 1 \\ 2 \end{array} \bigg) ,\ T\bigg( \begin{array}{r} 0 \\ 1 \\ 0 \end{array} \bigg) =\bigg( \begin{array}{r} 9 \\ -3 \end{array} \bigg) ,\ T\bigg( \begin{array}{r} 0 \\ 0 \\ 1 \end{array} \bigg) =\bigg( \begin{array}{r} 1 \\ 1 \end{array} \bigg)$ Find the matrix $$A$$ of $$T$$ such that $$T \left( \vec{x} \right)=A\vec{x}$$ for all $$\vec{x}$$.

Solution

By Theorem [thm:matrixoflineartransformation] we construct $$A$$ as follows: $A = \bigg( \begin{array}{ccc} | & & | \\ T\left( \vec{e}_{1}\right) & \cdots & T\left( \vec{e}_{n}\right) \\ | & & | \end{array} \bigg)$

In this case, $$A$$ will be a $$2 \times 3$$ matrix, so we need to find $$T \left(\vec{e}_1 \right), T \left(\vec{e}_2 \right),$$ and $$T \left(\vec{e}_3 \right)$$. Luckily, we have been given these values so we can fill in $$A$$ as needed, using these vectors as the columns of $$A$$. Hence, $A=\bigg( \begin{array}{rrr} 1 & 9 & 1 \\ 2 & -3 & 1 \end{array} \bigg)$

In this example, we were given the resulting vectors of $$T \left(\vec{e}_1 \right), T \left(\vec{e}_2 \right),$$ and $$T \left(\vec{e}_3 \right)$$. Constructing the matrix $$A$$ was simple, as we could simply use these vectors as the columns of $$A$$. The next example shows how to find $$A$$ when we are not given the $$T \left(\vec{e}_i \right)$$ so clearly.

Example $$\PageIndex{2}$$: The Matrix of a Linear Transformation: Inconveniently Defined

Suppose $$T$$ is a linear transformation, $$T:\mathbb{R}^{2}\rightarrow \mathbb{R}^{2}$$ and $T\bigg( \begin{array}{r} 1 \\ 1 \end{array} \bigg) =\bigg( \begin{array}{r} 1 \\ 2 \end{array} \bigg) ,\ T\bigg( \begin{array}{r} 0 \\ -1 \end{array} \bigg) =\bigg( \begin{array}{r} 3 \\ 2 \end{array} \bigg)$ Find the matrix $$A$$ of $$T$$ such that $$T \left( \vec{x} \right)=A\vec{x}$$ for all $$\vec{x}$$.

Solution

By Theorem [thm:matrixoflineartransformation] to find this matrix, we need to determine the action of $$T$$ on $$\vec{e}_{1}$$ and $$\vec{e}_{2}$$. In Example [exa:matrixoflineartransformation], we were given these resulting vectors. However, in this example, we have been given $$T$$ of two different vectors. How can we find out the action of $$T$$ on $$\vec{e}_{1}$$ and $$\vec{e}_{2}$$? In particular for $$\vec{e}_{1}$$, suppose there exist $$x$$ and $$y$$ such that $\bigg( \begin{array}{r} 1 \\ 0 \end{array} \bigg) = x\bigg( \begin{array}{r} 1\\ 1 \end{array} \bigg) +y\bigg( \begin{array}{r} 0 \\ -1 \end{array} \bigg) \label{matrixvalues}$

Then, since $$T$$ is linear, $T\bigg( \begin{array}{r} 1 \\ 0 \end{array} \bigg) = x T\bigg( \begin{array}{r} 1 \\ 1 \end{array} \bigg) +y T\bigg( \begin{array}{r} 0 \\ -1 \end{array} \bigg)$

Substituting in values, this sum becomes $T\bigg( \begin{array}{r} 1 \\ 0 \end{array} \bigg) = x\bigg( \begin{array}{r} 1 \\ 2 \end{array} \bigg) +y\bigg( \begin{array}{r} 3 \\ 2 \end{array} \bigg) \label{matrixvalues2}$

Therefore, if we know the values of $$x$$ and $$y$$ which satisfy [matrixvalues], we can substitute these into equation [matrixvalues2]. By doing so, we find $$T\left(\vec{e}_1\right)$$ which is the first column of the matrix $$A$$.

We proceed to find $$x$$ and $$y$$. We do so by solving [matrixvalues], which can be done by solving the system $\begin{array}{c} x = 1 \\ x - y = 0 \end{array}$

We see that $$x=1$$ and $$y=1$$ is the solution to this system. Substituting these values into equation [matrixvalues2], we have $T\bigg( \begin{array}{r} 1 \\ 0 \end{array} \bigg) = 1 \bigg( \begin{array}{r} 1 \\ 2 \end{array} \bigg) + 1 \bigg( \begin{array}{r} 3 \\ 2 \end{array} \bigg) = \bigg( \begin{array}{r} 1 \\ 2 \end{array} \bigg) + \bigg( \begin{array}{r} 3 \\ 2 \end{array} \bigg) = \bigg( \begin{array}{r} 4 \\ 4 \end{array} \bigg)$

Therefore $$\bigg( \begin{array}{r} 4 \\ 4 \end{array} \bigg)$$ is the first column of $$A$$.

Computing the second column is done in the same way, and is left as an exercise.

The resulting matrix $$A$$ is given by $A = \bigg( \begin{array}{rr} 4 & -3 \\ 4 & -2 \end{array} \bigg)$
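The column-by-column computation in this example can be cross-checked numerically: for each standard basis vector, solve for its coordinates in terms of the given input vectors, then take the same combination of their images. A sketch assuming NumPy, using the data of this example:

```python
import numpy as np

# Columns are the given input vectors (1,1) and (0,-1).
P = np.array([[1.0, 0.0],
              [1.0, -1.0]])
# Columns are their images under T: (1,2) and (3,2).
B = np.array([[1.0, 3.0],
              [2.0, 2.0]])

# For each standard basis vector e_i, solve P c = e_i for the
# coordinates c = (x, y); then T(e_i) = B c is column i of A.
A = np.column_stack([B @ np.linalg.solve(P, e) for e in np.eye(2)])
# A is [[4, -3], [4, -2]], matching the example.
```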

This example illustrates a very long procedure for finding the matrix $$A$$ of a linear transformation. While this method is reliable and will always produce the correct matrix, the following procedure provides a shorter alternative.

Procedure $$\PageIndex{1}$$: Finding the Matrix of an Inconveniently Defined Linear Transformation

Suppose $$T:\mathbb{R}^{n}\rightarrow \mathbb{R}^{m}$$ is a linear transformation. Suppose there exist vectors $$\left\{ \vec{a}_{1},\cdots ,\vec{a}_{n}\right\}$$ in $$\mathbb {R}^{n}$$ such that $$\bigg( \begin{array}{ccc} \vec{a}_{1} & \cdots & \vec{a}_{n} \end{array} \bigg) ^{-1}$$ exists, and $T \left(\vec{a}_{i}\right)=\vec{b}_{i}$ Then the matrix of $$T$$ must be of the form $\bigg( \begin{array}{ccc} \vec{b}_{1} & \cdots & \vec{b}_{n} \end{array} \bigg) \bigg( \begin{array}{ccc} \vec{a}_{1} & \cdots & \vec{a}_{n} \end{array} \bigg) ^{-1}$
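This procedure is a single matrix formula. A sketch in Python (assuming NumPy), applied to the data of Example $$\PageIndex{2}$$ above, where it reproduces the matrix found the long way:

```python
import numpy as np

# Columns are the input vectors a_1 = (1,1), a_2 = (0,-1).
P = np.array([[1.0, 0.0],
              [1.0, -1.0]])
# Columns are their images b_1 = (1,2), b_2 = (3,2).
B = np.array([[1.0, 3.0],
              [2.0, 2.0]])

# The procedure: A = (b_1 ... b_n)(a_1 ... a_n)^{-1}
A = B @ np.linalg.inv(P)
# A is [[4, -3], [4, -2]], as found the long way above.
```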

We will illustrate this procedure in the following example. You may also find it useful to work through Example [exa:2x2inconvenientmatrixoflintransf] using this procedure.

Example $$\PageIndex{3}$$: Matrix of a Linear Transformation Given Inconveniently

Suppose $$T:\mathbb{R}^{3}\rightarrow \mathbb{R}^{3}$$ is a linear transformation and $T\bigg( \begin{array}{r} 1 \\ 3 \\ 1 \end{array} \bigg) =\bigg( \begin{array}{r} 0 \\ 1 \\ 1 \end{array} \bigg) ,T\bigg( \begin{array}{r} 0 \\ 1 \\ 1 \end{array} \bigg) =\bigg( \begin{array}{r} 2 \\ 1 \\ 3 \end{array} \bigg) ,T\bigg( \begin{array}{r} 1 \\ 1 \\ 0 \end{array} \bigg) =\bigg( \begin{array}{r} 0 \\ 0 \\ 1 \end{array} \bigg)$ Find the matrix of this linear transformation.

Solution

By Procedure [proc:findingmatrixoflineartransformation], we let $$A= \bigg( \begin{array}{rrr} 1 & 0 & 1 \\ 3 & 1 & 1 \\ 1 & 1 & 0 \end{array} \bigg)$$, the matrix whose columns are the given input vectors, and $$B=\bigg( \begin{array}{rrr} 0 & 2 & 0 \\ 1 & 1 & 0 \\ 1 & 3 & 1 \end{array} \bigg)$$, the matrix whose columns are their images.

Then, Procedure [proc:findingmatrixoflineartransformation] claims that the matrix of $$T$$ is $C= BA^{-1} =\bigg( \begin{array}{rrr} 2 & -2 & 4 \\ 0 & 0 & 1 \\ 4 & -3 & 6 \end{array} \bigg)$

Indeed, you can first verify that $$T(\vec{x})=C\vec{x}$$ for the three vectors above:

$\bigg( \begin{array}{ccc} 2 & -2 & 4 \\ 0 & 0 & 1 \\ 4 & -3 & 6 \end{array} \bigg) \bigg( \begin{array}{c} 1 \\ 3 \\ 1 \end{array} \bigg) =\bigg( \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \bigg) ,\ \bigg( \begin{array}{ccc} 2 & -2 & 4 \\ 0 & 0 & 1 \\ 4 & -3 & 6 \end{array} \bigg) \bigg( \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \bigg) =\bigg( \begin{array}{c} 2 \\ 1 \\ 3 \end{array} \bigg)$ $\bigg( \begin{array}{ccc} 2 & -2 & 4 \\ 0 & 0 & 1 \\ 4 & -3 & 6 \end{array} \bigg) \bigg( \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \bigg) =\bigg( \begin{array}{c} 0 \\ 0 \\ 1 \end{array} \bigg)$
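This check can also be carried out numerically. A sketch assuming NumPy, using the matrices $$A$$ and $$B$$ of this example:

```python
import numpy as np

# Columns are the input vectors a_i from this example.
A = np.array([[1.0, 0.0, 1.0],
              [3.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
# Columns are their images b_i.
B = np.array([[0.0, 2.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0]])

C = B @ np.linalg.inv(A)
# C is [[2, -2, 4], [0, 0, 1], [4, -3, 6]], and C @ A == B
# confirms that C maps each a_i to b_i.
```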

But more generally $$T(\vec{x})= C\vec{x}$$ for any $$\vec{x}$$. To see this, let $$\vec{y}=A^{-1}\vec{x}$$, so that $$\vec{x}=A\vec{y}$$, and then use the linearity of $$T$$: $T(\vec{x})= T(A\vec{y}) = T \left( \sum_i y_{i}\vec{a}_{i} \right) = \sum_i y_{i} T(\vec{a}_{i}) = \sum_i y_{i} \vec{b}_{i} = B\vec{y} = BA^{-1}\vec{x} = C\vec{x}$

Recall the dot product discussed earlier. Consider the map $$\vec{v}\mapsto \mathrm{proj}_{\vec{u}}\left( \vec{v}\right)$$ which takes a vector and transforms it to its projection onto a given vector $$\vec{u}$$. It turns out that this map is linear, a result which follows from the properties of the dot product. This is shown as follows. \begin{aligned} \mathrm{proj}_{\vec{u}}\left( k \vec{v}+ p \vec{w}\right) &=\left( \frac{(k \vec{v}+ p \vec{w})\cdot \vec{u}}{ \vec{u}\cdot \vec{u}}\right) \vec{u} \\ &= k \left( \frac{ \vec{v}\cdot \vec{u}}{\vec{u}\cdot \vec{u}}\right) \vec{u}+p \left( \frac{\vec{w}\cdot \vec{u}}{\vec{u}\cdot \vec{u}}\right) \vec{u} \\ &= k \; \mathrm{proj}_{\vec{u}}\left( \vec{v}\right) +p \; \mathrm{proj}_{\vec{u}}\left( \vec{w}\right) \end{aligned}

Consider the following example.

Example $$\PageIndex{4}$$: Matrix of a Projection Map

Let $$\vec{u} = \bigg( \begin{array}{r} 1 \\ 2 \\ 3 \end{array} \bigg)$$ and let $$T$$ be the projection map $$T: \mathbb{R}^3 \mapsto \mathbb{R}^3$$ defined by $T(\vec{v}) = \mathrm{proj}_{\vec{u}}\left( \vec{v}\right)$ for any $$\vec{v} \in \mathbb{R}^3$$.

1. Does this transformation come from multiplication by a matrix?
2. If so, what is the matrix?

Solution

1. First, we have just seen that $$T (\vec{v}) = \mathrm{proj}_{\vec{u}}\left( \vec{v}\right)$$ is linear. Therefore by Theorem [thm:matrixlintransf], we can find a matrix $$A$$ such that $$T(\vec{x}) = A\vec{x}$$.
2. The columns of the matrix for $$T$$ are defined above as $$T(\vec{e}_{i})$$. It follows that $$T(\vec{e}_{i}) = \mathrm{proj} _{\vec{u}}\left( \vec{e}_{i}\right)$$ gives the $$i^{th}$$ column of the desired matrix. Therefore, we need to find $\mathrm{proj}_{\vec{u}}\left( \vec{e}_{i}\right) = \left( \frac{\vec{e}_{i}\cdot \vec{u}}{\vec{u}\cdot \vec{u}}\right) \vec{u}$ For the given vector $$\vec{u}$$, this implies the columns of the desired matrix are $\frac{1}{14}\bigg( \begin{array}{r} 1 \\ 2 \\ 3 \end{array} \bigg) , \frac{2}{14}\bigg( \begin{array}{r} 1 \\ 2 \\ 3 \end{array} \bigg) , \frac{3}{14}\bigg( \begin{array}{r} 1 \\ 2 \\ 3 \end{array} \bigg)$ which you can verify. Hence the matrix of $$T$$ is $\frac{1}{14}\bigg( \begin{array}{rrr} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 3 & 6 & 9 \end{array} \bigg)$