Mathematics LibreTexts

13.2: Change of Basis



    Suppose we have two ordered bases \(S=(v_{1}, \ldots, v_{n} )\) and \(S'=(v'_{1}, \ldots, v'_{n} )\) for a vector space \(V\). (Here \(v_{i}\) and \(v'_{i}\) are \(\textit{vectors}\), not components of vectors in a basis!) Then we may write each \(v'_{j}\) uniquely as a linear combination of the \(v_{i}\):

    \[v'_{j} = \sum_{i} v_{i}p^{i}_{j}\, ,\]

    or in matrix notation

    $$\begin{pmatrix}v'_{1} , v'_{2} , \cdots , v'_{n}\end{pmatrix} = \begin{pmatrix}v_{1} , v_{2} , \cdots , v_{n}\end{pmatrix}\begin{pmatrix}p^{1}_{1} & p^{1}_{2} & \cdots & p^{1}_{n} \\ p^{2}_{1} & p^{2}_{2} && \\ \vdots &&& \vdots \\ p^{n}_{1} && \cdots & p^{n}_{n}\end{pmatrix}$$

    Here, the \(p^{i}_{j}\) are constants, which we can regard as entries of a square matrix \(P=(p^{i}_{j})\). The matrix \(P\) must have an inverse, since we can also write each \(v_{i}\) uniquely as a linear combination of the \(v'_{j}\):

    \[v_{j} = \sum_{k} v'_{k}q^{k}_{j}.\]

    Substituting this into the expression for \(v'_{j}\), we can write:

    \[v'_{j} = \sum_{i} v_{i}p^{i}_{j} = \sum_{i}\sum_{k} v'_{k}q^{k}_{i}p^{i}_{j} = \sum_{k} v'_{k}\sum_{i} q^{k}_{i}p^{i}_{j}.\]

    But \(\sum_{i} q^{k}_{i}p^{i}_{j}\) is the \(k,j\) entry of the product matrix \(QP\). Since the expression for \(v'_{j}\) in the basis \(S'\) is \(v'_{j}\) itself, \(QP\) maps each \(v'_{j}\) to itself. As a result, each \(v'_{j}\) is an eigenvector of \(QP\) with eigenvalue \(1\), so \(QP\) is the identity, \(\textit{i.e.}\)

    $$PQ=QP=I \leftrightarrow Q=P^{-1}\, .$$
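    As a quick numerical sanity check of \(PQ=QP=I\), here is a sketch in Python using numpy; the particular matrix \(P\) below is an arbitrary invertible example of our own choosing, not taken from the text:

```python
import numpy as np

# An arbitrary invertible change of basis matrix P (chosen for illustration).
P = np.array([[1.0, 2.0],
              [3.0, 5.0]])

# Q expresses the old basis in terms of the new one, so Q = P^{-1}.
Q = np.linalg.inv(P)

# Both products give the identity matrix, up to floating-point error.
print(np.allclose(Q @ P, np.eye(2)))  # True
print(np.allclose(P @ Q, np.eye(2)))  # True
```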

    The matrix \(P\) is called a \(\textit{change of basis}\) matrix. There is a quick and dirty trick to obtain it: look at the formula above relating the new basis vectors \(v'_{1},v'_{2},\ldots, v'_{n}\) to the old ones \(v_{1},v_{2},\ldots,v_{n}\). In particular, focus on \(v'_{1}\), for which

    $$
    v'_{1}= \begin{pmatrix}v_{1} , v_{2} , \cdots , v_{n}\end{pmatrix}
    \begin{pmatrix}p^{1}_{1}\\p^{2}_{1}\\\vdots \\ p^{n}_{1}
    \end{pmatrix}\, .
    $$

    This says that the first column of the change of basis matrix \(P\) is really just the components of the vector \(v'_{1}\) in the basis \(v_{1},v_{2},\ldots,v_{n}\), so:

    $$\textit{The columns of the change of basis matrix are the components of the new basis vectors in terms of the old basis vectors.}$$

    Example 120

    Suppose \(S'=(v'_{1},v'_{2})\) is an ordered basis for a vector space \(V\) and that with respect to some other ordered basis \(S=(v_{1}, v_{2})\) for \(V\) 
    $$
    v'_{1}=
    \begin{pmatrix}
    \frac{1}{\sqrt{2}}\\\frac{1}{\sqrt{2}}
    \end{pmatrix} _{S}
    \quad \mbox{and} \quad 
    v'_{2}=
    \begin{pmatrix}
    \frac{1}{\sqrt{3}}\\-\frac{1}{\sqrt{3}}
    \end{pmatrix}_{S} \, .
    $$
    This means 
    $$
    v'_{1}=\begin{pmatrix}v_{1} , v_{2} \end{pmatrix}\begin{pmatrix}
    \frac{1}{\sqrt{2}}\\ \frac{1}{\sqrt{2}}
    \end{pmatrix}
    =\frac{v_{1}+v_{2}}{\sqrt{2}}\quad\mbox{and}\quad
    v'_{2}=\begin{pmatrix}v_{1} , v_{2} \end{pmatrix}\begin{pmatrix}
    \frac{1}{\sqrt{3}}\\-\frac{1}{\sqrt{3}}
    \end{pmatrix}
    =\frac{v_{1}-v_{2}}{\sqrt{3}}\, .
    $$
    The change of basis matrix has as its columns just the components of \(v'_{1}\) and \(v'_{2}\):
    $$
    P= \begin{pmatrix}
    \frac{1}{\sqrt{2}}&\frac{1}{\sqrt{3}}\\
    \frac{1}{\sqrt{2}}&-\frac{1}{\sqrt{3}}
    \end{pmatrix}\, .
    $$
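    The column-assembly rule in Example 120 can be mirrored numerically; this sketch (numpy, with variable names of our own choosing) stacks the component vectors of \(v'_{1}\) and \(v'_{2}\) as the columns of \(P\):

```python
import numpy as np

# Components of the new basis vectors v'_1, v'_2 in the old basis S
# (the numbers from Example 120).
v1_prime = np.array([1/np.sqrt(2),  1/np.sqrt(2)])
v2_prime = np.array([1/np.sqrt(3), -1/np.sqrt(3)])

# The change of basis matrix has these component vectors as its columns.
P = np.column_stack([v1_prime, v2_prime])
print(P)  # columns are the components of v'_1 and v'_2
```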

    Changing basis changes the matrix of a linear transformation. However, as a map between vector spaces, \(\textit{the linear transformation is the same no matter which basis we use}\). Linear transformations are the actual objects of study of this book, not matrices; matrices are merely a convenient way of doing computations.

    Let's now calculate how the matrix of a linear transformation changes when changing basis. To wit, let \(L \colon V \longrightarrow W\) have matrix \(M=(m^{i}_{j})\) in the ordered input and output bases \(S=(v_{1}, \ldots, v_{n} )\) and \(T=(w_{1},\ldots,w_{m})\), so

    \[L(v_{i}) = \sum_{k} w_{k}m^{k}_{i}.\]

    Now, suppose \(S'=(v'_{1}, \ldots, v'_{n} )\) and \(T'=(w'_{1},\ldots,w'_{m})\) are new ordered input and output bases, in which \(L\) has matrix \(M'=({m'}^{k}_{i})\). Then

    \[L(v'_{i})= \sum_{k} w'_{k}{m'}^{k}_{i}\, .\]

    Let \(P=(p^{i}_{j})\) be the change of basis matrix from input basis \(S\) to the basis \(S'\) and \(Q=(q^{j}_{k})\) be the change of basis matrix from output basis \(T\) to the basis \(T'\). Then:

    \[L(v'_{j})=L\left(\sum_{i} v_{i} p^{i}_{j}\right) = \sum_{i} L(v_{i})p^{i}_{j}
    = \sum_{i} \sum_{k} w_{k} m^{k}_{i} p^{i}_{j}.\]

    Meanwhile, we have:

    \[L(v'_{j}) = \sum_{k}w'_{k}{m'}^{k}_{j} = \sum_{k} \sum_{i} w_{i} q^{i}_{k}{m'}^{k}_{j}.\]

    Since the expression for a vector in a basis is unique, the entries of \(MP\) must equal the entries of \(QM'\). In other words, \(MP = QM' \qquad \textit{or}\qquad M'=Q^{-1}MP.\)
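    The relation \(M'=Q^{-1}MP\) is easy to package as a small helper; the function below is a sketch of our own, not from the text:

```python
import numpy as np

def change_of_basis(M, P, Q):
    """Matrix of the same linear map after changing bases:
    P is the change of basis matrix for the input space,
    Q for the output space, and the new matrix is Q^{-1} M P."""
    return np.linalg.inv(Q) @ M @ P

# Sanity check: changing neither basis leaves M unchanged.
M = np.array([[1.0, 2.0, 3.0],
              [2.0, 1.0, 3.0]])
print(np.allclose(change_of_basis(M, np.eye(3), np.eye(2)), M))  # True
```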

    Example 121

    Let \(V\) be the space of polynomials in \(t\) of degree \(2\) or less and let \(L:V\to \mathbb{R}^{2}\) be the linear transformation determined by

    $$L(1)=\begin{pmatrix}1\\2\end{pmatrix}\, \quad L(t)=\begin{pmatrix}2\\1\end{pmatrix}\, ,\quad L(t^{2})=\begin{pmatrix}3\\3\end{pmatrix}\, .$$

    From this information we can immediately read off the matrix \(M\) of \(L\) in the bases \(S=(1,t,t^{2})\) and \(T=(e_{1},e_{2})\), the standard basis for \(\mathbb{R}^{2}\),
    because

    \begin{eqnarray*}\big(L(1),L(t),L(t^{2})\big)&=&(e_{1}+2 e_{2},2e_{1}+e_{2}, 3 e_{1}+3e_{2})\\&=&(e_{1},e_{2})\begin{pmatrix}1&2&3\\2&1&3\end{pmatrix}\, \Rightarrow \, M\ =\ 
    \begin{pmatrix}1&2&3\\2&1&3\end{pmatrix}\, .\end{eqnarray*}

    Now suppose we are more interested in the bases

    $$S'=(1+t,t+t^{2},1+t^{2})\, , \quad T'=\left(\begin{pmatrix}1\\2\end{pmatrix},\begin{pmatrix}2\\1\end{pmatrix}\right)=:(w_{1}',w_{2}')\, .$$

    To compute the new matrix \(M'\) of \(L\) we could simply calculate what \(L\) does to the new input basis vectors in terms of the new output basis vectors:

    \begin{eqnarray*}
    \big(L(1+t),L(t+t^{2}),L(1+t^{2})\big)&=&\left(\begin{pmatrix}1\\2\end{pmatrix}+\begin{pmatrix}2\\1\end{pmatrix},
    \begin{pmatrix}2\\1\end{pmatrix}+\begin{pmatrix}3\\3\end{pmatrix},\begin{pmatrix}1\\2\end{pmatrix}+\begin{pmatrix}3\\3\end{pmatrix}
    \right)\\
    &=&(w'_{1}+w'_{2},w'_{1}+2w'_{2},2w'_{1}+w'_{2})\\&=&(w'_{1},w'_{2})\begin{pmatrix}1&1&2\\1&2&1\end{pmatrix}\, \Rightarrow \, M'=\begin{pmatrix}1&1&2\\1&2&1\end{pmatrix}\, .
    \end{eqnarray*}
    Alternatively we could calculate the change of basis matrices \(P\) and \(Q\) by noting that

    $$(1+t,t+t^{2},1+t^{2})=(1,t,t^{2})\begin{pmatrix}1&0&1\\1&1&0\\0&1&1\end{pmatrix}\, \Rightarrow\, P=\begin{pmatrix}1&0&1\\1&1&0\\0&1&1\end{pmatrix}$$

    and

    $$(w'_{1},w'_{2})=(e_{1}+2e_{2},2e_{1}+e_{2})=(e_{1},e_{2})\begin{pmatrix}1&2\\2&1\end{pmatrix}\, \Rightarrow\, Q=\begin{pmatrix}1&2\\2&1\end{pmatrix}\, .$$

    Hence

    $$M'=Q^{-1}MP = -\frac{1}{3}\begin{pmatrix}1&-2\\-2&1\end{pmatrix}\begin{pmatrix}1&2&3\\2&1&3\end{pmatrix}
    \begin{pmatrix}1&0&1\\1&1&0\\0&1&1\end{pmatrix}=\begin{pmatrix}1&1&2\\1&2&1\end{pmatrix}\, .$$

    Notice that the change of basis matrices \(P\) and \(Q\) are both square and invertible. Also, since we really wanted \(Q^{-1}\), it would have been more efficient to write \((e_{1},e_{2})\) in terms of \((w'_{1},w'_{2})\), which yields \(Q^{-1}\) directly. Alternatively, one can check that \(MP=QM'\).
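    One can confirm Example 121's arithmetic numerically; the following sketch (numpy) reproduces \(M'=Q^{-1}MP\) and the check \(MP=QM'\):

```python
import numpy as np

# Data from Example 121.
M = np.array([[1, 2, 3],
              [2, 1, 3]])    # matrix of L in the bases S, T
P = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])    # change of basis matrix from S to S'
Q = np.array([[1, 2],
              [2, 1]])       # change of basis matrix from T to T'

M_prime = np.linalg.inv(Q) @ M @ P
print(np.allclose(M_prime, [[1, 1, 2], [1, 2, 1]]))  # True

# The identity MP = QM' holds as well.
print(np.allclose(M @ P, Q @ M_prime))               # True
```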
