
# 13.4: Review Problems


1. Let $$P_{n}(t)$$ be the vector space of polynomials of degree $$n$$ or less, and let $$\frac{d}{dt} \colon P_{n}(t) \to P_{n}(t)$$ be the derivative operator. Find the matrix of $$\frac{d}{dt}$$ in the ordered bases $$E=(1,t,\ldots,t^{n})$$ for the domain and $$F=(1,t,\ldots,t^{n})$$ for the codomain. Determine whether this derivative operator is diagonalizable. $$\textit{Recall from chapter 6 that the derivative operator is linear.}$$

2. When writing a matrix for a linear transformation, we have seen that the choice of basis matters. In fact, even the order of the basis matters!
1. Write all possible reorderings of the standard basis $$(e_{1},e_{2},e_{3})$$ for $$\Re^{3}$$.
2. Write each change of basis matrix between the standard basis and each of its reorderings. Make as many observations as you can about these matrices: what are their entries? Do you notice anything about how many of each type of entry appears in each row and column? What are their determinants? (Note: These matrices are known as $$\textit{permutation matrices}$$.)
3. Given that $$L:\Re^{3}\to \Re^{3}$$ is the linear transformation defined by $L\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}2y-z\\3x\\2z+x+y\end{pmatrix}$

write the matrix $$M$$ for $$L$$ in the standard basis and in two reorderings of the standard basis. How are these matrices related?
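As a sanity check on the observations asked for in part 2, the six permutation matrices can be generated numerically. A minimal sketch using numpy (the variable names are illustrative, not part of the problem):

```python
# Sketch: build the six permutation matrices that reorder the
# standard basis (e1, e2, e3) of R^3, then inspect their
# entries and determinants.
from itertools import permutations

import numpy as np

I = np.eye(3)
for perm in permutations(range(3)):
    # The columns of P are the standard basis vectors in the new order.
    P = I[:, perm]
    det = round(np.linalg.det(P))
    print(perm, det)
    # Each row and each column contains exactly one 1 and two 0s,
    # and the determinant is always +1 or -1.
    assert det in (1, -1)
```

Running this should suggest the pattern the problem is pointing at: every permutation matrix has determinant $$\pm 1$$.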

3. Let $$X=\{\heartsuit,\clubsuit,\spadesuit\}\, ,\quad Y=\{*,\star\}\, .$$ Write down two different ordered bases for each of the vector spaces $$\mathbb{R}^{X}$$ and $$\mathbb{R}^{Y}$$: call them $$S,S'$$ and $$T,T'$$ respectively. Find the change of basis matrices $$P$$ and $$Q$$ that map these bases to one another. Now consider the map
$$\ell:Y\to X\, ,$$
where $$\ell(*)=\heartsuit$$ and $$\ell(\star)=\spadesuit$$. Show that $$\ell$$ can be used to define a linear transformation $$L:\mathbb{R}^{X}\to\mathbb{R}^{Y}$$. Compute the matrices $$M$$ and $$M'$$ of $$L$$ in the bases $$S,T$$ and then $$S',T'$$. Use your change of basis matrices $$P$$ and $$Q$$ to check that $$M'=Q^{-1}MP$$.
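The change-of-basis identity in the last sentence can be checked numerically. In the sketch below, $$M$$ is the matrix of $$L$$ in the "delta function" bases of $$\mathbb{R}^{X}$$ and $$\mathbb{R}^{Y}$$ (ordering $$X$$ as $$\heartsuit,\clubsuit,\spadesuit$$ and $$Y$$ as $$*,\star$$), while `P` and `Q` are arbitrary invertible matrices standing in for change-of-basis matrices; the specific random choices are illustrative only:

```python
import numpy as np

# M is the matrix of L : R^X -> R^Y in the delta-function bases:
# (Lf)(*) = f(heart) and (Lf)(star) = f(spade), so
M = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

rng = np.random.default_rng(0)
# Adding a large multiple of the identity makes these diagonally
# dominant, hence invertible -- valid stand-ins for change-of-basis matrices.
P = rng.random((3, 3)) + 3 * np.eye(3)  # change of basis on R^X
Q = rng.random((2, 2)) + 3 * np.eye(2)  # change of basis on R^Y

# Matrix of the same linear map in the new bases:
M_prime = np.linalg.inv(Q) @ M @ P

# Changing back recovers M, consistent with M' = Q^{-1} M P.
assert np.allclose(Q @ M_prime @ np.linalg.inv(P), M)
```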

4. Recall that $$tr MN = tr NM$$. Use this fact to show that the trace of a square matrix $$M$$ does not depend on the basis you used to compute $$M$$.
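Before proving the claim, it can be tested numerically. A minimal sketch, where the random `M` and invertible `P` are stand-ins for an arbitrary matrix and an arbitrary change of basis:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.random((4, 4))
# Diagonally dominant, hence invertible: a valid change of basis.
P = rng.random((4, 4)) + 4 * np.eye(4)

# The trace is unchanged by a change of basis:
# tr(P^{-1} M P) = tr(M P P^{-1}) = tr(M), using tr(AB) = tr(BA).
assert np.isclose(np.trace(np.linalg.inv(P) @ M @ P), np.trace(M))
```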

5. When is the $$2\times 2$$ matrix $$\begin{pmatrix}a & b \\c & d\end{pmatrix}$$ diagonalizable? Include examples in your answer.
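One way to gather examples for this problem is to test candidate matrices numerically: a matrix is diagonalizable exactly when its eigenvectors span the space, which can be detected by checking whether the eigenvector matrix is invertible. A heuristic sketch (the function name and tolerance are illustrative choices, not a standard API):

```python
import numpy as np

def seems_diagonalizable(A, tol=1e-9):
    """Heuristic numerical check: A is diagonalizable iff the matrix
    whose columns are its eigenvectors is invertible."""
    _, V = np.linalg.eig(A)
    return abs(np.linalg.det(V)) > tol

# Distinct eigenvalues: always diagonalizable.
assert seems_diagonalizable(np.array([[1.0, 2.0], [3.0, 4.0]]))
# Repeated eigenvalue with a 1-dimensional eigenspace: not diagonalizable.
assert not seems_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]]))
# Repeated eigenvalue but already diagonal: diagonalizable.
assert seems_diagonalizable(np.eye(2))
```

The three cases above mirror the case analysis the problem is asking for.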

6. Show that similarity of matrices is an $$\textit{equivalence relation}$$.

7. $$\textit{Jordan form}$$
a) Can the matrix $$\begin{pmatrix} \lambda & 1 \\ 0 & \lambda \\ \end{pmatrix}$$ be diagonalized? Either diagonalize it or explain why this is impossible.

b) Can the matrix $$\begin{pmatrix} \lambda & 1 & 0 \\ 0 & \lambda & 1 \\ 0 & 0 & \lambda \\ \end{pmatrix}$$ be diagonalized? Either diagonalize it or explain why this is impossible.

c) Can the $$n \times n$$ matrix $$\begin{pmatrix} \lambda & 1 & 0 & \cdots & 0 & 0 \\ 0 & \lambda & 1 & \cdots & 0 & 0 \\ 0 & 0 & \lambda & \cdots & 0 & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda & 1 \\ 0 & 0 & 0 & \cdots & 0 & \lambda \\ \end{pmatrix}$$ be diagonalized? Either diagonalize it or explain why this is impossible.

$$\textit{Note:}$$ It turns out that every matrix is similar to a block matrix whose diagonal blocks look like diagonal matrices or the ones above and whose off-diagonal blocks are all zero. This is called the $$\textit{Jordan form}$$ of the matrix and a (maximal) block that looks like
$\left( \begin{array}{ccccc} \lambda & 1 & 0 & \cdots & 0 \\ 0 & \lambda & 1 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & 0 & & \lambda & 1 \\ 0 & 0 & \cdots & 0 & \lambda \end{array}\right)$
is called a $$\textit{Jordan }n\textit{-cell}$$ or a $$\textit{Jordan block}$$, where $$n$$ is the size of the block.
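A numerical illustration of why these blocks resist diagonalization (not a proof): for a Jordan block, the eigenspace of $$\lambda$$ is only one-dimensional, so there are never enough independent eigenvectors to form a basis. A sketch with numpy, where $$\lambda = 2$$ and $$n = 4$$ are arbitrary illustrative choices:

```python
import numpy as np

lam, n = 2.0, 4  # arbitrary illustrative choices
# Build the n x n Jordan block: lam on the diagonal, 1s just above it.
J = lam * np.eye(n) + np.diag(np.ones(n - 1), k=1)

# (J - lam I) has rank n-1, so its null space -- the eigenspace of lam --
# is only 1-dimensional: too few eigenvectors to diagonalize J.
rank = np.linalg.matrix_rank(J - lam * np.eye(n))
assert rank == n - 1
```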

8. Let $$A$$ and $$B$$ be commuting matrices ($$\textit{i.e.}$$, $$AB = BA$$) and suppose that $$A$$ has an eigenvector $$v$$ with eigenvalue $$\lambda$$. Show that $$Bv$$ is also an eigenvector of $$A$$ with eigenvalue $$\lambda$$. Additionally suppose that $$A$$ is diagonalizable with distinct eigenvalues. What is the dimension of each eigenspace of $$A$$? Show that $$v$$ is also an eigenvector of $$B$$. Explain why this shows that $$A$$ and $$B$$ can be $$\textit{simultaneously diagonalized}$$ ($$\textit{i.e.}$$, there is an ordered basis in which both their matrices are diagonal).
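The chain of claims can be checked on a concrete commuting pair. In the sketch below the matrices are arbitrary examples chosen so that they commute (any polynomial in $$A$$ commutes with $$A$$); the specific numbers are illustrative only:

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0])   # diagonalizable with distinct eigenvalues
B = A @ A + 5 * np.eye(3)      # a polynomial in A, so AB = BA
assert np.allclose(A @ B, B @ A)

# v is an eigenvector of A with eigenvalue lam = 1.
v = np.array([1.0, 0.0, 0.0])
lam = 1.0
assert np.allclose(A @ v, lam * v)

# A(Bv) = B(Av) = lam (Bv): Bv stays in the lam-eigenspace of A.
assert np.allclose(A @ (B @ v), lam * (B @ v))

# Since that eigenspace is 1-dimensional, Bv must be a multiple of v.
# Here B v = (1^2 + 5) v = 6 v, so v is an eigenvector of B as well.
assert np.allclose(B @ v, 6.0 * v)
```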