13.4: Review Problems

    1. Let \(P_{n}(t)\) be the vector space of polynomials of degree \(n\) or less, and \(\frac{d}{dt} \colon P_{n}(t) \to P_{n}(t)\) be the derivative operator. Find the matrix of \(\frac{d}{dt}\) in the ordered bases \(E=(1,t,\ldots,t^{n})\) for \(P_{n}(t)\) and \(F=(1,t,\ldots,t^{n})\) for \(P_{n}(t)\). Determine if this derivative operator is diagonalizable. \(\textit{Recall from chapter 6 that the derivative operator is linear.}\)
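
    If you want to experiment, the following sketch (Python with sympy; the choice \(n=3\) is only an assumption made for concreteness) builds the matrix of \(\frac{d}{dt}\) in the basis \(E\) column by column and lists its eigenvectors. It is an optional illustration, not part of the exercise.

    # Illustrative sketch: the matrix of d/dt on P_3(t) in the basis E = (1, t, t^2, t^3).
    import sympy as sp

    t = sp.symbols('t')
    n = 3                                   # small degree assumed for illustration
    basis = [t**k for k in range(n + 1)]    # E = (1, t, ..., t^n)

    # Column j holds the coordinates of d/dt applied to the j-th basis vector.
    M = sp.zeros(n + 1, n + 1)
    for j, p in enumerate(basis):
        dp = sp.diff(p, t)
        coeffs = sp.Poly(dp, t).all_coeffs()[::-1] if dp != 0 else []
        for i, c in enumerate(coeffs):
            M[i, j] = c

    print(M)                # strictly upper triangular, zeros on the diagonal
    print(M.eigenvects())   # (eigenvalue, algebraic multiplicity, eigenvectors)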

    2. When writing a matrix for a linear transformation, we have seen that the choice of basis matters. In fact, even the order of the basis matters!

    1. Write all possible reorderings of the standard basis \((e_{1},e_{2},e_{3})\) for \(\Re^{3}\).
    2. Write each change of basis matrix between the standard basis and each of its reorderings. Make as many observations as you can about these matrices: what are their entries? Do you notice anything about how many of each type of entry appears in each row and column? What are their determinants? (Note: These matrices are known as \(\textit{permutation matrices}\).)
    3. Given that \(L:\Re^{3}\to \Re^{3}\) is the linear transformation defined by \[L\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}2y-z\\3x\\2z+x+y\end{pmatrix},\]

    write the matrix \(M\) for \(L\) in the standard basis and in two reorderings of the standard basis. How are these matrices related?
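
    As an optional check on part 3 (a sketch, with the single reordering \((e_{2},e_{1},e_{3})\) chosen as an example), one can conjugate \(M\) by the corresponding permutation matrix and compare:

    # Illustrative sketch: the matrix of L in the standard basis and in the
    # reordering (e2, e1, e3) of that basis.
    import numpy as np

    # Columns of M are L(e1), L(e2), L(e3) for L(x, y, z) = (2y - z, 3x, 2z + x + y).
    M = np.array([[0, 2, -1],
                  [3, 0,  0],
                  [1, 1,  2]])

    # Change of basis (permutation) matrix whose columns are e2, e1, e3.
    P = np.array([[0, 1, 0],
                  [1, 0, 0],
                  [0, 0, 1]])

    # Matrix of L in the reordered basis: P^{-1} M P (here P^{-1} = P).
    M_reordered = np.linalg.inv(P) @ M @ P
    print(M_reordered)      # the rows and columns of M are permuted together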

    3. Let $$X=\{\heartsuit,\clubsuit,\spadesuit\}\, ,\quad Y=\{*,\star\}\, .$$ Write down two different ordered bases, \(S,S'\), for \(\mathbb{R}^{X}\) and two, \(T,T'\), for \(\mathbb{R}^{Y}\). Find the change of basis matrices \(P\) and \(Q\) that map these bases to one another. Now consider the map
    $$
    \ell:Y\to X\, ,
    $$
    where \(\ell(*)=\heartsuit\) and \(\ell(\star)=\spadesuit\). Show that \(\ell\) can be used to define a linear transformation \(L:\mathbb{R}^{X}\to\mathbb{R}^{Y}\). Compute the matrices \(M\) and \(M'\) of \(L\) in the bases \(S,T\) and then \(S',T'\). Use your change of basis matrices \(P\) and \(Q\) to check that \(M'=Q^{-1}MP\).
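
    The only step here that tends to cause trouble is seeing how \(\ell\) produces a map in the opposite direction. One natural construction (an assumption about the intended one) is the pullback \(L(f)=f\circ\ell\). The sketch below computes the resulting matrix in the bases of "delta functions" on \(X\) and \(Y\); any other choice of bases changes this matrix by exactly the change of basis formula being checked.

    # Illustrative sketch: the matrix of L(f) = f o ell in the delta-function bases.
    import numpy as np

    X = ['heart', 'club', 'spade']
    Y = ['ast', 'star']                    # stand-ins for the two elements of Y
    ell = {'ast': 'heart', 'star': 'spade'}

    # Entry (i, j) is L(delta_{X[j]}) evaluated at Y[i], i.e. 1 exactly when ell(Y[i]) = X[j].
    M = np.zeros((len(Y), len(X)))
    for j, x in enumerate(X):
        for i, y in enumerate(Y):
            M[i, j] = 1.0 if ell[y] == x else 0.0

    print(M)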

    4. Recall that \(\mathrm{tr}(MN) = \mathrm{tr}(NM)\). Use this fact to show that the trace of a square matrix \(M\) does not depend on the basis used to compute \(M\).
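
    A quick numerical sanity check (not a proof, and the random matrices below are arbitrary) is to conjugate a matrix by an invertible \(P\) and compare traces:

    # Sanity check: the trace is unchanged under a change of basis M -> P^{-1} M P.
    import numpy as np

    rng = np.random.default_rng(1)
    M = rng.standard_normal((4, 4))
    P = rng.standard_normal((4, 4))       # almost surely invertible

    M_new = np.linalg.inv(P) @ M @ P      # the same operator written in another basis
    print(np.trace(M), np.trace(M_new))   # agree up to floating point error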

    5. When is the \(2\times 2\) matrix \(\begin{pmatrix}a & b \\c & d\end{pmatrix}\) diagonalizable? Include examples in your answer.
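
    If you want to test your criterion against software, sympy can check a few \(2\times 2\) examples (the three matrices below are assumptions, chosen only to probe different cases):

    # Illustrative examples for problem 5: which 2x2 matrices are diagonalizable?
    import sympy as sp

    examples = [
        sp.Matrix([[1, 0], [0, 2]]),   # distinct real eigenvalues
        sp.Matrix([[1, 1], [0, 1]]),   # repeated eigenvalue, only one eigenvector
        sp.Matrix([[0, -1], [1, 0]]),  # complex eigenvalues (diagonalizable over C)
    ]
    for A in examples:
        print(A.tolist(), A.is_diagonalizable())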

    6. Show that similarity of matrices is an \(\textit{equivalence relation}\).

    7. \(\textit{Jordan form}\)
    a) Can the matrix \(\begin{pmatrix}
    \lambda & 1 \\
    0 & \lambda \\
    \end{pmatrix}\) be diagonalized? Either diagonalize it or explain why this is impossible.

    b) Can the matrix \(\begin{pmatrix}
    \lambda & 1 & 0 \\
    0 & \lambda & 1 \\
    0 & 0 & \lambda \\
    \end{pmatrix}\) be diagonalized? Either diagonalize it or explain why this is impossible.

    c) Can the \(n \times n\) matrix \(\begin{pmatrix}
    \lambda & 1 & 0 & \cdots & 0 & 0 \\
    0 & \lambda & 1 & \cdots & 0 & 0 \\
    0 & 0 & \lambda & \cdots & 0 & 0 \\
    \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\
    0 & 0 & 0 & \cdots & \lambda & 1 \\
    0 & 0 & 0 & \cdots & 0 & \lambda \\
    \end{pmatrix}\) be diagonalized? Either diagonalize it or explain why this is impossible.

    \(\textit{Note:}\) It turns out that every matrix is similar to a block matrix whose diagonal blocks look like diagonal matrices or the ones above and whose off-diagonal blocks are all zero. This is called the \(\textit{Jordan form}\) of the matrix, and a (maximal) block that looks like
    \[
    \left(
    \begin{array}{ccccc}
    \lambda & 1 & 0&\cdots & 0 \\
    0 & \lambda & 1 & & 0 \\
    \vdots & &\ddots &\ddots & \\
    &&&\lambda&1\\
    0 &0 && 0 & \lambda
    \end{array}\right)
    \]
    is called a \(\textit{Jordan } n\textit{-cell}\) or a \(\textit{Jordan block}\), where \(n\) is the size of the block.
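
    To see the phenomenon in parts a)-c) concretely, the following sketch (with the sample value \(\lambda = 2\) assumed) asks sympy for the eigenvectors of a \(3\times 3\) Jordan block:

    # Illustrative check: eigenvectors of a 3x3 Jordan block with lambda = 2.
    import sympy as sp

    lam = sp.Integer(2)
    J = sp.Matrix([[lam, 1,   0],
                   [0,   lam, 1],
                   [0,   0,   lam]])

    # eigenvects() returns (eigenvalue, algebraic multiplicity, basis of the eigenspace).
    print(J.eigenvects())
    print(J.is_diagonalizable())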

    8. Let \(A\) and \(B\) be commuting matrices (\(\textit{i.e.}\), \(AB = BA\)) and suppose that \(A\) has an eigenvector \(v\) with eigenvalue \(\lambda\). Show that \(Bv\) is also an eigenvector of \(A\) with eigenvalue \(\lambda\). Additionally suppose that \(A\) is diagonalizable with distinct eigenvalues. What is the dimension of each eigenspace of \(A\)? Show that \(v\) is also an eigenvector of \(B\). Explain why this shows that \(A\) and \(B\) can be \(\textit{simultaneously diagonalized}\) (\(\textit{i.e.}\), there is an ordered basis in which both their matrices are diagonal).
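
    A small numerical illustration of the last claim (the particular commuting pair below is an assumption chosen for demonstration) is to diagonalize \(A\) and watch what the same change of basis does to \(B\):

    # Illustrative sketch: a commuting pair where A has distinct eigenvalues.
    import sympy as sp

    A = sp.Matrix([[1, 2], [2, 1]])
    B = sp.Matrix([[0, 1], [1, 0]])
    assert A * B == B * A                  # the pair really does commute

    P, D = A.diagonalize()                 # A = P D P^{-1}, D diagonal
    print(D)                               # eigenvalues of A
    print(sp.simplify(P.inv() * B * P))    # the same P also diagonalizes B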

    This page titled 13.4: Review Problems is shared under a not declared license and was authored, remixed, and/or curated by David Cherney, Tom Denton, & Andrew Waldron.