
5.5E: Similarity and Diagonalization Exercises


    By computing the trace, determinant, and rank, show that \(A\) and \(B\) are not similar in each case.

    1. \(A = \left[ \begin{array}{rr} 1 & 2 \\ 2 & 1 \end{array} \right]\), \(B = \left[ \begin{array}{rr} 1 & 1\\ -1 & 1 \end{array} \right]\)
    2. \(A = \left[ \begin{array}{rr} 3 & 1 \\ 2 & -1 \end{array} \right]\), \(B = \left[ \begin{array}{rr} 1 & 1 \\ 2 & 1 \end{array} \right]\)
    3. \(A = \left[ \begin{array}{rr} 2 & 1 \\ 1 & -1 \end{array} \right]\), \(B = \left[ \begin{array}{rr} 3 & 0 \\ 1 & -1 \end{array} \right]\)
    4. \(A = \left[ \begin{array}{rr} 3 & 1 \\ -1 & 2 \end{array} \right]\), \(B = \left[ \begin{array}{rr} 2 & -1 \\ 3 & 2 \end{array} \right]\)
    5. \(A = \left[ \begin{array}{rrr} 2 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{array} \right]\), \(B = \left[ \begin{array}{rrr} 1 & -2 & 1 \\ -2 & 4 & -2 \\ -3 & 6 & -3 \end{array} \right]\)
    6. \(A = \left[ \begin{array}{rrr} 1 & 2 & -3 \\ 1 & -1 & 2 \\ 0 & 3 & -5 \end{array} \right]\), \(B = \left[ \begin{array}{rrr} -2 & 1 & 3 \\ 6 & -3 & -9 \\ 0 & 0 & 0 \end{array} \right]\)
    Answers to parts 2, 4, and 6 (a computational check of these invariants follows):

    2. traces \(= 2\), ranks \(= 2\), but \(\det A = -5\), \(\det B = -1\)
    4. ranks \(= 2\), determinants \(= 7\), but \(\operatorname{tr} A = 5\), \(\operatorname{tr} B = 4\)
    6. traces \(= -5\), determinants \(= 0\), but \(\operatorname{rank} A = 2\), \(\operatorname{rank} B = 1\)
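
    The invariant comparison in part 2 can be reproduced with a short SymPy script. This is only a supplementary sketch (SymPy and the printed checks are an assumption of this note, not part of the original exercise).

```python
# Supplementary check of the similarity invariants for part 2:
# similar matrices must share trace, rank, and determinant.
from sympy import Matrix

A = Matrix([[3, 1], [2, -1]])
B = Matrix([[1, 1], [2, 1]])

print(A.trace(), B.trace())   # 2 2    -> traces agree
print(A.rank(), B.rank())     # 2 2    -> ranks agree
print(A.det(), B.det())       # -5 -1  -> determinants differ, so A and B are not similar
```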

    Show that \(\left[ \begin{array}{rrrr} 1 & 2 & -1 & 0 \\ 2 & 0 & 1 & 1 \\ 1 & 1 & 0 & -1 \\ 4 & 3 & 0 & 0 \end{array} \right]\) and \(\left[ \begin{array}{rrrr} 1 & -1 & 3 & 0 \\ -1 & 0 & 1 & 1 \\ 0 & -1 & 4 & 1 \\ 5 & -1 & -1 & -4 \end{array} \right]\) are not similar.

    If \(A \sim B\), show that:

    1. \(A^{T} \sim B^{T}\)
    2. \(A^{-1} \sim B^{-1}\)
    3. \(rA \sim rB\) for \(r\) in \(\mathbb{R}\)
    4. \(A^{n} \sim B^{n}\) for \(n \geq 1\)

    Answer to part 2: If \(B = P^{-1}AP\), then \(B^{-1} = (P^{-1}AP)^{-1} = P^{-1}A^{-1}(P^{-1})^{-1} = P^{-1}A^{-1}P\), so \(A^{-1} \sim B^{-1}\).

    In each case, decide whether the matrix \(A\) is diagonalizable. If so, find \(P\) such that \(P^{-1}AP\) is diagonal.

    1. \(\left[ \begin{array}{rrr} 1 & 0 & 0 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{array} \right]\)
    2. \(\left[ \begin{array}{rrr} 3 & 0 & 6 \\ 0 & -3 & 0 \\ 5 & 0 & 2 \end{array} \right]\)
    3. \(\left[ \begin{array}{rrr} 3 & 1 & 6 \\ 2 & 1 & 0 \\ -1 & 0 & -3 \end{array} \right]\)
    4. \(\left[ \begin{array}{rrr} 4 & 0 & 0 \\ 0 & 2 & 2 \\ 2 & 3 & 1 \end{array} \right]\)

    Answers to parts 2 and 4 (part 2 is verified computationally below):

    2. Yes, \(P = \left[ \begin{array}{rrr} -1 & 0 & 6 \\ 0 & 1 & 0 \\ 1 & 0 & 5 \end{array} \right]\), \(P^{-1}AP = \left[ \begin{array}{rrr} -3 & 0 & 0 \\ 0 & -3 & 0 \\ 0 & 0 & 8 \end{array} \right]\)
    4. No. Here \(c_{A}(x) = (x + 1)(x - 4)^{2}\), so \(\lambda = 4\) has multiplicity \(2\), but \(\dim (E_{4}) = 1\), so \(A\) is not diagonalizable by Theorem [thm:016250].
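
    The answer to part 2 can be verified with SymPy; the matrix \(A\) and the matrix \(P\) below are taken from the exercise and its answer, and the script itself is a supplementary sketch, not part of the text.

```python
# Verify that the P given in the answer to part 2 diagonalizes A.
from sympy import Matrix, diag

A = Matrix([[3, 0, 6], [0, -3, 0], [5, 0, 2]])
P = Matrix([[-1, 0, 6], [0, 1, 0], [1, 0, 5]])

D = P.inv() * A * P
print(D)                              # Matrix([[-3, 0, 0], [0, -3, 0], [0, 0, 8]])
print(D == diag(-3, -3, 8))           # True
```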

    If \(A\) is invertible, show that \(AB\) is similar to \(BA\) for all \(B\).

    Show that the only matrix similar to a scalar matrix \(A = rI\), \(r\) in \(\mathbb{R}\), is \(A\) itself.

    Let \(\lambda\) be an eigenvalue of \(A\) with corresponding eigenvector \(\mathbf{x}\). If \(B = P^{-1}AP\) is similar to \(A\), show that \(P^{-1}\mathbf{x}\) is an eigenvector of \(B\) corresponding to \(\lambda\).

    If \(A \sim B\) and \(A\) has any of the following properties, show that \(B\) has the same property.

    1. Idempotent, that is \(A^{2} = A\).
    2. Nilpotent, that is \(A^{k} = 0\) for some \(k \geq 1\).
    3. Invertible.
    Answer to part 2: If \(B = P^{-1}AP\) and \(A^{k} = 0\), then \(B^{k} = (P^{-1}AP)^{k} = P^{-1}A^{k}P = P^{-1}0P = 0\).

    Let \(A\) denote an \(n \times n\) upper triangular matrix.

    1. If all the main diagonal entries of \(A\) are distinct, show that \(A\) is diagonalizable.
    2. If all the main diagonal entries of \(A\) are equal, show that \(A\) is diagonalizable only if it is already diagonal.
    3. Show that \(\left[ \begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{array} \right]\) is diagonalizable but that \(\left[ \begin{array}{rrr} 1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{array} \right]\) is not diagonalizable.
    Answer to part 2: The eigenvalues of \(A\) are all equal (they are the diagonal entries), so if \(P^{-1}AP = D\) is diagonal, then \(D = \lambda I\). Hence \(A = PDP^{-1} = P(\lambda I)P^{-1} = \lambda I\), which is already diagonal.

    Let \(A\) be a diagonalizable \(n \times n\) matrix with eigenvalues \(\lambda_{1}, \lambda_{2}, \dots, \lambda_{n}\) (including multiplicities). Show that:

    1. \(\det A = \lambda_{1}\lambda_{2}\cdots \lambda_{n}\)
    2. \(\operatorname{tr} A = \lambda_{1} + \lambda_{2} + \cdots + \lambda_{n}\)
    Answer to part 2: \(A\) is similar to \(D = \operatorname{diag}(\lambda_{1}, \lambda_{2}, \dots, \lambda_{n})\), so (Theorem [thm:016008]) \(\operatorname{tr} A = \operatorname{tr} D = \lambda_{1} + \lambda_{2} + \cdots + \lambda_{n}\).
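
    Both facts are easy to check numerically; the SymPy sketch below uses an arbitrary diagonalizable \(2 \times 2\) example chosen for this note, not one from the text.

```python
# det A should equal the product of the eigenvalues, tr A their sum.
from sympy import Matrix

A = Matrix([[4, 1], [2, 3]])                 # eigenvalues 2 and 5
prod_ev, sum_ev = 1, 0
for lam, mult in A.eigenvals().items():      # {eigenvalue: multiplicity}
    prod_ev *= lam**mult
    sum_ev += lam*mult

print(A.det() == prod_ev)                    # True  (10 = 2*5)
print(A.trace() == sum_ev)                   # True  (7 = 2+5)
```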

    Given a polynomial \(p(x) = r_{0} + r_{1}x + \dots + r_{n}x^{n}\) and a square matrix \(A\), the matrix \(p(A) = r_{0}I + r_{1}A + \dots + r_{n}A^{n}\) is called the evaluation of \(p(x)\) at \(A\). Let \(B = P^{-1}AP\). Show that \(p(B) = P^{-1}p(A)P\) for all polynomials \(p(x)\).
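
    A small SymPy illustration of this identity follows; the matrices \(A\), \(P\), and the polynomial \(p\) are arbitrary choices made for the sketch, not data from the exercise.

```python
# Check p(P^{-1} A P) = P^{-1} p(A) P for one polynomial and one invertible P.
from sympy import Matrix, eye

A = Matrix([[1, 2], [3, 4]])
P = Matrix([[2, 1], [1, 1]])          # det P = 1, so P is invertible

def p(M):
    # p(x) = 3 + 2x + x^3, evaluated at the square matrix M
    return 3 * eye(M.rows) + 2 * M + M**3

B = P.inv() * A * P
print(p(B) == P.inv() * p(A) * P)     # True
```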

    [ex:5_5_12] Let \(P\) be an invertible \(n \times n\) matrix. If \(A\) is any \(n \times n\) matrix, write \(T_{P}(A) = P^{-1}AP\). Verify that:

    1. \(T_{P}(I) = I\)
    2. \(T_{P}(AB) = T_{P}(A)T_{P}(B)\)
    3. \(T_{P}(A + B) = T_{P}(A) + T_{P}(B)\)
    4. \(T_{P}(rA) = rT_{P}(A)\)
    5. \(T_{P}(A^{k}) = [T_{P}(A)]^{k}\) for \(k \geq 1\)
    6. If \(A\) is invertible, \(T_{P}(A^{-1}) = [T_{P}(A)]^{-1}\).
    7. If \(Q\) is invertible, \(T_{Q}[T_{P}(A)] = T_{PQ}(A)\).

    Answer to part 2: \(T_{P}(A)T_{P}(B) = (P^{-1}AP)(P^{-1}BP) = P^{-1}(AB)P = T_{P}(AB)\).

    1. Show that two diagonalizable matrices are similar if and only if they have the same eigenvalues with the same multiplicities.
    2. If \(A\) is diagonalizable, show that \(A \sim A^{T}\).
    3. Show that \(A \sim A^{T}\) if \(A = \left[ \begin{array}{rr} 1 & 1 \\ 0 & 1 \end{array} \right]\).

    Answer to part 2: If \(A\) is diagonalizable, so is \(A^{T}\), and they have the same eigenvalues. Use part 1.

    If \(A\) is \(2 \times 2\) and diagonalizable, show that \(C(A) = \{X \mid XA = AX\}\) has dimension \(2\) or \(4\). [Hint: If \(P^{-1}AP = D\), show that \(X\) is in \(C(A)\) if and only if \(P^{-1}XP\) is in \(C(D)\).]

    If \(A\) is diagonalizable and \(p(x)\) is a polynomial such that \(p(\lambda) = 0\) for all eigenvalues \(\lambda\) of \(A\), show that \(p(A) = 0\) (see Example [exa:009262]). In particular, show \(c_{A}(A) = 0\). [Remark: \(c_{A}(A) = 0\) for all square matrices \(A\)—this is the Cayley-Hamilton theorem, see Theorem [thm:033262].]
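
    For a concrete instance, the conclusion \(c_{A}(A) = 0\) can be checked with SymPy; the \(2 \times 2\) matrix below is an arbitrary diagonalizable example chosen for this note, not one from the text.

```python
# Check c_A(A) = 0 for one diagonalizable matrix.
from sympy import Matrix, symbols, eye, zeros

x = symbols('x')
A = Matrix([[2, 1], [1, 2]])           # eigenvalues 1 and 3, so A is diagonalizable

print(A.charpoly(x).as_expr())         # x**2 - 4*x + 3

# Evaluate c_A at A, replacing x^k by A^k and the constant term c by c*I.
cA = A**2 - 4*A + 3*eye(2)
print(cA == zeros(2, 2))               # True
```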

    Let \(A\) be \(n \times n\) with \(n\) distinct real eigenvalues. If \(AC = CA\), show that \(C\) is diagonalizable.

    Let \(A = \left[ \begin{array}{rrr} 0 & a & b \\ a & 0 & c \\ b & c & 0 \end{array} \right]\) and \(B = \left[ \begin{array}{rrr} c & a & b \\ a & b & c \\ b & c & a \end{array} \right]\).

    1. Show that \(x^{3} - (a^{2} + b^{2} + c^{2})x - 2abc\) has real roots by considering \(A\).
    2. Show that \(a^{2} + b^{2} + c^{2} \geq ab + ac + bc\) by considering \(B\).
    Answer to part 2: \(c_{B}(x) = [x - (a + b + c)][x^{2} - k]\) where \(k = a^{2} + b^{2} + c^{2} - (ab + ac + bc)\). Use Theorem [thm:016397].

    Assume the \(2 \times 2\) matrix \(A\) is similar to an upper triangular matrix. If \(\operatorname{tr} A = 0 = \operatorname{tr} A^{2}\), show that \(A^{2} = 0\).

    Show that \(A\) is similar to \(A^{T}\) for all \(2 \times 2\) matrices \(A\). [Hint: Let \(A = \left[ \begin{array}{rr} a & b \\ c & d \end{array} \right]\). If \(c = 0\), treat the cases \(b = 0\) and \(b \neq 0\) separately. If \(c \neq 0\), reduce to the case \(c = 1\) using Exercise [ex:5_5_12], part 4.]

    Refer to Section [sec:3_4] on linear recurrences. Assume that the sequence \(x_{0}, x_{1}, x_{2}, \dots\) satisfies

    \[x_{n+k} = r_0x_n + r_1x_{n+1} + \dotsb + r_{k-1}x_{n+k-1} \nonumber \]

    for all \(n \geq 0\). Define

    \[A = \scriptsize \left[ \begin{array}{ccccc} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ r_0 & r_1 & r_2 & \cdots & r_{k-1} \end{array} \right], \quad V_n = \left[ \begin{array}{c} x_n \\ x_{n+1} \\ \vdots \\ x_{n+k-1} \end{array} \right]. \nonumber \]

    Then show that:

    1. \(V_{n} = A^{n}V_{0}\) for all \(n\).
    2. \(c_{A}(x) = x^{k} - r_{k-1}x^{k-1} - \dots - r_{1}x - r_{0}\)
    3. If \(\lambda\) is an eigenvalue of \(A\), the eigenspace \(E_{\lambda}\) has dimension 1, and \(\mathbf{x} = (1, \lambda, \lambda^{2}, \dots, \lambda^{k-1})^{T}\) is an eigenvector. [Hint: Use \(c_{A}(\lambda) = 0\) to show that \(E_{\lambda} = \mathbb{R}\mathbf{x}\).]
    4. \(A\) is diagonalizable if and only if the eigenvalues of \(A\) are distinct. [Hint: See part 3 and Theorem [thm:016090].]
    5. If \(\lambda_{1}, \lambda_{2}, \dots, \lambda_{k}\) are distinct real eigenvalues, there exist constants \(t_{1}, t_{2}, \dots, t_{k}\) such that \(x_n = t_1\lambda_1^n + \dots + t_k\lambda_k^n\) holds for all \(n\). [Hint: If \(D\) is diagonal with \(\lambda_{1}, \lambda_{2}, \dots, \lambda_{k}\) as the main diagonal entries, show that \(A^{n} = PD^{n}P^{-1}\) has entries that are linear combinations of \(\lambda_1^n, \lambda_2^n, \dots, \lambda_k^n\).]
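
    As a concrete instance of parts 1 and 2, take the Fibonacci recurrence \(x_{n+2} = x_{n} + x_{n+1}\) (so \(k = 2\), \(r_{0} = r_{1} = 1\)). The SymPy sketch below is a supplementary check under that assumption, not part of the exercise.

```python
# Companion-matrix form of the Fibonacci recurrence x_{n+2} = x_n + x_{n+1}.
from sympy import Matrix, symbols

A = Matrix([[0, 1], [1, 1]])           # r_0 = r_1 = 1, k = 2
V0 = Matrix([0, 1])                    # x_0 = 0, x_1 = 1

for n in range(8):
    # the first entry of V_n = A^n V_0 is x_n
    print((A**n * V0)[0], end=' ')     # 0 1 1 2 3 5 8 13

x = symbols('x')
print()
print(A.charpoly(x).as_expr())         # x**2 - x - 1, matching part 2
```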

    Suppose \(A\) is \(2 \times 2\) and \(A^{2} = 0\). If \(\operatorname{tr} A \neq 0\), show that \(A = 0\).


    5.5E: Similarity and Diagonalization Exercises is shared under a not declared license and was authored, remixed, and/or curated by LibreTexts.
