
7.6: Diagonalization of \(2\times 2\) matrices and Applications


    Let \(A = \begin{bmatrix} a&b\\ c&d \end{bmatrix} \in \mathbb{F}^{2\times 2}\), and recall that we can define a linear operator \(T \in \mathcal{L}(\mathbb{F}^{2})\) on \(\mathbb{F}^{2}\) by setting \(T(v) = A v\) for each \(v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \in \mathbb{F}^2\).

    One method for finding the eigen-information of \(T\) is to analyze the solutions of the matrix equation \(A v = \lambda v\) for \(\lambda \in \mathbb{F}\) and \(v \in \mathbb{F}^{2}\). In particular, using the definition of eigenvector and eigenvalue, \(v\) is an eigenvector associated to the eigenvalue \(\lambda\) if and only if \(A v = T(v) = \lambda v\).

    A simpler method involves the equivalent matrix equation \((A - \lambda I)v = 0\), where \(I\) denotes the \(2\times 2\) identity matrix. In particular, \(0 \neq v \in \mathbb{F}^{2}\) is an eigenvector for \(T\) associated to the eigenvalue \(\lambda \in \mathbb{F}\) if and only if the system of linear equations

    \begin{equation}
    \left.
    \begin{array}{rrrrr}
    (a - \lambda) v_{1} & + & b v_{2} & = & 0 \\
    c v_{1} & + & (d - \lambda) v_{2} & = & 0
    \end{array}
    \right\} \label{7.6.1}
    \end{equation}

    has a non-trivial solution. Moreover, System \ref{7.6.1} has a non-trivial solution if and only if the polynomial \(p(\lambda) = (a - \lambda)(d - \lambda) - bc\) evaluates to zero. (See Proof-writing Exercise 12 in Exercises for Chapter 7.)

    In other words, the eigenvalues for \(T\) are exactly the \(\lambda \in \mathbb{F}\) for which \(p(\lambda) = 0\), and the eigenvectors for \(T\) associated to an eigenvalue \(\lambda\) are exactly the non-zero vectors \(v = \begin{bmatrix} v_{1} \\ v_{2} \end{bmatrix} \in \mathbb{F}^2\) that satisfy System \ref{7.6.1}.
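    The two-step recipe just described (first find the roots of \(p(\lambda)\), then solve System \ref{7.6.1} for each root) can also be carried out symbolically. The following sketch is illustrative and not part of the text; it uses the sympy library, and the sample matrix entries are placeholders.

    ```python
    # A minimal sketch (not from the text) of the procedure described above,
    # using sympy.  The entries a, b, c, d are illustrative placeholders.
    import sympy as sp

    lam = sp.symbols('lambda')
    a, b, c, d = 2, 1, 1, 2            # placeholder entries of A

    # p(lambda) = (a - lambda)(d - lambda) - b*c
    p = sp.expand((a - lam) * (d - lam) - b * c)

    # The eigenvalues of T are exactly the roots of p.
    eigenvalues = sp.solve(sp.Eq(p, 0), lam)

    for ev in eigenvalues:
        # Non-trivial solutions of System (7.6.1) form the null space of A - ev*I.
        A_shift = sp.Matrix([[a - ev, b], [c, d - ev]])
        print(ev, A_shift.nullspace())  # eigenvalue and a basis eigenvector
    ```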

    Example \(\PageIndex{1}\)

    Let \(A = \begin{bmatrix} -2 & -1 \\ 5 & 2 \end{bmatrix}\). Then \(p(\lambda) = (-2 -\lambda)(2 - \lambda) - (-1)(5) = \lambda^{2} + 1\), which is equal to zero exactly when \(\lambda = \pm i\). Moreover, if \(\lambda = i\), then System \ref{7.6.1} becomes

    \[
    \left.
    \begin{array}{rrrrr}
    (-2 - i) v_{1} & - & v_{2} & = & 0 \\
    5 v_{1} & + & (2 - i) v_{2} & = & 0
    \end{array}
    \right\},
    \]

    which is satisfied by any vector \(v = \begin{bmatrix} v_1\\ v_2 \end{bmatrix}\in \mathbb{C}^2\) such that \(v_{2} = (-2 - i) v_{1}\). Similarly, if \(\lambda = -i\), then System \ref{7.6.1} becomes

    \[
    \left.
    \begin{array}{rrrrr}
    (-2 + i) v_{1} & - & v_{2} & = & 0 \\
    5 v_{1} & + & (2 + i) v_{2} & = & 0
    \end{array}
    \right\},
    \]

    which is satisfied by any vector \(v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \in \mathbb{C}^2\) such that \(v_{2} = (-2 + i) v_{1}\).

    It follows that, given \(A = \begin{bmatrix} -2 & -1 \\ 5 & 2 \end{bmatrix}\), the linear operator on \(\mathbb{C}^{2}\) defined by \(T(v) = A v\) has eigenvalues \(\lambda = \pm i\), with associated eigenvectors as described above.
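    As a numerical sanity check (not part of the original text), the same eigenpairs can be recovered with numpy. Note that numpy.linalg.eig normalizes its eigenvectors, so they agree with the vectors above only up to a non-zero complex scalar.

    ```python
    # Illustrative numerical check of Example 1 (not from the text).
    import numpy as np

    A = np.array([[-2, -1],
                  [ 5,  2]], dtype=complex)

    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)                      # approximately [1j, -1j], in some order

    for lam, v in zip(eigenvalues, eigenvectors.T):
        assert np.allclose(A @ v, lam * v)  # A v = lambda v
        # The components satisfy v2 = (-2 - lambda) v1, as derived above.
        print(lam, v[1] / v[0], -2 - lam)
    ```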

    Example \(\PageIndex{2}\)

    Take the rotation \(R_\theta:\mathbb{R}^2 \to \mathbb{R}^2\) by an angle \(\theta \in [0,2\pi)\) given by the matrix

    \begin{equation*}
    R_\theta = \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix}
    \end{equation*}

    Then we obtain the eigenvalues by solving the polynomial equation

    \begin{equation*}
    \begin{split}
    p(\lambda) &= (\cos \theta -\lambda)^2 + \sin^2 \theta\\
    &= \lambda^2-2\lambda \cos \theta + 1 =0,
    \end{split}
    \end{equation*}

    where we have used the fact that \(\sin^2 \theta + \cos^2 \theta =1\). Solving for \(\lambda\) in \(\mathbb{C}\), we obtain

    \begin{equation*}
    \lambda = \cos \theta \pm \sqrt{\cos^2 \theta -1} = \cos\theta \pm \sqrt{-\sin^2 \theta}
    = \cos \theta \pm i \sin \theta = e^{\pm i \theta}.
    \end{equation*}

    We see that, as an operator over the real vector space \(\mathbb{R}^2\), the operator \(R_\theta\) only has eigenvalues when \(\theta=0\) or \(\theta=\pi\) (namely \(\lambda = 1\) and \(\lambda = -1\), respectively). However, if we interpret the vector \(\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \in \mathbb{R}^2\) as a complex number \(z=x_1+ix_2\), then \(z\) is an eigenvector if \(R_\theta:\mathbb{C}\to\mathbb{C}\) maps \(z\mapsto \lambda z=e^{\pm i \theta}z\). Moreover, from Section 2.3, we know that multiplication by \(e^{\pm i \theta}\) corresponds to rotation by the angle \(\pm\theta\).
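    For a concrete angle, this can be checked numerically as well (again an illustrative sketch, not part of the text); the chosen angle below is arbitrary.

    ```python
    # Illustrative check (not from the text): the rotation matrix R_theta has
    # complex eigenvalues e^{+i theta} and e^{-i theta}.
    import numpy as np

    theta = 0.7                       # a sample angle in radians
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    eigenvalues = np.linalg.eigvals(R)
    expected = np.array([np.exp(1j * theta), np.exp(-1j * theta)])

    # Compare as unordered pairs, since the ordering of eigvals is not guaranteed.
    assert np.allclose(sorted(eigenvalues, key=np.angle),
                       sorted(expected, key=np.angle))
    print(eigenvalues)
    ```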

