Mathematics LibreTexts

Eigenvalues and Eigenvectors


    Definitions

    If L is a linear transformation from a vector space to itself, then it is of interest whether there are any vectors v with the property that L(v) is a multiple of v.  If this is the case, then repeatedly applying L produces vectors that are always parallel to v.  This idea is of fundamental importance for applications in physics, mechanics, economics, biology, and just about every other scientific field.  We can state this in terms of matrices as follows.

     

    Definition

    Let A be an n x n matrix.  Then a scalar \( \lambda \) is called an eigenvalue of A with associated nonzero eigenvector v if

            \( Av = \lambda v \)

     

    Example

    For the matrix 

           \( A = \begin{pmatrix} 1 & 3 \\ 2 & 2 \end{pmatrix} \)

    we can check that -1 and 4 are eigenvalues with associated eigenvectors

           \( v_{-1} = \begin{pmatrix} 3 \\ -2 \end{pmatrix}  \) and \( v_{4} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}  \)
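    Both claims can be verified by multiplying out \( Av \) and comparing with \( \lambda v \).  A quick numerical check of the same computation (a sketch using NumPy, which is not part of the original text):

    ```python
    import numpy as np

    A = np.array([[1, 3], [2, 2]])
    v_m1 = np.array([3, -2])  # claimed eigenvector for eigenvalue -1
    v_4 = np.array([1, 1])    # claimed eigenvector for eigenvalue 4

    # Check A v = lambda v for each pair.
    assert np.array_equal(A @ v_m1, -1 * v_m1)
    assert np.array_equal(A @ v_4, 4 * v_4)
    ```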

    Notice that if v is an eigenvector with eigenvalue \( \lambda \), then so is any nonzero multiple of v, since

            \( A(cv) = cAv = c(\lambda v) = \lambda(cv) \)

            

    Example

    The identity matrix \( I_n \) has 1 as its only eigenvalue, and every nonzero vector is an associated eigenvector, since

            \( I_n v = v = (1)v \)

    for all v.


    Finding Eigenvalues and Eigenvectors

    We will now determine how to find the eigenvalues and eigenvectors for a matrix.  Let A be a matrix with eigenvalue \( \lambda \) and eigenvector v.  Then

            \( Av = \lambda v \)

    implies that

            \( Av - \lambda v = 0 \)

    We would like to factor v out of the equation above.  However, we must be careful: the expression A - \( \lambda \) (a matrix minus a scalar) does not make sense.  Instead, we insert the identity matrix and write

            \( (A - \lambda I)v = 0 \)

    Since v must be nonzero, this system has a nontrivial solution, which forces \( A - \lambda I \) to be singular.  Hence

            \( \det(A - \lambda I) = 0 \)

    and the eigenvectors are the nonzero vectors in the null space of \( A - \lambda I \).
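    For a 2 x 2 matrix, \( \det(A - \lambda I) \) is a quadratic in \( \lambda \), so the eigenvalues come straight from the quadratic formula.  A minimal sketch in Python (the function name `eig2x2` is our own, and it assumes the eigenvalues are real):

    ```python
    import math

    def eig2x2(a, b, c, d):
        """Eigenvalues of [[a, b], [c, d]]: roots of lam^2 - (a + d) lam + (ad - bc)."""
        tr = a + d             # trace of A
        det = a * d - b * c    # determinant of A
        disc = tr * tr - 4 * det
        root = math.sqrt(disc)  # assumes disc >= 0, i.e. real eigenvalues
        return (tr - root) / 2, (tr + root) / 2
    ```

    For instance, `eig2x2(1, 3, 2, 2)` returns `(-1.0, 4.0)`, matching the first example above.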

     

    Example

    Find the eigenvalues and eigenvectors of

           \( A = \begin{pmatrix} 3 & 2 \\ 3 & 4 \end{pmatrix} \)

    We have

           \( A - \lambda I = \begin{pmatrix} 3 - \lambda & 2 \\ 3 & 4 - \lambda \end{pmatrix} \)

    which has determinant

            \( (3 - \lambda)(4 - \lambda) - 6 = \lambda^2 - 7\lambda + 12 - 6 \)

            \( = \lambda^2 - 7\lambda + 6 = (\lambda - 1)(\lambda - 6) \)

    So the roots are 

            \( \lambda = 1 \)         and        \( \lambda = 6 \)

    Now let's find the eigenvectors.  For \( \lambda = 1 \), we have

           \( A - I = \begin{pmatrix} 2 & 2 \\ 3 & 3 \end{pmatrix} \)

    which has reduced row echelon form

           \( \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} \)

    A nonzero vector in the null space is 

           \( v_1 = \begin{pmatrix} 1  \\ -1 \end{pmatrix} \)

    Now we find an eigenvector corresponding to \( \lambda = 6 \).  We have

           \( A - 6I = \begin{pmatrix} -3 & 2 \\ 3 & -2 \end{pmatrix} \)

    Notice that the second row is just -1 times the first, so it is redundant.  An eigenvector \( (x, y) \) must satisfy \( -3x + 2y = 0 \), and one convenient choice is

           \( v_6 = \begin{pmatrix} 2  \\ 3 \end{pmatrix} \)

    Notice that there are many choices for these eigenvectors (any nonzero multiple of the vectors above will also work).  We made our choices in order to avoid fractions.
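    The whole computation can be checked numerically.  A sketch with NumPy's `linalg.eig`, which returns unit-length eigenvectors, so we compare with our hand-computed vectors only up to a scalar multiple:

    ```python
    import numpy as np

    A = np.array([[3.0, 2.0], [3.0, 4.0]])
    vals, vecs = np.linalg.eig(A)   # columns of vecs are unit eigenvectors
    order = np.argsort(vals)        # sort so eigenvalue 1 comes first
    vals, vecs = vals[order], vecs[:, order]

    assert np.allclose(vals, [1.0, 6.0])
    # Each computed eigenvector is a scalar multiple of the hand-computed one.
    assert np.allclose(vecs[:, 0] / vecs[0, 0], [1.0, -1.0])     # multiple of (1, -1)
    assert np.allclose(vecs[:, 1] / vecs[0, 1] * 2, [2.0, 3.0])  # multiple of (2, 3)
    ```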


    The Characteristic Polynomial

    We have seen that in order to find the eigenvalues, we just find the roots of the polynomial defined by

            \( p_A(\lambda) = \det(\lambda I - A) \)

    We call this polynomial the characteristic polynomial of A.  If A is an n x n matrix, then its characteristic polynomial has degree n, and the roots of the characteristic polynomial are the eigenvalues of A.  Notice that if 0 is a root, then

            \( \det(0 \cdot I - A) = \det(-A) = (-1)^n \det(A) = 0 \)

    which holds exactly when det(A)  =  0.  This tells us that 0 is an eigenvalue of A if and only if det(A)  =  0.  We can now add one more statement to the list of nonsingular equivalents.
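    NumPy can recover the characteristic polynomial directly: `np.poly` applied to a square matrix returns the coefficients of \( \det(\lambda I - A) \), highest degree first, and `np.roots` then gives the eigenvalues.  A sketch using the matrix from the earlier worked example:

    ```python
    import numpy as np

    A = np.array([[3.0, 2.0], [3.0, 4.0]])
    coeffs = np.poly(A)   # coefficients of det(lam*I - A): approximately [1, -7, 6]
    vals = np.roots(coeffs)  # eigenvalues: 1 and 6, in some order
    ```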

    Theorem

    Let A be an n x n matrix.  Then the following are equivalent (TFAE):

    1. A is nonsingular.
       
    2. Ax  =  0  has only the trivial solution.
       
    3. A is row equivalent to I.
       
    4. Ax  =  b has a unique solution for all b.
       
    5. det(A) is nonzero.
       
    6. A has rank n.
       
    7. A has nullity 0.
       
    8. The rows of A are linearly independent.
       
    9. The columns of A are linearly independent.
       
    10. 0 is not an eigenvalue of A.
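    Statement 10 can be seen in action on a singular matrix.  A small sketch (the matrix below is our own example, chosen so that its rows are linearly dependent):

    ```python
    import numpy as np

    A = np.array([[1.0, 2.0], [2.0, 4.0]])  # second row = 2 * first row, so A is singular
    vals = np.sort(np.linalg.eigvals(A))

    assert abs(np.linalg.det(A)) < 1e-12   # det(A) = 0 ...
    assert np.allclose(vals, [0.0, 5.0])   # ... and 0 is an eigenvalue (the other is 5)
    ```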

     




     

     

    Eigenvalues and Eigenvectors is shared under a CC BY license and was authored, remixed, and/or curated by LibreTexts.
