
# 7.4: Existence of Eigenvalues


In what follows, we want to study the question of when eigenvalues exist for a given operator $$T$$. To answer this question, we will use polynomials $$p(z)\in \mathbb{F}[z]$$ evaluated on operators $$T \in \mathcal{L}(V,V)$$ (or, equivalently, on square matrices $$A \in \mathbb{F}^{n\times n}$$). More explicitly, given a polynomial

$p(z) = a_0 + a_1 z + \cdots + a_k z^k$

we can associate the operator

$p(T) = a_0 I_V + a_1 T + \cdots + a_k T^k.$

Note that, for $$p(z),q(z)\in \mathbb{F}[z]$$, we have

\begin{equation*}
(pq)(T) = p(T)q(T) = q(T)p(T).
\end{equation*}
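The commutation identity $$(pq)(T) = p(T)q(T) = q(T)p(T)$$ can be checked numerically. The following sketch (our own illustration; the helper `poly_of_matrix` and the particular matrix and coefficients are not from the text) evaluates polynomials on a square matrix standing in for $$T$$:

```python
import numpy as np

def poly_of_matrix(coeffs, A):
    """Return a_0 I + a_1 A + ... + a_k A^k for coeffs = [a_0, ..., a_k]."""
    n = A.shape[0]
    result = np.zeros_like(A, dtype=complex)
    power = np.eye(n, dtype=complex)   # A^0 = I
    for a in coeffs:
        result += a * power
        power = power @ A
    return result

# Arbitrary example matrix and polynomials (illustrative choices only).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
p = [1.0, -2.0, 1.0]   # p(z) = 1 - 2z + z^2
q = [0.0, 1.0, 5.0]    # q(z) = z + 5z^2

# Coefficients of the product polynomial (pq)(z).
pq = np.polynomial.polynomial.polymul(p, q)

# (pq)(A) = p(A) q(A) = q(A) p(A)
assert np.allclose(poly_of_matrix(pq, A), poly_of_matrix(p, A) @ poly_of_matrix(q, A))
assert np.allclose(poly_of_matrix(p, A) @ poly_of_matrix(q, A),
                   poly_of_matrix(q, A) @ poly_of_matrix(p, A))
```

The equalities hold because every power of $$A$$ commutes with every other power of $$A$$.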

The results of this section will be for complex vector spaces. This is because the proof of the existence of eigenvalues relies on the Fundamental Theorem of Algebra from Chapter 3, which makes a statement about the existence of zeroes of polynomials over $$\mathbb{C}$$.

Theorem 7.4.1: Existence

Let $$V\neq \{0\}$$ be a finite-dimensional vector space over $$\mathbb{C}$$, and let $$T\in\mathcal{L}(V,V)$$. Then $$T$$ has at least one eigenvalue.

Proof

Let $$v\in V$$ with $$v\neq 0$$, and consider the list of vectors

\begin{equation*}
(v,Tv,T^2v,\ldots,T^nv),
\end{equation*}

where $$n=\dim(V)$$. Since the list contains $$n+1$$ vectors, it must be linearly dependent. Hence, there exist scalars $$a_0,a_1,\ldots, a_n\in \mathbb{C}$$, not all zero, such that

\begin{equation*}
0 = a_0 v + a_1 Tv + a_2 T^2 v + \cdots + a_n T^n v.
\end{equation*}

Let $$m$$ be the largest index for which $$a_m\neq 0$$. Since $$v\neq 0$$, we must have $$m>0$$: if $$m=0$$, then $$a_0 v=0$$ with $$a_0\neq 0$$ would force $$v=0$$. (It is possible that $$m=n$$.) Consider the polynomial

\begin{equation*}
p(z) = a_0 + a_1 z + \cdots + a_m z^m.
\end{equation*}

By Theorem 3.2.3(3), $$p(z)$$ can be factored as
\begin{equation*}
p(z) = c(z-\lambda_1)\cdots (z-\lambda_m),
\end{equation*}

where $$c,\lambda_1,\ldots,\lambda_m\in \mathbb{C}$$ and $$c\neq 0$$.

Therefore,
\begin{equation*}
\begin{split}
0 &= a_0 v + a_1 Tv + a_2 T^2 v + \cdots + a_n T^n v = p(T)v\\
&= c(T-\lambda_1 I)(T-\lambda_2 I) \cdots (T-\lambda_m I)v,
\end{split}
\end{equation*}
and so, since $$v\neq 0$$ and $$c\neq 0$$, the composition $$(T-\lambda_1 I)\cdots (T-\lambda_m I)$$ sends a nonzero vector to zero. Because a composition of injective maps is itself injective, at least one of the factors $$T-\lambda_j I$$ must be noninjective. In other words, this $$\lambda_j$$ is an eigenvalue of $$T$$.
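The construction in the proof can be traced numerically. The sketch below (our own illustration; the random matrix `T` and vector `v` are arbitrary choices) builds the dependent list $$(v, Tv, \ldots, T^n v)$$, extracts a dependence relation from the null space, and checks that the resulting polynomial has an eigenvalue of $$T$$ among its roots:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 0j

# Build the list (v, Tv, ..., T^n v): these are n+1 vectors in C^n,
# hence linearly dependent.
K = np.empty((n, n + 1), dtype=complex)
K[:, 0] = v
for j in range(1, n + 1):
    K[:, j] = T @ K[:, j - 1]

# A nonzero a with K a = 0 encodes 0 = a_0 v + a_1 Tv + ... + a_n T^n v.
# The null space can be read off from the SVD (last right-singular vector).
_, _, Vh = np.linalg.svd(K)
a = Vh[-1].conj()
assert np.allclose(K @ a, 0)

# Roots of p(z) = a_0 + a_1 z + ... + a_m z^m; by the proof, at least one
# root is an eigenvalue of T (here, generically, all of them are).
roots = np.polynomial.polynomial.polyroots(np.trim_zeros(a, 'b'))
eigs = np.linalg.eigvals(T)
assert any(np.isclose(eigs, r, atol=1e-6).any() for r in roots)
```

For a randomly chosen $$v$$ the dependence relation generically has $$m=n$$, so $$p$$ is (up to a scalar) the characteristic polynomial of $$T$$ and every root is an eigenvalue.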

Note that the proof of Theorem 7.4.1 uses only basic concepts about linear maps, following the approach of the popular textbook Linear Algebra Done Right by Sheldon Axler. Many other textbooks rely on significantly more difficult proofs involving the determinant and the characteristic polynomial of a matrix. At the same time, the characteristic polynomial is often the preferable tool for actually computing eigenvalues and eigenvectors of an operator; we discuss this approach in Chapter 8.

Note also that Theorem 7.4.1 does not hold for real vector spaces. For example, as we saw in Example 7.2.2, the rotation operator $$R$$ on $$\mathbb{R}^2$$ has no eigenvalues.
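To see the real/complex distinction concretely, the following sketch takes the 90° rotation as a concrete instance of a rotation operator on $$\mathbb{R}^2$$ (an assumption on our part; Example 7.2.2 may use a general angle) and checks that its eigenvalues, computed over $$\mathbb{C}$$, are not real:

```python
import numpy as np

# 90-degree rotation of the plane: no real eigenvalues, since no nonzero
# vector is mapped to a real scalar multiple of itself.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Over C the eigenvalues exist, as Theorem 7.4.1 guarantees: they are +/- i.
eigs = np.linalg.eigvals(R)
assert np.allclose(sorted(eigs, key=lambda z: z.imag), [-1j, 1j])
assert not np.isreal(eigs).any()   # none of them is real
```

This is consistent with the theorem: the existence statement requires the scalar field to be $$\mathbb{C}$$, and the complex eigenvalues $$\pm i$$ simply fail to lie in $$\mathbb{R}$$.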
