# 3.7: Multiple Eigenvalues

It may very well happen that a matrix has some “repeated” eigenvalues. That is, the characteristic equation \(\det(A-\lambda I)=0\) may have repeated roots. As we have said before, this is actually unlikely to happen for a random matrix. If we take a small perturbation of \(A\) (we change the entries of \(A\) slightly), then we will get a matrix with distinct eigenvalues. As any system we want to solve in practice is an approximation to reality anyway, it is not absolutely indispensable to know how to solve these corner cases. Still, it does happen on occasion that it is easier, or desirable, to solve such a system directly.
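We can watch this splitting of repeated eigenvalues numerically. Here is a minimal sketch using Python's numpy (one of many ways to check this, not part of the development that follows):

```python
import numpy as np

# A matrix with the eigenvalue 3 repeated twice.
A = np.array([[3.0, 0.0],
              [0.0, 3.0]])
print(np.linalg.eigvals(A))  # [3. 3.]

# A tiny random perturbation almost surely splits the repeated
# eigenvalue into two distinct (possibly complex) eigenvalues.
rng = np.random.default_rng(seed=1)
perturbed = A + 1e-6 * rng.standard_normal((2, 2))
print(np.linalg.eigvals(perturbed))  # two distinct values near 3
```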

#### 3.7.1 Geometric multiplicity

Take the diagonal matrix

\[ A = \begin{bmatrix}3&0\\0&3 \end{bmatrix} \]

\(A\) has an eigenvalue 3 of multiplicity 2. We call the multiplicity of the eigenvalue in the characteristic equation the algebraic multiplicity. In this case, there also exist 2 linearly independent eigenvectors, \(\begin{bmatrix}1\\0 \end{bmatrix}\) and \(\begin{bmatrix} 0\\1 \end{bmatrix}\) corresponding to the eigenvalue 3. This means that the so-called geometric multiplicity of this eigenvalue is also 2.
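If you like to confirm such computations with a computer algebra system, a quick sketch in Python's sympy reports both multiplicities:

```python
import sympy as sp

A = sp.Matrix([[3, 0],
               [0, 3]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenvectors).
for val, alg_mult, vecs in A.eigenvects():
    # The number of independent eigenvectors is the geometric multiplicity.
    print(val, alg_mult, len(vecs))  # prints: 3 2 2
```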

In all the theorems where we required a matrix to have \(n\) distinct eigenvalues, we only really needed to have \(n\) linearly independent eigenvectors. For example, \(\vec{x}' = A \vec{x} \) has the general solution

\[\vec{x} = c_1 \begin{bmatrix} 1\\0 \end{bmatrix} e^{3t} + c_2 \begin{bmatrix} 0\\1 \end{bmatrix} e^{3t}. \]

Let us restate the theorem about real eigenvalues. In the following theorem we will repeat eigenvalues according to (algebraic) multiplicity. So for the above matrix \(A\), we would say that it has eigenvalues 3 and 3.

**Theorem 3.7.1.** Take \( \vec{x}' = P \vec{x} \). Suppose the matrix \(P\) is \(n\times n \), has \(n\) real eigenvalues (not necessarily distinct) \( \lambda_1, \cdots, \lambda_n \), and there are \(n\) linearly independent corresponding eigenvectors \(\vec{v_1}, \cdots, \vec{v_n} \). Then the general solution to the ODE can be written as:

\[\vec{x}=c_1\vec{v_1}e^{\lambda_1 t} + c_2 \vec{v_2}e^{\lambda_2 t} + \cdots + c_n \vec{v_n}e^{\lambda_n t}. \]

In other words, the hypothesis of the theorem could be stated as saying that if all the eigenvalues of \(P\) are complete (an eigenvalue is complete when its geometric multiplicity equals its algebraic multiplicity), then there are \(n\) linearly independent eigenvectors and thus we have the given general solution.

If the geometric multiplicity of an eigenvalue is 2 or greater, then the set of linearly independent eigenvectors is not unique up to multiples as it was before. For example, for the diagonal matrix \(A = \begin{bmatrix} 3&0 \\ 0&3 \end{bmatrix} \) we could also pick eigenvectors \(\begin{bmatrix} 1\\1 \end{bmatrix} \) and \( \begin{bmatrix} 1\\-1 \end{bmatrix} \), or in fact any pair of two linearly independent vectors. The number of linearly independent eigenvectors corresponding to \(\lambda\) is the number of free variables we obtain when solving \(A\vec{v} = \lambda \vec{v} \). We pick specific values for those free variables to obtain eigenvectors. If you pick different values, you may get different eigenvectors.
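Equivalently, the geometric multiplicity is the dimension of the nullspace of \(A - \lambda I\). A short sympy sketch of that computation:

```python
import sympy as sp

A = sp.Matrix([[3, 0],
               [0, 3]])
lam = 3

# Eigenvectors for lam are the nonzero vectors in the nullspace of
# (A - lam*I); its dimension is the geometric multiplicity.
basis = (A - lam * sp.eye(2)).nullspace()
print(len(basis))  # 2
print(basis)       # one particular choice; any two independent vectors work
```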

#### 3.7.2 Defective eigenvalues

If an \(n \times n\) matrix has fewer than \(n\) linearly independent eigenvectors, it is said to be deficient. Then there is at least one eigenvalue with an algebraic multiplicity that is higher than its geometric multiplicity. We call this eigenvalue defective, and the difference between the two multiplicities we call the defect.

**Example 3.7.1**

The matrix

\[ \begin{bmatrix} 3&1\\0&3 \end{bmatrix} \]

has an eigenvalue 3 of algebraic multiplicity 2. Let us try to compute eigenvectors; that is, let us solve \( (A - 3I)\vec{v} = \vec{0} \):

\[ \begin{bmatrix} 0&1\\0&0 \end{bmatrix} \begin{bmatrix} v_1\\v_2 \end{bmatrix} = \vec{0} \]

**Solution**

We must have that \( v_2 = 0 \). Hence any eigenvector is of the form \(\begin{bmatrix} v_1\\ 0 \end{bmatrix} \). Any two such vectors are linearly dependent, and hence the geometric multiplicity of the eigenvalue is 1. Therefore, the defect is 1, and we can no longer apply the eigenvalue method directly to a system of ODEs with such a coefficient matrix.
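A quick sympy check (a sketch, as before) confirms that only one independent eigenvector comes back:

```python
import sympy as sp

A = sp.Matrix([[3, 1],
               [0, 3]])

# Eigenvalue 3 has algebraic multiplicity 2 but only one eigenvector,
# so its geometric multiplicity is 1 and its defect is 2 - 1 = 1.
for val, alg_mult, vecs in A.eigenvects():
    print(val, alg_mult, len(vecs))  # prints: 3 2 1
```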

Let us continue with the example \( A = \begin{bmatrix} 3&1\\ 0&3 \end{bmatrix} \) and the equation \(\vec{x}' = A \vec{x} \). We have an eigenvalue \(\lambda =3\) of (algebraic) multiplicity 2 and defect 1. We have found one eigenvector \(\vec{v_1} = \begin{bmatrix} 1\\ 0 \end{bmatrix} \). We have the solution

\[\vec{x_1} = \vec{v_1} e^{3t} \]

In this case, let us try (in the spirit of repeated roots of the characteristic equation for a single equation) another solution of the form

\[ \vec{x_2} = ( \vec{v_2} + \vec{v_1} t ) e^{3t} \]

We differentiate to get

\[ \vec{x_2}' = \vec{v_1} e^{3t} + 3(\vec{v_2} + \vec{v_1}t)e^{3t} = (3\vec{v_2}+\vec{v_1})e^{3t} + 3\vec{v_1}te^{3t}. \]

As we are assuming that \(\vec{x_2}\) is a solution, \(\vec{x_2}' \) must equal \(A\vec{x_2} \), and

\[ A \vec{x_2} = A(\vec{v_2} + \vec{v_1}t)e^{3t} = A\vec{v_2}e^{3t} + A\vec{v_1}te^{3t} \]

By looking at the coefficients of \(e^{3t}\)and \(te^{3t} \) we see \(3\vec{v_2} + \vec{v_1} = A\vec{v_2} \) and \(3\vec{v_1} = A\vec{v_1} \). This means that

\[(A- 3I)\vec{v_2} = \vec{v_1} {\rm{~and~}} (A - 3I)\vec{v_1} = \vec{0} \]

Therefore, \(\vec{x_2} \) is a solution if these two equations are satisfied. We know the second of these two equations is satisfied as \(\vec{v_1} \) is an eigenvector. If we plug the first equation into the second we obtain

\[ (A - 3I )(A - 3I)\vec{v_2} = \vec{0} {\rm{~or~}} (A - 3I)^2\vec{v_2} = \vec{0} \]

If we can, therefore, find a \( \vec{v_2} \) that solves \( (A -3I)^2 \vec{v_2} = \vec{0} \) and such that \( (A-3I) \vec{v_2} = \vec{v_1} \), then we are done. This is just a bunch of linear equations to solve and we are by now very good at that.

We notice that in this simple case \((A-3I)^2 \) is just the zero matrix (exercise). Hence, any vector \(\vec{v_2} \) solves \( (A-3I)^2 \vec{v_2} = \vec{0} \). We just have to make sure that \( (A-3I)\vec{v_2}=\vec{v_1} \). Write

\[ \begin{bmatrix} 0&1\\ 0&0 \end{bmatrix} \begin{bmatrix}a\\b \end{bmatrix} = \begin{bmatrix} 1\\0 \end{bmatrix} \]

By inspection we see that letting \(a = 0\) (\(a\) could be anything in fact) and \(b=1\) does the job. Hence we can take \( \vec{v_2}=\begin{bmatrix} 0\\1 \end{bmatrix} \). Our general solution to \(\vec{x}' = A \vec{x} \) is

\[ \vec{x} = c_1 \begin{bmatrix}1\\0 \end{bmatrix}e^{3t} + c_2 \left( \begin{bmatrix}0\\1 \end{bmatrix} + \begin{bmatrix}1\\0 \end{bmatrix}t \right) e^{3t} = \begin{bmatrix} c_1 e^{3t} + c_2 t e^{3t} \\ c_2e^{3t} \end{bmatrix}. \]

Let us check that we really do have the solution. First \( x_1' = 3c_1 e^{3t} + c_2e^{3t} + 3c_2te^{3t} = 3x_1 + x_2 \). Good. Now \( x_2' = 3c_2e^{3t} = 3x_2 \). Good.
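The same check can be done symbolically. Here is one possible sympy verification:

```python
import sympy as sp

t, c1, c2 = sp.symbols("t c1 c2")
A = sp.Matrix([[3, 1],
               [0, 3]])

# The general solution we found above.
x = sp.Matrix([c1 * sp.exp(3 * t) + c2 * t * sp.exp(3 * t),
               c2 * sp.exp(3 * t)])

# x' - A*x should simplify to the zero vector.
print(sp.simplify(x.diff(t) - A * x))  # Matrix([[0], [0]])
```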

Note that the system \( \vec{x}' = A \vec{x} \) has a simpler solution since \(A\) is a so-called upper triangular matrix, that is, every entry below the main diagonal is zero. In particular, the equation for \( x_2 \) does not depend on \(x_1\). Mind you, not every defective matrix is triangular.

**Exercise 3.7.1**

Solve \( \vec{x}' = \begin{bmatrix} 3&1\\ 0&3 \end{bmatrix} \vec{x} \) by first solving for \(x_2\) and then for \( x_1 \) independently. Check that you got the same solution as we did above.

Let us describe the general algorithm. Suppose that \(\lambda \) is an eigenvalue of multiplicity 2, defect 1. First find an eigenvector \(\vec{v_1} \) of \( \lambda\). Then, find a vector \( \vec{v_2} \) such that

\[ (A - \lambda I)\vec{v_2} = \vec{v_1}. \]

This gives us two linearly independent solutions

\[ \vec{x_1} = \vec{v_1} e^{\lambda t} \]

\[ \vec{x_2} = ( \vec{v_2} + \vec{v_1} t )e^{\lambda t} \]
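The recipe is mechanical enough to automate. Below is a minimal sketch in sympy; the helper name `defect_one_solutions` is ours, and we assume (without checking) that the eigenvalue really does have multiplicity 2 and defect 1:

```python
import sympy as sp

t = sp.symbols("t")

def defect_one_solutions(A, lam):
    """Return the two solutions x1, x2 above for an eigenvalue lam
    of algebraic multiplicity 2 and defect 1 (assumed, not checked)."""
    B = A - lam * sp.eye(A.rows)
    v1 = B.nullspace()[0]              # eigenvector: (A - lam*I) v1 = 0
    # Solve (A - lam*I) v2 = v1; the system is singular, so take any
    # particular solution by setting the free parameters to zero.
    sol, params = B.gauss_jordan_solve(v1)
    v2 = sol.subs({p: 0 for p in params})
    x1 = v1 * sp.exp(lam * t)
    x2 = (v2 + v1 * t) * sp.exp(lam * t)
    return x1, x2

A = sp.Matrix([[3, 1],
               [0, 3]])
print(defect_one_solutions(A, 3))  # recovers x1 and x2 from the example
```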

This machinery can also be generalized to higher multiplicities and higher defects. We will not go over this method in detail, but let us just sketch the ideas. Suppose that \(A\) has an eigenvalue \( \lambda \) of multiplicity \(m\). We find vectors \(\vec{v}\) such that

\[ (A - \lambda I)^k \vec{v} = \vec{0} {\rm{~but~}} (A - \lambda I)^{k-1} \vec{v} \neq \vec{0}. \]

Such vectors are called generalized eigenvectors. For every eigenvector \(\vec{v}_{1} \) we find a chain of generalized eigenvectors \( \vec{v}_{2} \) through \(\vec{v}_{k} \) such that:

\[ (A - \lambda I)\vec{v}_{1} = \vec{0}, \]

\[ (A - \lambda I)\vec{v}_{2} = \vec{v}_{1}, \]

\[ \vdots \]

\[ (A - \lambda I)\vec{v}_{k} = \vec{v}_{k-1}. \]

We form the linearly independent solutions

\[ \vec{x_1} = \vec{v_1} e^{\lambda t} \]

\[ \vec{x_2} = ( \vec{v_2} + \vec{v_1} t) e^{\lambda t } \]

\[ \vdots \]

\[ \vec{x_k} = \left( \vec{v_k}+ \vec{v}_{k-1} t + \vec{v}_{k-2} \frac{t^2}{2} + \cdots + \vec{v}_{2} \frac{t^{k-2}}{(k-2)!} + \vec{v}_{1}\frac{t^{k-1}}{(k-1)!} \right) e^{\lambda t} \]

Recall that \( k! = 1 \cdot 2 \cdot 3 \cdots (k-1) \cdot k \) is the factorial. We proceed to find chains until we form \( m \) linearly independent solutions (\( m \) is the multiplicity). You may need to find several chains for every eigenvalue.
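In practice a computer algebra system can produce these chains for us. For instance, sympy's Jordan form packages the eigenvectors and generalized eigenvectors into the columns of a matrix; a short sketch:

```python
import sympy as sp

A = sp.Matrix([[3, 1],
               [0, 3]])

# A = P * J * P**-1; each Jordan block of J corresponds to one chain,
# and the matching columns of P are v1, v2, ... of that chain.
P, J = A.jordan_form()
print(P)  # columns: the eigenvector v1 and generalized eigenvector v2
print(J)  # Matrix([[3, 1], [0, 3]]) -- a single Jordan block
```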

### Contributors

- Jiří Lebl (Oklahoma State University). These pages were supported by NSF grants DMS-0900885 and DMS-1362337.