
5.7: Multiple Eigenvalues


It may very well happen that a matrix has some “repeated” eigenvalues. That is, the characteristic equation \(\det(A - \lambda I) = 0\) may have repeated roots. As we have said before, this is actually unlikely to happen for a random matrix. If we take a small perturbation of \(A\) (we change the entries of \(A\) slightly), then we will get a matrix with distinct eigenvalues. As any system we will want to solve in practice is an approximation to reality anyway, it is not indispensable to know how to solve these corner cases. On the other hand, these cases do come up in applications from time to time. Furthermore, if we have distinct but very close eigenvalues, the behavior is similar to that of repeated eigenvalues, and so understanding that case will give us insight into what is going on.
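As a quick numerical illustration of this point (the specific matrix and perturbation size below are our own illustrative choices, not from the text), a repeated eigenvalue typically splits into distinct nearby eigenvalues under a tiny random perturbation:

```python
import numpy as np

# An illustrative matrix with the single repeated eigenvalue 3.
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])
print(np.linalg.eigvals(A))  # [3. 3.]

# A tiny random perturbation typically splits the repeated root into
# two distinct (possibly complex) eigenvalues close to 3.
rng = np.random.default_rng(0)
A_perturbed = A + 1e-6 * rng.standard_normal((2, 2))
print(np.linalg.eigvals(A_perturbed))
```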

Geometric Multiplicity

Take the diagonal matrix

\[ A = \begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix} \]

\(A\) has an eigenvalue 3 of multiplicity 2. We call the multiplicity of the eigenvalue in the characteristic equation the algebraic multiplicity. In this case, there also exist 2 linearly independent eigenvectors, \(\begin{bmatrix} 1 \\ 0 \end{bmatrix}\) and \(\begin{bmatrix} 0 \\ 1 \end{bmatrix}\), corresponding to the eigenvalue 3. This means that the so-called geometric multiplicity of this eigenvalue is also 2.

In all the theorems where we required a matrix to have \(n\) distinct eigenvalues, we only really needed to have \(n\) linearly independent eigenvectors. For example, \(\vec{x}\,' = A \vec{x}\) has the general solution

\[ \vec{x} = c_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} e^{3t} + c_2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} e^{3t} . \]
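A brief NumPy check of this example (a sketch, not part of the original text): the eigenvalue 3 is reported twice, and the returned eigenvectors are linearly independent, so the geometric multiplicity is indeed 2.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 3.0]])

# The eigenvalue 3 appears twice, and the two returned eigenvectors
# (the columns of `vecs`) are linearly independent.
vals, vecs = np.linalg.eig(A)
print(vals)                         # [3. 3.]
print(np.linalg.matrix_rank(vecs))  # 2
```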

The geometric multiplicity of an eigenvalue is the number of linearly independent eigenvectors corresponding to it. The geometric multiplicity is always less than or equal to the algebraic multiplicity. We have handled the case when these two multiplicities are equal. If the geometric multiplicity is equal to the algebraic multiplicity, then we say the eigenvalue is complete.

In other words, the hypothesis of the theorem on distinct eigenvalues could be restated as follows: if all the eigenvalues of the coefficient matrix \(P\) are complete, then there are \(n\) linearly independent eigenvectors and thus we have the given general solution.

If the geometric multiplicity of an eigenvalue is 2 or greater, then the set of linearly independent eigenvectors is not unique up to multiples as it was before. For example, for the diagonal matrix \(A = \begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix}\) we could also pick eigenvectors \(\begin{bmatrix} 1 \\ 1 \end{bmatrix}\) and \(\begin{bmatrix} 1 \\ -1 \end{bmatrix}\), or in fact any pair of two linearly independent vectors. The number of linearly independent eigenvectors corresponding to \(\lambda\) is the number of free variables we obtain when solving \(A\vec{v} = \lambda \vec{v}\). We pick specific values for those free variables to obtain eigenvectors. If you pick different values, you may get different eigenvectors.

Defective Eigenvalues

If an n×n matrix has less than n linearly independent eigenvectors, it is said to be deficient. Then there is at least one eigenvalue with an algebraic multiplicity that is higher than its geometric multiplicity. We call this eigenvalue defective and the difference between the two multiplicities we call the defect.
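As an illustrative check (our own example matrix, not one worked in detail in the text), SymPy can be used to read off both multiplicities, and hence the defect:

```python
import sympy as sp

# Illustrative defective matrix: the eigenvalue 3 has algebraic
# multiplicity 2 but only a one-dimensional eigenspace (defect 1).
A = sp.Matrix([[3, 1],
               [0, 3]])
lam = sp.symbols('lambda')

print(sp.factor((A - lam * sp.eye(2)).det()))  # (lambda - 3)**2 -> algebraic multiplicity 2
print(len((A - 3 * sp.eye(2)).nullspace()))    # 1 eigenvector -> defect 2 - 1 = 1
```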

The next theorem shows what to do in this situation.

Theorem 5.7.1

Suppose the \(n \times n\) matrix \(A\) has an eigenvalue \(\lambda_1\) of multiplicity 2 and the associated eigenspace has dimension 1; that is, all \(\lambda_1\)-eigenvectors of \(A\) are scalar multiples of an eigenvector \(\mathbf{x}\). Then there are infinitely many vectors \(\mathbf{u}\) such that

\[(A - \lambda_1 I)\mathbf{u} = \mathbf{x}.\]

Moreover, if \(\mathbf{u}\) is any such vector, then

\[\mathbf{y}_1 = \mathbf{x} e^{\lambda_1 t} \quad \text{and} \quad \mathbf{y}_2 = \mathbf{u} e^{\lambda_1 t} + \mathbf{x} t e^{\lambda_1 t}\]

are linearly independent solutions of \(\mathbf{y}' = A\mathbf{y}\).

 

Example 5.7.2

Use Theorem 5.7.1 to find the general solution of the system

\[\mathbf{y}' = \begin{bmatrix} 11 & -25 \\ 4 & -9 \end{bmatrix} \mathbf{y}\]

considered in Example 5.7.1.

Solution

In Example 5.7.1 we saw that \(\lambda_1 = 1\) is an eigenvalue of multiplicity 2 of the coefficient matrix \(A\) of this system, and that all of the eigenvectors of \(A\) are multiples of

\[\mathbf{x} = \begin{bmatrix} 5 \\ 2 \end{bmatrix}.\]

Therefore

\[\mathbf{y}_1 = \begin{bmatrix} 5 \\ 2 \end{bmatrix} e^{t}\]

is a solution of the system. From Theorem 5.7.1, a second solution is given by \(\mathbf{y}_2 = \mathbf{u} e^t + \mathbf{x} t e^t\), where \((A - I)\mathbf{u} = \mathbf{x}\). The augmented matrix of this system is

\[\left[\begin{array}{rr|r} 10 & -25 & 5 \\ 4 & -10 & 2 \end{array}\right],\]

which is row equivalent to

\[\left[\begin{array}{rr|r} 1 & -\frac{5}{2} & \frac{1}{2} \\ 0 & 0 & 0 \end{array}\right].\]

Therefore the components of \(\mathbf{u}\) must satisfy

\[u_1 - \frac{5}{2} u_2 = \frac{1}{2},\]

where \(u_2\) is arbitrary. We choose \(u_2 = 0\), so that \(u_1 = 1/2\) and

\[\mathbf{u} = \begin{bmatrix} \frac{1}{2} \\ 0 \end{bmatrix}.\]

Thus,

\[\mathbf{y}_2 = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \frac{e^t}{2} + \begin{bmatrix} 5 \\ 2 \end{bmatrix} t e^t.\]

Since \(\mathbf{y}_1\) and \(\mathbf{y}_2\) are linearly independent by Theorem 5.7.1, they form a fundamental set of solutions. Therefore the general solution is

\[\mathbf{y} = c_1 \begin{bmatrix} 5 \\ 2 \end{bmatrix} e^t + c_2 \left( \begin{bmatrix} 1 \\ 0 \end{bmatrix} \frac{e^t}{2} + \begin{bmatrix} 5 \\ 2 \end{bmatrix} t e^t \right).\]

Note that choosing the arbitrary constant \(u_2\) to be nonzero is equivalent to adding a scalar multiple of \(\mathbf{y}_1\) to the second solution \(\mathbf{y}_2\) (Exercise 10.5.33).
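The arithmetic above is easy to verify with a computer algebra system. The following SymPy sketch (using the matrix entries as given in this example) checks that \(\mathbf{u}\) satisfies \((A - I)\mathbf{u} = \mathbf{x}\) and that \(\mathbf{y}_2\) really solves \(\mathbf{y}' = A\mathbf{y}\):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[11, -25],
               [ 4,  -9]])

x = sp.Matrix([5, 2])                  # eigenvector for lambda_1 = 1
u = sp.Matrix([sp.Rational(1, 2), 0])  # the vector u found above

# (A - I)u reproduces the eigenvector x ...
print((A - sp.eye(2)) * u == x)        # True

# ... and y2 = u e^t + x t e^t satisfies y' = A y.
y2 = u * sp.exp(t) + x * t * sp.exp(t)
print(sp.simplify(y2.diff(t) - A * y2))  # zero vector
```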

Example 5.7.3

Find the general solution of

\[\mathbf{y}' = \begin{bmatrix} 3 & 4 & -10 \\ 2 & 1 & -2 \\ 2 & 2 & -5 \end{bmatrix} \mathbf{y}.\]

Solution

The characteristic polynomial of the coefficient matrix \(A\) of this system is

\[\begin{vmatrix} 3-\lambda & 4 & -10 \\ 2 & 1-\lambda & -2 \\ 2 & 2 & -5-\lambda \end{vmatrix} = -(\lambda - 1)(\lambda + 1)^2.\]

Hence, the eigenvalues are \(\lambda_1 = 1\) with multiplicity 1 and \(\lambda_2 = -1\) with multiplicity 2. Eigenvectors associated with \(\lambda_1 = 1\) must satisfy \((A - I)\mathbf{x} = 0\). The augmented matrix of this system is

\[\left[\begin{array}{rrr|r} 2 & 4 & -10 & 0 \\ 2 & 0 & -2 & 0 \\ 2 & 2 & -6 & 0 \end{array}\right],\]

which is row equivalent to

\[\left[\begin{array}{rrr|r} 1 & 0 & -1 & 0 \\ 0 & 1 & -2 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right].\]

Hence, \(x_1 = x_3\) and \(x_2 = 2x_3\), where \(x_3\) is arbitrary. Choosing \(x_3 = 1\) yields the eigenvector

\[\mathbf{x}_1 = \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix}.\]

Therefore

\[\mathbf{y}_1 = \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} e^{t}\]

is a solution of the system. Eigenvectors associated with \(\lambda_2 = -1\) satisfy \((A + I)\mathbf{x} = 0\). The augmented matrix of this system is

\[\left[\begin{array}{rrr|r} 4 & 4 & -10 & 0 \\ 2 & 2 & -2 & 0 \\ 2 & 2 & -4 & 0 \end{array}\right],\]

which is row equivalent to

\[\left[\begin{array}{rrr|r} 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right].\]

Hence, \(x_3 = 0\) and \(x_1 = -x_2\), where \(x_2\) is arbitrary. Choosing \(x_2 = 1\) yields the eigenvector

\[\mathbf{x}_2 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix},\]

so

\[\mathbf{y}_2 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} e^{-t}\]

is a solution of the system. Since all the eigenvectors of \(A\) associated with \(\lambda_2 = -1\) are multiples of \(\mathbf{x}_2\), we must now use Theorem 5.7.1 to find a third solution in the form

\[\mathbf{y}_3 = \mathbf{u} e^{-t} + \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} t e^{-t},\]

where \(\mathbf{u}\) is a solution of \((A + I)\mathbf{u} = \mathbf{x}_2\). The augmented matrix of this system is

\[\left[\begin{array}{rrr|r} 4 & 4 & -10 & -1 \\ 2 & 2 & -2 & 1 \\ 2 & 2 & -4 & 0 \end{array}\right],\]

which is row equivalent to

\[\left[\begin{array}{rrr|r} 1 & 1 & 0 & 1 \\ 0 & 0 & 1 & \frac{1}{2} \\ 0 & 0 & 0 & 0 \end{array}\right].\]

Hence, \(u_3 = 1/2\) and \(u_1 = 1 - u_2\), where \(u_2\) is arbitrary. Choosing \(u_2 = 0\) yields

\[\mathbf{u} = \begin{bmatrix} 1 \\ 0 \\ \frac{1}{2} \end{bmatrix},\]

and substituting this into the expression for \(\mathbf{y}_3\) yields the solution

\[\mathbf{y}_3 = \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} \frac{e^{-t}}{2} + \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} t e^{-t}\]

of the system. Since the Wronskian of \(\{\mathbf{y}_1, \mathbf{y}_2, \mathbf{y}_3\}\) at \(t = 0\) is

\[\begin{vmatrix} 1 & -1 & 1 \\ 2 & 1 & 0 \\ 1 & 0 & \frac{1}{2} \end{vmatrix} = \frac{1}{2},\]

\(\{\mathbf{y}_1, \mathbf{y}_2, \mathbf{y}_3\}\) is a fundamental set of solutions. Therefore the general solution of the system is

\[\mathbf{y} = c_1 \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} e^{t} + c_2 \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} e^{-t} + c_3 \left( \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} \frac{e^{-t}}{2} + \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} t e^{-t} \right).\]
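As a sanity check, the following SymPy sketch reproduces the key facts of this example: the characteristic polynomial, the one-dimensional eigenspace for \(\lambda_2 = -1\), and the fact that \(\mathbf{y}_3\) solves the system.

```python
import sympy as sp

t, lam = sp.symbols('t lambda')
A = sp.Matrix([[3, 4, -10],
               [2, 1,  -2],
               [2, 2,  -5]])

# Characteristic polynomial: -(lambda - 1)*(lambda + 1)**2
print(sp.factor((A - lam * sp.eye(3)).det()))

# The eigenspace of lambda_2 = -1 is one-dimensional, so a generalized
# eigenvector is needed for the third solution.
print(len((A + sp.eye(3)).nullspace()))   # 1

# Verify the third solution y3 = u e^{-t} + x2 t e^{-t} found above.
x2 = sp.Matrix([-1, 1, 0])
u = sp.Matrix([1, 0, sp.Rational(1, 2)])
y3 = u * sp.exp(-t) + x2 * t * sp.exp(-t)
print(sp.simplify(y3.diff(t) - A * y3))   # zero vector
```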


Note that a system \(\vec{x}\,' = A \vec{x}\) in which \(A\) is upper triangular (that is, every entry below the diagonal is zero), such as the one in the exercise below, has a simpler solution method: the equation for \(x_2\) does not depend on \(x_1\), so we can solve for \(x_2\) first and then for \(x_1\). Mind you, not every defective matrix is triangular.

Exercise 5.7.1

Solve \(\vec{x}\,' = \begin{bmatrix} 3 & 1 \\ 0 & 3 \end{bmatrix} \vec{x}\) by first solving for \(x_2\) and then for \(x_1\) independently. Check that your answer agrees with the solution produced by the generalized eigenvector method described below.

Let us describe the general algorithm. Suppose that \(\lambda\) is an eigenvalue of multiplicity 2, defect 1. First find an eigenvector \(\vec{v}_1\) of \(\lambda\). Then, find a vector \(\vec{v}_2\) such that

\[(A - \lambda I)\vec{v}_2 = \vec{v}_1.\]

This gives us two linearly independent solutions

\[\begin{aligned} \vec{x}_1 &= \vec{v}_1 e^{\lambda t}, \\ \vec{x}_2 &= \left( \vec{v}_2 + \vec{v}_1 t \right) e^{\lambda t}. \end{aligned}\]
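For readers who want to automate this bookkeeping, here is a small SymPy sketch of the two-step procedure for the multiplicity-2, defect-1 case. The helper name `chain_solutions` is ours, not a standard function; it is shown applied to the upper triangular matrix from the exercise above.

```python
import sympy as sp

def chain_solutions(A, lam):
    """Sketch for an eigenvalue `lam` of algebraic multiplicity 2 and defect 1:
    return (v1, v2) with (A - lam*I) v1 = 0 and (A - lam*I) v2 = v1."""
    B = A - lam * sp.eye(A.rows)
    v1 = B.nullspace()[0]                  # the (essentially unique) eigenvector
    v2, params = B.gauss_jordan_solve(v1)  # solve the singular system for v2
    return v1, v2.subs({p: 0 for p in params})  # set the free variables to 0

# Applied to the upper triangular matrix from the exercise above.
A = sp.Matrix([[3, 1],
               [0, 3]])
v1, v2 = chain_solutions(A, 3)

t = sp.symbols('t')
x1 = v1 * sp.exp(3 * t)              # x1 = v1 e^{3t}
x2 = (v2 + v1 * t) * sp.exp(3 * t)   # x2 = (v2 + v1 t) e^{3t}
print(v1.T, v2.T)                    # Matrix([[1, 0]]) Matrix([[0, 1]])
```

Setting the free parameters to zero just picks one particular \(\vec{v}_2\); any other choice differs by a multiple of \(\vec{v}_1\) and leads to an equivalent general solution.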

Example 5.7.4

Consider the system \[\vec{x}\,' = \begin{bmatrix} 2 & 5 & 0 \\ 0 & 2 & 0 \\ 1 & 4 & 1 \end{bmatrix} \vec{x}.\] Compute the eigenvalues and find the general solution.

Solution

\[0 = \det(A - \lambda I) = \det\left( \begin{bmatrix} 2-\lambda & 5 & 0 \\ 0 & 2-\lambda & 0 \\ 1 & 4 & 1-\lambda \end{bmatrix} \right) = (2-\lambda)^2(1-\lambda).\] The eigenvalues are 1 and 2, where 2 has multiplicity 2. We leave it to the reader to find that \(\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}\) is an eigenvector for the eigenvalue \(\lambda = 1\).

Let’s focus on \(\lambda = 2\). We compute eigenvectors:
\[\vec{0} = (A - 2I)\vec{v} = \begin{bmatrix} 0 & 5 & 0 \\ 0 & 0 & 0 \\ 1 & 4 & -1 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix}.\]
The first equation says that \(v_2 = 0\), so the last equation is \(v_1 - v_3 = 0\). Let \(v_3\) be the free variable to find that \(v_1 = v_3\). Perhaps let \(v_3 = 1\) to find an eigenvector \(\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}\). Problem is that setting \(v_3\) to anything else just gets multiples of this vector and so we have a defect of 1. Let \(\vec{v}_1\) be the eigenvector and let’s look for a generalized eigenvector \(\vec{v}_2\):
\[(A - 2I)\vec{v}_2 = \vec{v}_1,\]
or
\[\begin{bmatrix} 0 & 5 & 0 \\ 0 & 0 & 0 \\ 1 & 4 & -1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix},\]
where we used \(a\), \(b\), \(c\) as the components of \(\vec{v}_2\) for simplicity. The first equation says \(5b = 1\), so \(b = \frac{1}{5}\). The second equation says nothing. The last equation is \(a + 4b - c = 1\), or \(a + \frac{4}{5} - c = 1\), or \(a - c = \frac{1}{5}\). We let \(c\) be the free variable and we choose \(c = 0\). We find
\[\vec{v}_2 = \begin{bmatrix} \frac{1}{5} \\ \frac{1}{5} \\ 0 \end{bmatrix}.\]

The general solution is therefore \[\vec{x} = c_1 \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} e^{t} + c_2 \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} e^{2t} + c_3 \left( \begin{bmatrix} \frac{1}{5} \\ \frac{1}{5} \\ 0 \end{bmatrix} + \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} t \right) e^{2t}.\]
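A quick SymPy verification of the eigenvector and generalized eigenvector found above (using the matrix entries as reconstructed in this example):

```python
import sympy as sp

A = sp.Matrix([[2, 5, 0],
               [0, 2, 0],
               [1, 4, 1]])
B = A - 2 * sp.eye(3)

v1 = sp.Matrix([1, 0, 1])                                  # eigenvector for lambda = 2
v2 = sp.Matrix([sp.Rational(1, 5), sp.Rational(1, 5), 0])  # generalized eigenvector

print(B * v1)        # zero vector: v1 is an eigenvector
print(B * v2 == v1)  # True: (A - 2I) v2 = v1
print((A - sp.eye(3)) * sp.Matrix([0, 0, 1]))  # zero vector: eigenvector for lambda = 1
```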

This machinery can also be generalized to higher multiplicities and higher defects. We will not go over this method in detail, but let us just sketch the ideas. Suppose that \(A\) has an eigenvalue \(\lambda\) of multiplicity \(m\). We find vectors \(\vec{v}\) such that

\[(A - \lambda I)^k \vec{v} = \vec{0}, \quad \text{but} \quad (A - \lambda I)^{k-1} \vec{v} \neq \vec{0}.\]

Such vectors are called generalized eigenvectors (then \(\vec{v}_1 = (A - \lambda I)^{k-1} \vec{v}\) is an eigenvector). For every eigenvector \(\vec{v}_1\) we find a chain of generalized eigenvectors \(\vec{v}_2\) through \(\vec{v}_k\) such that:

\[\begin{aligned} (A - \lambda I)\vec{v}_1 &= \vec{0}, \\ (A - \lambda I)\vec{v}_2 &= \vec{v}_1, \\ &\;\;\vdots \\ (A - \lambda I)\vec{v}_k &= \vec{v}_{k-1}. \end{aligned}\]

Really, once you find the \(\vec{v}_k\) such that \((A - \lambda I)^k \vec{v}_k = \vec{0}\) but \((A - \lambda I)^{k-1} \vec{v}_k \neq \vec{0}\), you find the entire chain, since you can compute the rest: \(\vec{v}_{k-1} = (A - \lambda I)\vec{v}_k\), \(\vec{v}_{k-2} = (A - \lambda I)\vec{v}_{k-1}\), etc. We form the linearly independent solutions

\[\begin{aligned}
\vec{x}_1 &= \vec{v}_1 e^{\lambda t}, \\
\vec{x}_2 &= (\vec{v}_2 + \vec{v}_1 t) e^{\lambda t}, \\
&\;\;\vdots \\
\vec{x}_k &= \left( \vec{v}_k + \vec{v}_{k-1} t + \vec{v}_{k-2} \frac{t^2}{2} + \cdots + \vec{v}_2 \frac{t^{k-2}}{(k-2)!} + \vec{v}_1 \frac{t^{k-1}}{(k-1)!} \right) e^{\lambda t}.
\end{aligned}\]

Recall that \(k! = 1 \cdot 2 \cdot 3 \cdots (k-1) \cdot k\) is the factorial. If you have an eigenvalue of geometric multiplicity \(\ell\), you will have to find \(\ell\) such chains (some of them might be short: just the single eigenvector equation). We go until we form \(m\) linearly independent solutions, where \(m\) is the algebraic multiplicity. We don’t quite know which specific eigenvectors go with which chain, so start by finding \(\vec{v}_k\) first for the longest possible chain and go from there.

For example, if \(\lambda\) is an eigenvalue of \(A\) of algebraic multiplicity 3 and defect 2, then solve \[(A - \lambda I)\vec{v}_1 = \vec{0}, \quad (A - \lambda I)\vec{v}_2 = \vec{v}_1, \quad (A - \lambda I)\vec{v}_3 = \vec{v}_2.\] That is, find \(\vec{v}_3\) such that \((A - \lambda I)^3 \vec{v}_3 = \vec{0}\), but \((A - \lambda I)^2 \vec{v}_3 \neq \vec{0}\). Then you are done, as \(\vec{v}_2 = (A - \lambda I)\vec{v}_3\) and \(\vec{v}_1 = (A - \lambda I)\vec{v}_2\). The 3 linearly independent solutions are \[\vec{x}_1 = \vec{v}_1 e^{\lambda t}, \quad \vec{x}_2 = (\vec{v}_2 + \vec{v}_1 t) e^{\lambda t}, \quad \vec{x}_3 = \left( \vec{v}_3 + \vec{v}_2 t + \vec{v}_1 \frac{t^2}{2} \right) e^{\lambda t}.\]
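This recipe is easy to carry out with a computer algebra system. The sketch below uses an illustrative matrix of our own choosing (a single \(3 \times 3\) Jordan block, not an example from the text) with algebraic multiplicity 3 and defect 2: it finds \(\vec{v}_3\) from the nullspace of \((A - \lambda I)^3\), walks down the chain, and checks that \(\vec{x}_3\) solves \(\vec{x}\,' = A\vec{x}\).

```python
import sympy as sp

# Illustrative matrix (not from the text): a single Jordan block with
# eigenvalue 2, algebraic multiplicity 3, and defect 2.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 1],
               [0, 0, 2]])
lam = 2
B = A - lam * sp.eye(3)

# Find v3 with B^3 v3 = 0 but B^2 v3 != 0, then walk down the chain.
v3 = next(v for v in (B**3).nullspace() if (B**2) * v != sp.zeros(3, 1))
v2 = B * v3
v1 = B * v2

t = sp.symbols('t')
x1 = v1 * sp.exp(lam * t)
x2 = (v2 + v1 * t) * sp.exp(lam * t)
x3 = (v3 + v2 * t + v1 * t**2 / 2) * sp.exp(lam * t)
print(sp.simplify(x3.diff(t) - A * x3))  # zero vector: x3 solves x' = A x
```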

If on the other hand \(A\) has an eigenvalue \(\lambda\) of algebraic multiplicity 3 and defect 1, then solve \[(A - \lambda I)\vec{v}_1 = \vec{0}, \quad (A - \lambda I)\vec{v}_2 = \vec{0}, \quad (A - \lambda I)\vec{v}_3 = \vec{v}_2.\] Here \(\vec{v}_1\) and \(\vec{v}_2\) are actual honest eigenvectors, and \(\vec{v}_3\) is a generalized eigenvector. So there are two chains. To solve, first find a \(\vec{v}_3\) such that \((A - \lambda I)^2 \vec{v}_3 = \vec{0}\), but \((A - \lambda I)\vec{v}_3 \neq \vec{0}\). Then \(\vec{v}_2 = (A - \lambda I)\vec{v}_3\) is going to be an eigenvector. Then solve for an eigenvector \(\vec{v}_1\) that is linearly independent from \(\vec{v}_2\). You get 3 linearly independent solutions \[\vec{x}_1 = \vec{v}_1 e^{\lambda t}, \quad \vec{x}_2 = \vec{v}_2 e^{\lambda t}, \quad \vec{x}_3 = (\vec{v}_3 + \vec{v}_2 t) e^{\lambda t}.\]


This page titled 5.7: Multiple Eigenvalues is shared under a CC BY-SA 4.0 license and was authored, remixed, and/or curated by Jiří Lebl.
