# 5: Eigenvalues and Eigenvectors


Solve the matrix equation \(Ax=\lambda x.\)

This chapter constitutes the core of any first course on linear algebra: eigenvalues and eigenvectors play a crucial role in most real-world applications of the subject.
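To see what the equation \(Ax=\lambda x\) says in a concrete case, consider the following small example (a hypothetical \(2\times 2\) matrix, not taken from this chapter):

\[A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad x = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad Ax = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3x,\]

so multiplying \(x\) by \(A\) simply scales it by \(3\): the vector \(x\) is an eigenvector of \(A\) with eigenvalue \(\lambda = 3\).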

Example \(\PageIndex{1}\)

In a population of rabbits,

- half of the newborn rabbits survive their first year;
- of those, half survive their second year;
- the maximum life span is three years;
- rabbits produce 0, 6, and 8 baby rabbits in their first, second, and third years, respectively.

What is the *asymptotic* behavior of this system? What will the rabbit population look like in 100 years?
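One way to explore this question numerically is to encode the rules above in a matrix and repeatedly apply it. The encoding below is a sketch under our own assumptions (a Leslie-style population matrix; this is not necessarily how the chapter will set it up): the entries of the vector \(v_k\) count rabbits aged 0, 1, and 2 years in year \(k\), and \(v_{k+1} = Av_k\).

```python
# Hypothetical Leslie-style matrix for the rabbit rules above:
# row 0 counts babies produced; rows 1 and 2 encode the survival rates.
A = [
    [0.0, 6.0, 8.0],   # 1st-, 2nd-, 3rd-year rabbits produce 0, 6, 8 babies
    [0.5, 0.0, 0.0],   # half of newborns survive their first year
    [0.0, 0.5, 0.0],   # of those, half survive their second year
]

def step(v):
    """One year of the difference equation v_{k+1} = A v_k."""
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

v = [1.0, 0.0, 0.0]             # start with a single newborn rabbit
for _ in range(50):
    v = step(v)
    v = [x / sum(v) for x in v]  # normalize to age-class proportions

print([round(x * 21, 3) for x in v])   # -> [16.0, 4.0, 1.0]
```

Whatever the starting population, the age-class proportions converge to the ratio \(16:4:1\), and the total population eventually doubles every year. This is exactly the eigenvalue phenomenon: \((16, 4, 1)\) is an eigenvector of the matrix above with dominant eigenvalue \(\lambda = 2\).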

In Section 5.1, we will define eigenvalues and eigenvectors, and show how to compute the eigenvectors once an eigenvalue is known; in Section 5.2 we will learn how to compute the eigenvalues themselves. In Section 5.3 we introduce the notion of *similar* matrices, and demonstrate that similar matrices do indeed behave similarly. In Section 5.4 we study matrices that are similar to diagonal matrices, and in Section 5.5 we study matrices that are similar to rotation-scaling matrices, thus gaining a solid geometric understanding of large classes of matrices. Finally, we spend Section 5.6 presenting a common kind of application of eigenvalues and eigenvectors to real-world problems, including searching the Internet using Google’s PageRank algorithm.
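As a small preview of the similarity machinery, the sketch below (using a hypothetical \(2\times 2\) matrix, not one from the text) checks numerically that a matrix can be reassembled from its diagonalization \(A = PDP^{-1}\), where the columns of \(P\) are eigenvectors of \(A\) and \(D\) holds the corresponding eigenvalues:

```python
# Hypothetical example: A has eigenvalues 1 and 3 with eigenvectors
# (1, -1) and (1, 1), which become the columns of P.
A = [[2.0, 1.0], [1.0, 2.0]]
P = [[1.0, 1.0], [-1.0, 1.0]]
D = [[1.0, 0.0], [0.0, 3.0]]

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Inverse of P from the 2x2 cofactor formula.
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[P[1][1] / det, -P[0][1] / det],
        [-P[1][0] / det, P[0][0] / det]]

# Reassemble A from its diagonalization.
reconstructed = matmul(matmul(P, D), Pinv)
print(reconstructed)   # -> [[2.0, 1.0], [1.0, 2.0]]
```

The payoff of this factorization is that powers become cheap: \(A^k = PD^kP^{-1}\), and \(D^k\) is computed by raising the diagonal entries to the \(k\)-th power.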

- 5.1: Eigenvalues and Eigenvectors
- In this section, we define eigenvalues and eigenvectors. These form the most important facet of the structure theory of square matrices. As such, eigenvalues and eigenvectors tend to play a key role in the real-life applications of linear algebra.

- 5.2: The Characteristic Polynomial
- In Section 5.1 we discussed how to decide whether a given number λ is an eigenvalue of a matrix, and if so, how to find all of the associated eigenvectors. In this section, we will give a method for computing all of the eigenvalues of a matrix. This does not reduce to solving a system of linear equations: indeed, it requires solving a nonlinear equation in one variable, namely, finding the roots of the characteristic polynomial.

- 5.3: Similarity
- In this section, we study in detail the situation in which two matrices represent the same linear transformation with respect to different coordinate systems. In Section 5.4 and Section 5.5, we will show how to use eigenvalues and eigenvectors to find a simpler matrix that behaves like a given matrix.

- 5.4: Diagonalization
- Diagonal matrices are the easiest kind of matrices to understand: they just scale the coordinate directions by their diagonal entries. This section is devoted to the question: “When is a matrix similar to a diagonal matrix?” We will see that the algebra and geometry of such a matrix are relatively easy to understand.

- 5.5: Complex Eigenvalues
- An n×n matrix whose characteristic polynomial has n distinct real roots is diagonalizable: it is similar to a diagonal matrix, which is much simpler to analyze. The other possibility is that the characteristic polynomial has complex roots, and that is the focus of this section. It turns out that such a matrix is similar (in the 2×2 case) to a rotation-scaling matrix, which is also relatively easy to understand.

- 5.6: Stochastic Matrices
- This section is devoted to one common kind of application of eigenvalues: the study of difference equations, in particular Markov chains. We will introduce stochastic matrices, which encode this type of difference equation, and will cover in detail the most famous example of a stochastic matrix: the Google Matrix.