3: Eigenvalues and Eigenvectors
In this chapter, we will learn to apply eigenvalues and eigenvectors to real-world problems, including searching the Internet using Google’s PageRank algorithm.
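To make the PageRank connection concrete, here is a minimal sketch of power iteration on a small link matrix. The 4-page web, its link matrix, and the damping factor 0.85 are illustrative assumptions, not data from the text; the point is that the ranking vector is an eigenvector with eigenvalue 1.

```python
import numpy as np

# Hypothetical 4-page web: entry (i, j) is the chance of following a link
# from page j to page i, so each column sums to 1 (column-stochastic).
A = np.array([
    [0.0, 0.5, 0.0, 0.0],
    [1/3, 0.0, 0.0, 0.5],
    [1/3, 0.0, 0.0, 0.5],
    [1/3, 0.5, 1.0, 0.0],
])

d = 0.85                                    # standard damping factor
n = A.shape[0]
G = d * A + (1 - d) / n * np.ones((n, n))   # the "Google matrix"

# Power iteration: repeated multiplication by G converges to the
# eigenvector of G with eigenvalue 1, whose entries rank the pages.
v = np.ones(n) / n
for _ in range(100):
    v = G @ v
print(v)  # importance scores; the largest entry is the top-ranked page
```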
- 3.1: Eigenvalues and Eigenvectors
- In this section, we define eigenvalues and eigenvectors. These form the most important facet of the structure theory of square matrices. As such, eigenvalues and eigenvectors tend to play a key role in the real-life applications of linear algebra.
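As a quick numerical illustration of the definition (the 2×2 matrix below is an arbitrary example, not one from the text), an eigenvector v of A with eigenvalue λ satisfies Av = λv:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy returns the eigenvalues and a matrix whose columns are eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                        # 3.0 and 1.0 for this matrix

# Check the defining equation A v = lambda v for the first eigenpair.
lam, v = eigvals[0], eigvecs[:, 0]
print(np.allclose(A @ v, lam * v))    # True
```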
- 3.3: Geometry of Eigenvalues
- An n×n matrix whose characteristic polynomial has n distinct real roots is diagonalizable: it is similar to a diagonal matrix, which is much simpler to analyze. The other possibility is that the characteristic polynomial has complex (non-real) roots, and that is the focus of this section. It turns out that such a matrix is similar (in the 2×2 case) to a rotation-scaling matrix, which is also relatively easy to understand.
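The following sketch illustrates the 2×2 case with a made-up matrix: its eigenvalues are a ± bi, and the rotation-scaling matrix [[a, −b], [b, a]] with the same eigenvalues rotates by the argument of the eigenvalue and scales by its modulus.

```python
import numpy as np

# A real 2x2 matrix with no real eigenvalues (illustrative choice):
# its characteristic polynomial is l^2 - 4l + 5, with roots 2 ± i.
A = np.array([[1.0, -2.0],
              [1.0,  3.0]])

lam = np.linalg.eigvals(A)[0]   # one of the conjugate pair a ± bi
a, b = lam.real, lam.imag

# The rotation-scaling matrix with the same eigenvalues: it rotates by
# the argument of the eigenvalue and scales by its modulus.
C = np.array([[a, -b],
              [b,  a]])
print(np.abs(lam), np.angle(lam))   # scaling factor and rotation angle
print(np.linalg.eigvals(C))         # also 2 ± i, so A and C are similar
```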
- 3.4: Dot Products and Orthogonality
- In this chapter, it will be necessary to find the closest point on a subspace to a given point. The closest point has the property that the difference between the two points is orthogonal, or perpendicular, to the subspace. For this reason, we need to develop notions of orthogonality, length, and distance.
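For a first taste (the line and point below are illustrative choices), the closest point on a line W to a point x is characterized by the difference being orthogonal to W, i.e., having zero dot product with the direction of W:

```python
import numpy as np

w = np.array([1.0, 2.0])   # direction of the line W = span{w} (made up)
x = np.array([3.0, 1.0])   # a point not on W (made up)

# Closest point on W to x; this projection formula is developed later.
x_W = (x @ w) / (w @ w) * w

print(np.dot(x - x_W, w))        # 0.0: the difference is orthogonal to W
print(np.linalg.norm(x - x_W))   # the distance from x to W
```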
- 3.5: Orthogonal Projection and Least Squares
- Let W be a subspace of ℝⁿ and let x be a vector in ℝⁿ. In this section, we will learn to compute the closest vector x_W to x in W. The vector x_W is called the orthogonal projection of x onto W. This is exactly what we will use to "almost solve" matrix equations that have no exact solution.
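As a sketch of the payoff (the overdetermined system below is invented for illustration): when Ax = b has no solution, solving the normal equations AᵀAx̂ = Aᵀb gives the least-squares solution, and Ax̂ is exactly the orthogonal projection of b onto the column space of A.

```python
import numpy as np

# An inconsistent system: three equations, two unknowns (made-up data).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Least-squares solution via the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_hat is the orthogonal projection of b onto W = Col(A).
b_W = A @ x_hat
print(b_W)               # the closest vector to b in Col(A)
print(A.T @ (b - b_W))   # ~[0, 0]: the residual is orthogonal to Col(A)
```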