7: Orthogonality

  • 7.1: Dot Products and Orthogonality
    In this chapter, it will be necessary to find the closest point on a subspace to a given point. The closest point has the property that the difference between the two points is orthogonal, or perpendicular, to the subspace. For this reason, we need to develop notions of orthogonality, length, and distance. (See the first sketch after this list.)
  • 7.2: Orthogonal Complements
    It will be important to compute the set of all vectors that are orthogonal to a given set of vectors. It turns out that a vector is orthogonal to a set of vectors if and only if it is orthogonal to the span of those vectors, which is a subspace, so we restrict ourselves to the case of subspaces. (See the second sketch after this list.)
  • 7.3: Orthogonal Projection
    Let \(W\) be a subspace of \(\mathbb{R}^n\) and let \(x\) be a vector in \(\mathbb{R}^n\). In this section, we will learn to compute the closest vector \(x_W\) to \(x\) in \(W\). The vector \(x_W\) is called the orthogonal projection of \(x\) onto \(W\). This is exactly what we will use to almost solve matrix equations. (See the third sketch after this list.)
  • 7.4: Orthogonal Sets and Gram-Schmidt Process
  • 7.5: The Method of Least Squares
  • 7.6: Orthogonal Diagonalization
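
The following is a minimal NumPy sketch of the basic computations behind Section 7.1: dot products, lengths, distances, and an orthogonality test. The vectors u and v are made-up example data, and the floating-point tolerance comparison is an implementation convenience, not something from the text.

    import numpy as np

    # Two example vectors in R^3 (values chosen so that u . v = 0).
    u = np.array([1.0, 2.0, 2.0])
    v = np.array([2.0, 1.0, -2.0])

    dot = u @ v                       # dot product u . v
    length_u = np.linalg.norm(u)      # length ||u|| = sqrt(u . u)
    distance = np.linalg.norm(u - v)  # distance between u and v

    # u and v are orthogonal exactly when u . v = 0; with floating-point
    # arithmetic we compare against a small tolerance instead.
    print(dot, length_u, distance, np.isclose(dot, 0.0))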
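
For Section 7.2, here is one way to compute an orthogonal complement, sketched in NumPy: a vector is orthogonal to the span of \(v_1, \dots, v_m\) exactly when it lies in the null space of the matrix whose rows are the \(v_i\), and an orthonormal basis for that null space can be read off from the singular value decomposition. The helper name orthogonal_complement and the tolerance are illustrative choices, and the SVD is just one numerically convenient way to get a null-space basis.

    import numpy as np

    def orthogonal_complement(vectors, tol=1e-10):
        """Orthonormal basis (as rows) for the orthogonal complement
        of the span of the given vectors in R^n."""
        A = np.atleast_2d(np.array(vectors, dtype=float))  # rows span W
        # x is orthogonal to every row of A exactly when A x = 0, so the
        # complement is the null space of A; the trailing rows of V^T
        # (past the rank) form an orthonormal basis for it.
        _, s, vt = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return vt[rank:]

    # The complement of a line in R^3 is a plane: two basis vectors.
    basis = orthogonal_complement([[1.0, 1.0, 0.0]])
    print(basis @ np.array([1.0, 1.0, 0.0]))  # both dot products are ~0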
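
For Section 7.3, a sketch of orthogonal projection in NumPy, assuming the columns of A form a basis for the subspace W. Solving the normal equations \(A^TA\,c = A^Tx\) is one standard way to get the coordinates of the projection; the helper name project_onto and the example data are made up for illustration.

    import numpy as np

    def project_onto(A, x):
        """Orthogonal projection of x onto W = Col(A), assuming the
        columns of A are linearly independent."""
        # Solve the normal equations A^T A c = A^T x for the coordinates c;
        # then x_W = A c is the closest vector to x in W, and x - x_W is
        # orthogonal to W.
        c = np.linalg.solve(A.T @ A, A.T @ x)
        return A @ c

    # Project onto the xy-plane in R^3, spanned by e1 and e2.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    x = np.array([3.0, 4.0, 5.0])
    x_W = project_onto(A, x)
    print(x_W)       # [3. 4. 0.]
    print(x - x_W)   # [0. 0. 5.], orthogonal to the plane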


This page titled 7: Orthogonality is shared under the GNU Free Documentation License 1.3 and was authored, remixed, and/or curated by Dan Margalit & Joseph Rabinoff via source content that was edited to the style and standards of the LibreTexts platform.
