- https://math.libretexts.org/Bookshelves/Calculus/Calculus_3e_(Apex)/10%3A_Vectors/10.03%3A_The_Dot_Product
  The previous section introduced vectors and described how to add them together and how to multiply them by scalars. This section introduces a multiplication on vectors called the dot product.
- https://math.libretexts.org/Courses/Mission_College/MAT_04C_Linear_Algebra_(Kravets)/07%3A_Orthogonality/7.03%3A_Orthogonal_Projection
  Let W be a subspace of R^n and let x be a vector in R^n. In this section, we will learn to compute the closest vector x_W to x in W. The vector x_W is called the orthogonal projection of x onto W. This is exactly what we will use to almost solve matrix equations.
- https://math.libretexts.org/Under_Construction/Purgatory/Differential_Equations_and_Linear_Algebra_(Zook)/18%3A_Orthonormal_Bases_and_Complements/18.04%3A_Gram-Schmidt_and_Orthogonal_Complements
  \[
  \begin{aligned}
  v^{\perp} \cdot w^{\perp} &= v^{\perp} \cdot \left(w - \dfrac{u\cdot w}{u\cdot u}\,u - \dfrac{v^{\perp}\cdot w}{v^{\perp}\cdot v^{\perp}}\,v^{\perp} \right)\\
  &= v^{\perp}\cdot w - \dfrac{u \cdot w}{u \cdot u}\,v^{\perp} \cdot u - \dfrac{v^{\perp} \cdot w}{v^{\perp} \cdot v^{\perp}}\, v^{\perp} \cdot v^{\perp}
  \end{aligned}
  \]
- https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/06%3A_Orthogonality/6.03%3A_Orthogonal_Projection
  This page explains the orthogonal decomposition of vectors with respect to subspaces of R^n, detailing how to compute orthogonal projections using matrix representations. It includes methods for deriving projection matrices, with an emphasis on linear transformations and their properties. The text outlines the relationship between a subspace and its orthogonal complement, using examples to illustrate projection calculations and reflections across subspaces.
- https://math.libretexts.org/Bookshelves/Linear_Algebra/Map%3A_Linear_Algebra_(Waldron_Cherney_and_Denton)/14%3A_Orthonormal_Bases_and_Complements/14.04%3A_Gram-Schmidt_and_Orthogonal_Complements
  \[
  \begin{aligned}
  v^{\perp} \cdot w^{\perp} &= v^{\perp} \cdot \left(w - \dfrac{u\cdot w}{u\cdot u}\,u - \dfrac{v^{\perp}\cdot w}{v^{\perp}\cdot v^{\perp}}\,v^{\perp} \right)\\
  &= v^{\perp}\cdot w - \dfrac{u \cdot w}{u \cdot u}\,v^{\perp} \cdot u - \dfrac{v^{\perp} \cdot w}{v^{\perp} \cdot v^{\perp}}\, v^{\perp} \cdot v^{\perp}
  \end{aligned}
  \]
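The dot product mentioned in the first result can be illustrated with a minimal NumPy sketch (the vectors here are hypothetical example data, not taken from the linked page):

```python
import numpy as np

# The dot product of two vectors is the sum of componentwise products.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

dot = float(np.dot(u, v))  # 1*4 + 2*(-1) + 3*2 = 8
print(dot)
```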
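The orthogonal-projection results above compute the closest vector x_W to x in a subspace W. A minimal sketch, assuming W is the column space of a matrix A with linearly independent columns, so the projection matrix is P = A (AᵀA)⁻¹ Aᵀ (the matrix A and vector x here are hypothetical example data):

```python
import numpy as np

# W = Col(A); projection matrix P = A (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

x = np.array([1.0, 2.0, 4.0])
x_W = P @ x          # orthogonal projection: closest vector to x in W
x_perp = x - x_W     # component of x in the orthogonal complement of W

# x_perp is orthogonal to every column of A, so A^T x_perp = 0.
print(A.T @ x_perp)
```

The decomposition x = x_W + x_perp is exactly the orthogonal decomposition the linked pages describe.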
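The Gram-Schmidt derivation quoted in the snippets defines v⊥ and w⊥ by subtracting projections and then verifies that the resulting vectors are pairwise orthogonal. That computation can be sketched as follows (the vectors u, v, w are hypothetical example data):

```python
import numpy as np

def proj(a, b):
    """Projection of b onto a: (a.b / a.a) a."""
    return (np.dot(a, b) / np.dot(a, a)) * a

# Gram-Schmidt on three linearly independent vectors u, v, w.
u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, 0.0, 1.0])
w = np.array([0.0, 1.0, 2.0])

# v_perp = v - (u.v / u.u) u
v_perp = v - proj(u, v)
# w_perp = w - (u.w / u.u) u - (v_perp.w / v_perp.v_perp) v_perp
w_perp = w - proj(u, w) - proj(v_perp, w)

# As the quoted derivation shows, all three dot products vanish.
print(np.dot(u, v_perp), np.dot(u, w_perp), np.dot(v_perp, w_perp))
```

Expanding v⊥·w⊥ as in the snippet, the first subtracted term vanishes because v⊥·u = 0, and the remaining two terms cancel, which is why the output above is (numerically) zero in all three slots.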