Mathematics LibreTexts

About 7 results
  • https://math.libretexts.org/Courses/De_Anza_College/Linear_Algebra%3A_A_First_Course/04%3A_R/4.08%3A_Orthogonal_Vectors_and_Matrices/4.8.E%3A_Exercise_for_Section_4.8
    This page outlines exercises on determining orthogonality and orthonormality of vectors, classifying matrices (symmetric, skew-symmetric, orthogonal), and the properties of orthogonal matrices, such as preserving vector lengths (a short worked check of this property appears after this results list).
  • https://math.libretexts.org/Courses/De_Anza_College/Linear_Algebra%3A_A_First_Course/04%3A_R/4.09%3A_Gram-Schmidt_Process/4.9.E%3A_Exercises_for_Section_4.9
    This page outlines exercises utilizing the Gram-Schmidt process to derive orthonormal bases from various vector sets in \( \mathbb{R}^2 \), \( \mathbb{R}^3 \), and \( \mathbb{R}^4 \). Key exercises include finding bases for pairs and spans of vectors, addressing restrictions, identifying bases for subspaces, and applying the process to different vector sets. Comprehensive solutions accompany each exercise.
  • https://math.libretexts.org/Courses/De_Anza_College/Linear_Algebra%3A_A_First_Course/06%3A_Spectral_Theory/6.07%3A_Orthogonal_Diagonalization
    In this section we look at matrices that have an orthonormal set of eigenvectors.
  • https://math.libretexts.org/Bookshelves/Differential_Equations/A_Second_Course_in_Ordinary_Differential_Equations%3A_Dynamical_Systems_and_Boundary_Value_Problems_(Herman)/05%3A_Fourier_Series/5.02%3A_Fourier_Trigonometric_Series
    \[\dfrac{a_{0}}{2} \int_{0}^{2 \pi} \cos m x\, d x+\sum_{n=1}^{\infty}\left[a_{n} \int_{0}^{2 \pi} \cos n x \cos m x\, d x+b_{n} \int_{0}^{2 \pi} \sin n x \cos m x\, d x\right] \]
    \[\int_{0}^{2 \pi} \cos n x \cos m x\, d x=\dfrac{1}{2} \int_{0}^{2 \pi}[\cos (m+n) x+\cos (m-n) x]\, d x \]
    \[\int_{0}^{2 \pi} \sin m x \cos m x\, d x=\dfrac{1}{2} \int_{0}^{2 \pi} \sin 2 m x\, d x=\dfrac{1}{2}\left[\dfrac{-\cos 2 m x}{2 m}\right]_{0}^{2 \pi}=0 \]
  • https://math.libretexts.org/Courses/De_Anza_College/Linear_Algebra%3A_A_First_Course/04%3A_R/4.09%3A_Gram-Schmidt_Process
    The Gram-Schmidt process is an algorithm that transforms a set of vectors into an orthonormal set spanning the same subspace, that is, one generating the same collection of linear combinations (a minimal code sketch of the process appears after this results list).
  • https://math.libretexts.org/Courses/De_Anza_College/Linear_Algebra%3A_A_First_Course/04%3A_R/4.08%3A_Orthogonal_Vectors_and_Matrices
    In this section, we examine what it means for vectors (and sets of vectors) to be orthogonal and orthonormal. First, it is necessary to review some important concepts. You may recall the definitions for the span of a set of vectors and a linearly independent set of vectors.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/06%3A_Orthogonality/6.04%3A_The_Method_of_Least_Squares
    This page covers orthogonal projections in vector spaces, detailing the advantages of orthogonal sets and defining the simpler Projection Formula applicable with orthogonal bases. It includes examples of projecting vectors onto subspaces, emphasizes the importance of orthogonal bases, and introduces the Gram-Schmidt process for generating orthogonal bases from sets of vectors.
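
As a quick check of the length-preservation property mentioned in the first result above (a standard fact, included here as a supplement rather than taken from the linked exercises): if \( Q \) is orthogonal, so that \( Q^{T}Q = I \), then for any vector \( \vec{x} \),

\[\|Q\vec{x}\|^{2}=(Q\vec{x})^{T}(Q\vec{x})=\vec{x}^{T}Q^{T}Q\vec{x}=\vec{x}^{T}\vec{x}=\|\vec{x}\|^{2},\]

so \( \|Q\vec{x}\|=\|\vec{x}\| \).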

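The Gram-Schmidt description in the results above can be made concrete with a short sketch. The following is a minimal, hypothetical NumPy implementation (not taken from any of the linked pages), assuming the input vectors are supplied as a list of 1-D arrays:

import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal set spanning the same subspace as `vectors`."""
    basis = []
    for v in vectors:
        # Remove the components of v along the vectors already in the basis.
        w = v - sum(np.dot(v, q) * q for q in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-12:  # skip vectors that are (numerically) linearly dependent
            basis.append(w / norm)
    return np.array(basis)

# Example in R^3: two independent vectors give two orthonormal rows
u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, 0.0, 1.0])
Q = gram_schmidt([u, v])
print(np.round(Q @ Q.T, 10))  # prints the 2x2 identity, confirming orthonormality

In practice, numerical libraries typically obtain an orthonormal basis more stably via a QR factorization (for example, np.linalg.qr).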