Mathematics LibreTexts

About 41 results
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/05%3A_Linear_algebra_and_computing
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/04%3A_Eigenvalues_and_eigenvectors
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/06%3A_Orthogonality_and_Least_Squares/6.02%3A_Orthogonal_complements_and_the_matrix_tranpose
    This section introduces the notion of an orthogonal complement, the set of vectors each of which is orthogonal to a prescribed subspace. We'll also find a way to describe dot products using matrix products, which allows us to study orthogonality using many of the tools for understanding linear systems that we developed earlier.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/02%3A_Vectors_matrices_and_linear_combinations/2.01%3A_Vectors_and_linear_combinations
  • https://math.libretexts.org/Workbench/Understanding_Linear_Algebra_2e_(Austin)/01%3A_Systems_of_equations/1.03%3A_Computation_with_Sage
    This is a very rough measure of the effort required to find the reduced row echelon form; a more careful accounting shows that the number of arithmetic operations is roughly \(\frac23 n^3\). As we have seen, some matrices require more effort than others, but the upshot of this observation is that the effort is proportional to \(n^3\). We can think of this in the following way: If the size of the matrix grows by a factor of 10, then the effort required grows by a factor of \(10^3 = 1000\).
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/02%3A_Vectors_matrices_and_linear_combinations
    We began our study of linear systems in Chapter 1 where we described linear systems in terms of augmented matrices, such as \(\left[\begin{array}{rrr|r} 1 & 2 & -1 & 3 \\ -3 & 3 & -1 & 2 \\ 2 & 3 & 2 & 1 \end{array}\right]\). In this chapter, we will uncover geometric information in a matrix like this, which will lead to an intuitive understanding of the insights we previously gained into the solutions of linear systems.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/01%3A_Systems_of_equations/1.03%3A_Computation_with_Sage
    Notice that there are three separate things (we call them arguments) inside the parentheses: the number of rows, the number of columns, and the entries of the matrix listed by row inside square brackets. We can think of this in the following way: If the size of the matrix grows by a factor of 10, then the effort required grows by a factor of \(10^3 = 1000\).
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/06%3A_Orthogonality_and_Least_Squares/6.03%3A_Orthogonal_bases_and_projections
    Remember that an invertible matrix must be a square matrix, and the matrix \(Q\) will only be square if \(n = m\). In this case, there are \(m\) vectors in the orthonormal set so the subspace \(W\) spanned by the vectors \(\mathbf u_1, \mathbf u_2, \ldots, \mathbf u_m\) is \(\mathbb R^m\). If \(\mathbf b\) is a vector in \(\mathbb R^m\), then \(\hat{\mathbf b} = QQ^T\mathbf b\) is the orthogonal projection of \(\mathbf b\) onto \(\mathbb R^m\). In other words, \(QQ^T\)…
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)
    This book arises from my belief that linear algebra, as presented in a traditional undergraduate curriculum, has for too long lived in the shadow of calculus. Many mathematics programs currently require their students to complete at least three semesters of calculus, but only one semester of linear algebra, which often has two semesters of calculus as a prerequisite.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/07%3A_The_Spectral_Theorem_and_singular_value_decompositions/7.01%3A_Symmetric_matrices_and_variance
    Notice that the matrix \(A\) has eigenvectors \(\mathbf v_1\) and \(\mathbf v_2\) that not only form a basis for \(\mathbb R^2\) but, in fact, form an orthogonal basis for \(\mathbb R^2\). Given the prominent role played by orthogonal bases in the last chapter, we would like to understand what conditions on a matrix enable us to form an orthogonal basis of eigenvectors.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Understanding_Linear_Algebra_(Austin)/05%3A_Linear_algebra_and_computing/5.01%3A_Gaussian_elimination_revisited
    In this section, we revisit Gaussian elimination and explore some problems with implementing it in the straightforward way that we described back in Section 1.2. First, we will see how the fact that computers only approximate arithmetic operations can lead us to find solutions that are far from the actual solutions. Second, we will explore how much work is required to implement Gaussian elimination and devise a more efficient means of implementing it.
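The orthogonality snippet above notes that a dot product can be written as a matrix product. A minimal sketch of that identity, \(\mathbf u \cdot \mathbf v = \mathbf u^T \mathbf v\), assuming NumPy and using invented vectors (not the book's examples):

```python
import numpy as np

# Illustrative (made-up) vectors; they happen to be orthogonal.
u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 3.0])

dot = np.dot(u, v)                             # ordinary dot product
as_matrix = u.reshape(1, 3) @ v.reshape(3, 1)  # 1x3 times 3x1 matrix product

print(dot)              # 0.0 -> u and v are orthogonal
print(as_matrix[0, 0])  # the same value, obtained as a matrix product
```

Writing the dot product as \(\mathbf u^T\mathbf v\) is what lets orthogonality questions be rephrased as questions about null spaces of \(A^T\), as the snippet suggests.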
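The \(\frac23 n^3\) operation count quoted above implies the "factor of 10 in size, factor of 1000 in effort" claim directly. A quick arithmetic check of that scaling (the function below is a sketch of the count, not the book's code):

```python
# Approximate arithmetic-operation count for row reducing an n x n matrix,
# per the (2/3) n^3 estimate quoted in the snippet.
def approx_ops(n):
    return (2.0 / 3.0) * n**3

# Growing n by a factor of 10 grows the work by about 10^3.
ratio = approx_ops(1000) / approx_ops(100)
print(ratio)  # about 1000
```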
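The projection snippet states that \(QQ^T\mathbf b\) projects \(\mathbf b\) onto the subspace spanned by \(Q\)'s orthonormal columns. A sketch with NumPy, using an invented matrix and vector; `numpy.linalg.qr` supplies the orthonormal columns:

```python
import numpy as np

# Made-up matrix whose column space is a 2-dimensional subspace W of R^3.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)   # columns of Q: orthonormal basis for W = col(A)

b = np.array([1.0, 2.0, 3.0])
b_hat = Q @ Q.T @ b      # orthogonal projection of b onto W

# The residual b - b_hat should be orthogonal to every column of A.
print(np.allclose(A.T @ (b - b_hat), 0.0))  # True
```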
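The symmetric-matrix snippet asks when a matrix admits an orthogonal basis of eigenvectors; the Spectral Theorem answers that a real symmetric matrix always does. A sketch with a made-up symmetric matrix (not the book's \(A\)), using `numpy.linalg.eigh`, NumPy's eigensolver for symmetric matrices:

```python
import numpy as np

# An illustrative symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eigh(A)  # eigh assumes (and exploits) symmetry

# The columns of vecs are eigenvectors and form an orthonormal basis of R^2.
print(np.allclose(vecs.T @ vecs, np.eye(2)))  # True
```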
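The last snippet's point, that approximate computer arithmetic can drive naive Gaussian elimination far from the true solution, can be sketched with a standard tiny-pivot example (the numbers below are a common textbook illustration, not taken from this book):

```python
import numpy as np

eps = 1e-20
A = np.array([[eps, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0])   # true solution is approximately x0 = 1, x1 = 1

# Naive elimination: use the tiny entry eps as the pivot.
m = A[1, 0] / A[0, 0]      # huge multiplier, about 1e20
A2, b2 = A.copy(), b.copy()
A2[1] -= m * A2[0]
b2[1] -= m * b2[0]         # the "2" is swamped and lost to rounding here
x1 = b2[1] / A2[1, 1]
x0 = (b2[0] - A2[0, 1] * x1) / A2[0, 0]
print(x0, x1)              # x0 comes out far from the true value of about 1

# Row swapping (partial pivoting), as LAPACK does inside solve, fixes this.
x = np.linalg.solve(A, b)
print(x)                   # close to [1, 1]
```

This is one reason practical implementations of Gaussian elimination always pivot.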
