4: Rⁿ
- 4.1: Review of Vectors
- This page covers the foundational concepts of vectors in Rⁿ, including position vectors, vector operations (addition, subtraction, scalar multiplication), and geometric interpretations. It explains distance between points using the Pythagorean theorem, introduces unit vectors and their calculation, and discusses linear combinations of vectors.
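As a brief illustration of the formulas reviewed there (a standard example, not part of the section itself): the distance between points \(P=(p_1,\dots,p_n)\) and \(Q=(q_1,\dots,q_n)\), and the unit vector in the direction of a nonzero \(\vec{v}\), are

\[ d(P,Q) = \sqrt{(q_1-p_1)^2 + \cdots + (q_n-p_n)^2}, \qquad \vec{u} = \frac{1}{\|\vec{v}\|}\,\vec{v}. \]

For instance, \(\vec{v}=(1,2,2)\) has length \(\sqrt{1+4+4}=3\), so \(\vec{u}=\tfrac{1}{3}(1,2,2)\) is the unit vector pointing the same way.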
- 4.2: Dot and Cross Product
- There are two ways of multiplying vectors which are of great importance in applications. The first of these is called the dot product. When we take the dot product of vectors, the result is a scalar. For this reason, the dot product is also called the scalar product and sometimes the inner product.
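For example, in \(\mathbb{R}^3\) the dot product is the sum of the componentwise products,

\[ \vec{u}\cdot\vec{v} = u_1 v_1 + u_2 v_2 + u_3 v_3, \]

so \((1,2,3)\cdot(4,-1,2) = 4 - 2 + 6 = 8\), a scalar. The cross product \(\vec{u}\times\vec{v}\), the second kind of multiplication covered in this section, instead produces a vector orthogonal to both \(\vec{u}\) and \(\vec{v}\).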
- 4.3: Lines and Planes
- We can use the concept of vectors and points to find equations for arbitrary lines in Rⁿ, although in this section the focus will be on lines in R³.
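One standard way to write such a line (a generic sketch of the vector form developed in the section): a line through the point with position vector \(\vec{p}_0\) in the direction \(\vec{d}\) is

\[ \vec{r}(t) = \vec{p}_0 + t\,\vec{d}, \qquad t\in\mathbb{R}. \]

For example, the line through \((1,0,2)\) with direction \((3,1,-1)\) consists of the points \((1+3t,\; t,\; 2-t)\).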
- 4.4: Spanning Sets in Rⁿ
- By generating all linear combinations of a set of vectors, one can obtain various subsets of Rⁿ which we call subspaces. For example, what set of vectors in R³ generates the XY-plane? What is the smallest such set of vectors you can find? The tools of spanning, linear independence, and basis are exactly what is needed to answer these and similar questions, and they are the focus of this section.
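To preview the kind of answer developed there: the XY-plane is the span of the two standard basis vectors \(\vec{e}_1=(1,0,0)\) and \(\vec{e}_2=(0,1,0)\),

\[ \mathrm{span}\{\vec{e}_1,\vec{e}_2\} = \{a\,\vec{e}_1 + b\,\vec{e}_2 : a,b\in\mathbb{R}\} = \{(a,b,0) : a,b\in\mathbb{R}\}, \]

and since a single vector spans only a line, no smaller set suffices.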
- 4.5: Linear Independence
- This section discusses linear dependence and independence of vectors.
- 4.6: Subspaces and Bases
- The goal of this section is to develop an understanding of a subspace of Rⁿ.
- 4.7: Row, Column and Null Spaces
- This section discusses the Row, Column, and Null Spaces of a matrix, focusing on their definitions, properties, and computational methods.
- 4.8: Orthogonal Vectors and Matrices
- In this section, we examine what it means for vectors (and sets of vectors) to be orthogonal and orthonormal. First, it is necessary to review some important concepts. You may recall the definitions of the span of a set of vectors and of a linearly independent set of vectors.
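In terms of the dot product, a set \(\{\vec{u}_1,\dots,\vec{u}_k\}\) is orthogonal when distinct vectors have zero dot product, and orthonormal when each vector also has unit length:

\[ \vec{u}_i\cdot\vec{u}_j = 0 \;\;(i\neq j), \qquad \|\vec{u}_i\| = 1. \]

For instance, \(\tfrac{1}{\sqrt{2}}(1,1)\) and \(\tfrac{1}{\sqrt{2}}(1,-1)\) form an orthonormal set in \(\mathbb{R}^2\).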
- 4.9: Gram-Schmidt Process
- The Gram-Schmidt process is an algorithm that transforms a set of vectors into an orthonormal set spanning the same subspace, that is, one generating the same collection of linear combinations.
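In its standard form, the process takes vectors \(\vec{v}_1,\dots,\vec{v}_k\) and produces orthogonal vectors

\[ \vec{u}_1 = \vec{v}_1, \qquad \vec{u}_m = \vec{v}_m - \sum_{j=1}^{m-1} \frac{\vec{v}_m\cdot\vec{u}_j}{\vec{u}_j\cdot\vec{u}_j}\,\vec{u}_j \quad (m = 2,\dots,k), \]

which are then normalized to unit length to give an orthonormal set with the same span.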
- 4.10: Orthogonal Projections
- An important use of the Gram-Schmidt Process is in orthogonal projections, the focus of this section.
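For reference, with an orthogonal basis \(\{\vec{u}_1,\dots,\vec{u}_k\}\) of a subspace \(W\) (such as one produced by the Gram-Schmidt process), the orthogonal projection of \(\vec{y}\) onto \(W\) is

\[ \mathrm{proj}_W(\vec{y}) = \sum_{j=1}^{k} \frac{\vec{y}\cdot\vec{u}_j}{\vec{u}_j\cdot\vec{u}_j}\,\vec{u}_j, \]

the vector in \(W\) closest to \(\vec{y}\).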
- 4.11: Least Squares Approximation
- In this section, we discuss a very important technique derived from orthogonal projections: the least squares approximation.
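As a preview of the standard result: a least squares solution \(\hat{\vec{x}}\) of a (possibly inconsistent) system \(A\vec{x}=\vec{b}\) satisfies the normal equations

\[ A^{T}A\,\hat{\vec{x}} = A^{T}\vec{b}, \]

which says exactly that \(A\hat{\vec{x}}\) is the orthogonal projection of \(\vec{b}\) onto the column space of \(A\).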
Thumbnail: Animation showing how the vector cross product (green) varies as the angle between the blue and red vectors changes. (Public Domain; Nicostella via Wikipedia)