
14.7: Review Problems


    1. Let \(D = \begin{pmatrix}\lambda_{1} & 0 \\ 0 & \lambda_{2}\end{pmatrix}\)

    a) Write \(D\) in terms of the vectors \(e_{1}\) and \(e_{2}\), and their transposes.

    b) Suppose \(P = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\) is invertible. Show that \(D\) is similar to

    \[M = \frac{1}{ad - bc}\begin{pmatrix}\lambda_{1}ad - \lambda_{2}bc & -(\lambda_{1} - \lambda_{2})ab \\ (\lambda_{1} - \lambda_{2})cd & -\lambda_{1}bc + \lambda_{2}ad\end{pmatrix}\]

    c) Suppose the vectors \((a,b)\) and \((c,d)\) are orthogonal. What can you say about \(M\) in this case? (\(\textit{Hint:}\) think about what \(M^{T}\) is equal to.)
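
    If a computer algebra system is available, the identity in part b) can be checked symbolically before proving it by hand. A minimal sketch, assuming SymPy and the convention \(M = PDP^{-1}\):

```python
import sympy as sp

# Symbols for the eigenvalues of D and the entries of P
l1, l2, a, b, c, d = sp.symbols('lambda_1 lambda_2 a b c d')

D = sp.Matrix([[l1, 0], [0, l2]])
P = sp.Matrix([[a, b], [c, d]])

# Conjugate D by P
M = P * D * P.inv()

# The matrix claimed in part b)
M_claimed = 1 / (a*d - b*c) * sp.Matrix([
    [l1*a*d - l2*b*c, -(l1 - l2)*a*b],
    [(l1 - l2)*c*d,   -l1*b*c + l2*a*d],
])

# The difference should simplify to the zero matrix
print(sp.simplify(M - M_claimed))
```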

    2. Suppose \(S = \{v_{1},\ldots,v_{n}\}\) is an \(\textit{orthogonal}\) (not orthonormal) basis for \(\mathbb{R}^{n}\). Then we can write any vector \(v\) as \(v = \sum_{i} c^{i}v_{i}\) for some constants \(c^{i}\). Find a formula for the constants \(c^{i}\) in terms of \(v\) and the vectors in \(S\).

    3. Let \(u, v\) be linearly independent vectors in \(\mathbb{R}^{3}\), and \(P = \mathrm{span}\{u, v\}\) be the plane spanned by \(u\) and \(v\).

    (a) Is the vector \(v^{\perp} := v - \frac{u \cdot v}{u \cdot u}u\) in the plane \(P\)?

    (b) What is the (cosine of the) angle between \(v^{\perp}\) and \(u\)?

    (c) How can you find a third vector perpendicular to both \(u\) and \(v^{\perp}\)?

    (d) Construct an orthonormal basis for \(\mathbb{R}^{3}\) from \(u\) and \(v\).

    (e) Test your abstract formulæ starting with \(u = (1,2,0)\) and \(v = (0,1,1)\).
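
    Part e) can also be checked numerically. A minimal sketch, assuming NumPy, with the vectors \(u = (1,2,0)\) and \(v = (0,1,1)\):

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])

# Gram-Schmidt step: remove the component of v along u
v_perp = v - (u @ v) / (u @ u) * u
print(np.isclose(u @ v_perp, 0.0))    # v_perp should be orthogonal to u

# A third vector perpendicular to both u and v_perp (cross product in R^3)
w = np.cross(u, v_perp)

# Normalizing all three gives a candidate orthonormal basis for R^3
basis = np.array([x / np.linalg.norm(x) for x in (u, v_perp, w)])
print(np.round(basis @ basis.T, 10))  # should be the 3x3 identity
```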

    4. Find an orthonormal basis for \(\mathbb{R}^{4}\) which includes \((1,1,1,1)\) using the following procedure:

    (a) Pick a vector perpendicular to the vector

    \[v_{1} = \begin{pmatrix}1\\1\\1\\1\end{pmatrix}\]

    from the solution set of the matrix equation

    \[v_{1}^{T}x = 0.\]

    Pick the vector \(v_{2}\) obtained from the standard Gaussian elimination procedure as the coefficient of the free variable \(x_{2}\) in the general solution.

    (b) Pick a vector perpendicular to both \(v_{1}\) and \(v_{2}\) from the solution set of the matrix equation

    \[\begin{pmatrix}v_{1}^{T} \\ v_{2}^{T}\end{pmatrix}x = 0.\]

    Pick the vector \(v_{3}\) obtained from the standard Gaussian elimination procedure as the coefficient of the free variable \(x_{3}\).

    (c) Pick a vector perpendicular to \(v_{1}, v_{2}\), and \(v_{3}\) from the solution set of the matrix equation

    \[\begin{pmatrix}v_{1}^{T} \\ v_{2}^{T} \\ v_{3}^{T}\end{pmatrix}x = 0.\]

    Pick the vector \(v_{4}\) obtained from the standard Gaussian elimination procedure as the coefficient of the free variable \(x_{4}\).

    (d) Normalize the four vectors obtained above.
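
    The steps above can be mirrored with a computer algebra system to check hand work. A minimal sketch, assuming SymPy, whose nullspace routine returns the coefficient vectors of the free variables from the row-reduced system (possibly scaled differently from a hand computation):

```python
import sympy as sp

v1 = sp.Matrix([1, 1, 1, 1])

# (a) Solve v1^T x = 0; the first nullspace vector is the coefficient of x_2
v2 = v1.T.nullspace()[0]

# (b) Solve the system with rows v1^T, v2^T; first vector corresponds to x_3
v3 = sp.Matrix.vstack(v1.T, v2.T).nullspace()[0]

# (c) Solve the system with rows v1^T, v2^T, v3^T
v4 = sp.Matrix.vstack(v1.T, v2.T, v3.T).nullspace()[0]

# (d) Normalize, then confirm the Gram matrix is the 4x4 identity
vecs = [v / v.norm() for v in (v1, v2, v3, v4)]
print(sp.Matrix([[u.dot(w) for w in vecs] for u in vecs]))
```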

    5. Use the inner product

    \[f \cdot g := \int_{0}^{1} f(x)g(x)dx\]

    on the vector space \(V = \mathrm{span}\{1, x, x^{2}, x^{3}\}\) to perform the Gram-Schmidt procedure on the set of vectors \(\{1, x, x^{2}, x^{3}\}\).
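
    The Gram-Schmidt loop with this integral inner product can be carried out symbolically to check a hand computation. A minimal sketch, assuming SymPy, which produces orthogonal (not yet normalized) polynomials:

```python
import sympy as sp

x = sp.symbols('x')

def ip(f, g):
    # The inner product f . g = integral of f(x) g(x) over [0, 1]
    return sp.integrate(f * g, (x, 0, 1))

ortho = []
for f in [sp.Integer(1), x, x**2, x**3]:
    # Subtract the projections onto the vectors found so far
    for e in ortho:
        f = f - ip(f, e) / ip(e, e) * e
    ortho.append(sp.expand(f))

print(ortho)  # divide each entry by sqrt(ip(f, f)) to normalize
```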

    6. Use the inner product on the vector space \(V = \mathrm{span}\{\sin(x), \sin(2x), \sin(3x)\}\) to perform the Gram-Schmidt procedure on the set of vectors \(\{\sin(x), \sin(2x), \sin(3x)\}\).

    What do you suspect about the vector space \(\mathrm{span}\{\sin(nx) \mid n \in \mathbb{N}\}\)?

    What do you suspect about the vector space \(\mathrm{span}\{\sin(ax) \mid a \in \mathbb{R}\}\)?
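
    A symbolic computation of the pairwise inner products shows how much work Gram-Schmidt actually has to do here. The sketch below assumes SymPy and takes the inner product to be the integral of the product over \([0, 2\pi]\), the interval used in Problem 14; a different interval would change the numbers.

```python
import sympy as sp

x = sp.symbols('x')

def ip(f, g):
    # Assumed inner product: integral of f(x) g(x) over one period [0, 2*pi]
    return sp.integrate(f * g, (x, 0, 2 * sp.pi))

fns = [sp.sin(x), sp.sin(2 * x), sp.sin(3 * x)]

# The Gram matrix: zero off-diagonal entries mean the set is already
# orthogonal, so Gram-Schmidt only rescales each vector
print(sp.Matrix([[ip(f, g) for g in fns] for f in fns]))
```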

    7.

    1. Show that if \(Q\) is an orthogonal \(n \times n\) matrix then $$u \cdot v = (Qu) \cdot (Qv),$$ for any \(u, v \in \mathbb{R}^{n}\). That is, \(Q\) preserves the inner product.
    2. Does \(Q\) preserve the outer product?
    3. If \(\{u_{1},\ldots,u_{n}\}\) is an orthonormal set and \(\{\lambda_{1},\ldots,\lambda_{n}\}\) is a set of numbers then what are the eigenvalues and eigenvectors of the matrix \(M = \sum^{n}_{i=1} \lambda_{i}u_{i}u^{T}_{i}\)?
    4. How does \(Q\) change this matrix? How do the eigenvectors and eigenvalues change?

    8. Carefully write out the Gram-Schmidt procedure for the set of vectors $$\begin{Bmatrix}\begin{pmatrix}1 \\1 \\1\end{pmatrix}, \begin{pmatrix}1 \\-1 \\1\end{pmatrix}, \begin{pmatrix}1 \\1 \\-1\end{pmatrix}\end{Bmatrix}.$$ Are you free to rescale the second vector obtained in the procedure to a vector with integer components?
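
    A symbolic check of the hand computation, assuming SymPy; exact rational arithmetic makes the rescaling question easy to inspect:

```python
import sympy as sp

vectors = [sp.Matrix([1, 1, 1]), sp.Matrix([1, -1, 1]), sp.Matrix([1, 1, -1])]

ortho = []
for v in vectors:
    # Standard Gram-Schmidt step: subtract projections onto earlier vectors
    for e in ortho:
        v = v - (e.dot(v) / e.dot(e)) * e
    ortho.append(v)

for v in ortho:
    print(v.T)  # exact rational components of the orthogonalized vectors
```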

    9.

    a) Suppose \(u\) and \(v\) are linearly independent. Show that \(u\) and \(v^{\perp}\) are also linearly independent. Explain why \(\{u, v^{\perp}\}\) is a basis for \(\mathrm{span}\{u,v\}\).

    b) Repeat part a), but with three linearly independent vectors \(u\), \(v\), \(w\).

    10. Find the \(QR\) factorization of $$M = \begin{pmatrix}1&0&2\\-1&2&0\\-1&-2&2\end{pmatrix}.$$
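
    The factorization can be sanity-checked numerically. A sketch assuming NumPy; note that np.linalg.qr uses Householder reflections, so its \(Q\) and \(R\) may differ from a Gram-Schmidt answer by signs, although the product still recovers \(M\):

```python
import numpy as np

M = np.array([[ 1.0,  0.0, 2.0],
              [-1.0,  2.0, 0.0],
              [-1.0, -2.0, 2.0]])

Q, R = np.linalg.qr(M)

print(np.allclose(Q @ R, M))             # Q R reproduces M
print(np.allclose(Q.T @ Q, np.eye(3)))   # Q is orthogonal
print(np.round(R, 10))                   # R is upper triangular
```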

    11. Given any three vectors \(u, v, w\), when do \(v^{\perp}\) or \(w^{\perp}\) of the Gram-Schmidt procedure vanish?

    12. For \(U\) a subspace of \(W\), use the subspace theorem to check that \(U^{\perp}\) is a subspace of \(W\).

    13. Let \(S_{n}\) and \(A_{n}\) denote the spaces of \(n \times n\) symmetric and anti-symmetric matrices respectively. These are subspaces of the vector space \(M^{n}_{n}\) of all \(n \times n\) matrices. What are \(\dim M^{n}_{n}\), \(\dim S_{n}\), and \(\dim A_{n}\)? Show that \(M^{n}_{n} = S_{n} + A_{n}\). Is \(A^{\perp}_{n} = S_{n}\)? Is \(M^{n}_{n} = S_{n} \oplus A_{n}\)?
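
    A small numerical illustration of the decomposition, assuming NumPy and the trace inner product \(\langle X, Y \rangle = \mathrm{tr}(X^{T}Y)\) on matrices (an assumption; the problem does not fix an inner product):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.integers(-5, 5, size=(3, 3)).astype(float)  # an arbitrary test matrix

S = (M + M.T) / 2   # symmetric part
A = (M - M.T) / 2   # anti-symmetric part

print(np.allclose(S + A, M))   # every matrix is a sum of the two parts
print(np.allclose(S, S.T), np.allclose(A, -A.T))

# With the trace inner product, the two parts are orthogonal to each other
print(np.isclose(np.trace(S.T @ A), 0.0))
```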

    14. The vector space \(V = \mathrm{span}\{\sin(t), \sin(2t), \sin(3t)\}\) has an inner product: $$f \cdot g := \int_{0}^{2\pi} f(t)g(t)dt.$$ Find the orthogonal complement to \(U = \mathrm{span}\{\sin(t) + \sin(2t)\}\) in \(V\). Express \(\sin(t) - \sin(2t)\) as the sum of vectors from \(U\) and \(U^{\perp}\).
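
    The integrals involved can be checked symbolically. A minimal sketch, assuming SymPy, that tests candidate vectors of \(V\) for orthogonality to \(\sin(t) + \sin(2t)\):

```python
import sympy as sp

t = sp.symbols('t')

def ip(f, g):
    # The inner product given in the problem
    return sp.integrate(f * g, (t, 0, 2 * sp.pi))

u = sp.sin(t) + sp.sin(2 * t)   # spans U

# A candidate lies in the orthogonal complement exactly when ip(u, f) = 0
for f in (sp.sin(t) - sp.sin(2 * t), sp.sin(3 * t)):
    print(f, ip(u, f))
```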

    Contributor


    This page titled 14.7: Review Problems is shared under a not declared license and was authored, remixed, and/or curated by David Cherney, Tom Denton, & Andrew Waldron.
