17.2: Review Problems
1. Let \(L\colon U \to V\) be a linear transformation. Suppose \(v \in L(U)\) and you have found a vector \(u_{ps}\) that obeys \(L(u_{ps}) = v\).
Explain why you need to compute \(\ker L\) to describe the solution set of the linear system \(L(u) = v\).
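As a numerical illustration (not a solution), the following Python/NumPy sketch uses a hypothetical matrix \(M\) standing in for \(L\): given one particular solution \(u_{ps}\) and a kernel vector \(k\), every vector of the form \(u_{ps} + c\,k\) also solves the system, which is why the kernel is needed to describe the full solution set.

```python
import numpy as np

# Hypothetical example: L is represented by this 2x3 matrix M.
M = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])
v = np.array([6.0, 2.0])

# One particular solution u_ps (found by inspection for this M).
u_ps = np.array([2.0, 2.0, 0.0])
assert np.allclose(M @ u_ps, v)

# A kernel vector: M k = 0.
k = np.array([-1.0, -1.0, 1.0])
assert np.allclose(M @ k, 0)

# Every vector u_ps + c*k is also a solution, so describing
# ker L is exactly what is needed to describe all solutions.
for c in (-2.0, 0.5, 3.0):
    assert np.allclose(M @ (u_ps + c * k), v)
```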
2. Suppose that \(M\) is an \(m \times n\) matrix with trivial kernel. Show that for any vectors \(u\) and \(v\) in \(\mathbb{R}^{n}\):
a) \(u^{T}M^{T}Mv = v^{T}M^{T}Mu\).
b) \(v^{T}M^{T}Mv \geq 0\). In case you are concerned (you don't need to be) and for future reference, the notation \(v \geq 0\) means each component \(v^{i} \geq 0\).
(\(\textit{Hint:}\) Think about the dot product in \(\mathbb{R}^{n}\).)
3. Rewrite the Gram-Schmidt algorithm in terms of projection matrices.
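One possible shape for such a rewrite (a sketch, not the unique answer): maintain a projection matrix \(P\) onto the span of the orthonormal vectors found so far, subtract \(Pu\) from each new vector, and update \(P\) by adding the outer product of the new unit vector with itself.

```python
import numpy as np

def gram_schmidt_projections(vectors):
    """Gram-Schmidt phrased with projection matrices: at each step,
    subtract P u, where P projects onto the span of the orthonormal
    vectors found so far (assumes the inputs are independent)."""
    n = len(vectors[0])
    P = np.zeros((n, n))           # projection onto span() = {0}
    basis = []
    for u in vectors:
        w = u - P @ u              # remove the part already spanned
        w = w / np.linalg.norm(w)  # normalize
        basis.append(w)
        P = P + np.outer(w, w)     # grow the projection matrix
    return basis

# Example: orthonormalize two vectors in R^3.
B = gram_schmidt_projections([np.array([1.0, 1.0, 0.0]),
                              np.array([1.0, 0.0, 1.0])])
```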
4. Show that if \(v_{1}, \cdots , v_{k}\) are linearly independent, then the matrix \(M = (v_{1} \cdots v_{k})\) is not necessarily invertible, but the matrix \(M^{T}M\) is invertible.
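A small numerical instance of the claim (illustration only): for two independent vectors in \(\mathbb{R}^{3}\), the matrix \(M\) is \(3 \times 2\) and so cannot be invertible, yet the \(2 \times 2\) matrix \(M^{T}M\) has nonzero determinant.

```python
import numpy as np

# Two linearly independent vectors in R^3.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
M = np.column_stack([v1, v2])   # 3x2: not square, so not invertible

MtM = M.T @ M                   # 2x2 Gram matrix
# Nonzero determinant means M^T M is invertible.
assert abs(np.linalg.det(MtM)) > 1e-12
```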
Contributor
David Cherney, Tom Denton, and Andrew Waldron (UC Davis)