# 17.2: Review Problems



1. Let \(L:U \to V\) be a linear transformation. Suppose \(v \in L(U)\) and you have found a vector \(u_{ps}\) that obeys \(L(u_{ps}) = v\).

Explain why you need to compute \(\ker L\) to describe the solution set of the linear system \(L(u) = v\).

2. Suppose that \(M\) is an \(m \times n\) matrix with trivial kernel. Show that for any vectors \(u\) and \(v\) in \(\mathbb{R}^{n}\):

a) \(u^{T}M^{T}Mv = v^{T}M^{T}Mu\).

b) \(v^{T}M^{T}Mv \geq 0\). In case you are concerned (you don't need to be) and for future reference, the notation \(v \geq 0\) means each component \(v^{i} \geq 0\).

(\(\textit{Hint:}\) Think about the dot product in \(\mathbb{R}^{n}\).)
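Both parts of problem 2 can be sanity-checked numerically before proving them. The sketch below uses a sample \(3 \times 2\) matrix \(M\) and sample vectors \(u, v\) (all chosen here purely for illustration) and follows the hint: \(u^{T}M^{T}Mv\) is just the dot product \((Mu) \cdot (Mv)\).

```python
# Numerical sanity check of problem 2 (a) and (b) with plain Python lists.
# The matrix M and the vectors u, v below are illustrative assumptions.

def mat_vec(M, v):
    """Multiply a matrix M (given as a list of rows) by a vector v."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

def dot(u, v):
    """Dot product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

M = [[1.0, 2.0],
     [0.0, 1.0],
     [3.0, -1.0]]  # a 3x2 matrix with linearly independent columns
u = [1.0, -2.0]
v = [4.0, 0.5]

# (a) u^T M^T M v equals (Mu) . (Mv), and the dot product is symmetric.
lhs = dot(mat_vec(M, u), mat_vec(M, v))   # u^T M^T M v
rhs = dot(mat_vec(M, v), mat_vec(M, u))   # v^T M^T M u

# (b) v^T M^T M v = (Mv) . (Mv) = |Mv|^2, a sum of squares, so it is >= 0.
quad = dot(mat_vec(M, v), mat_vec(M, v))

print(lhs, rhs, quad)
```

The check also shows why the proof is short: once the expression is rewritten as a dot product in \(\mathbb{R}^{m}\), symmetry gives (a) and the sum-of-squares form gives (b).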

3. Rewrite the Gram-Schmidt algorithm in terms of projection matrices.
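One possible shape such an answer could take (a sketch, not the required solution): maintain a single matrix \(P\) that projects onto the orthogonal complement of the vectors produced so far, starting from the identity and subtracting the outer-product projector \(\frac{w w^{T}}{w \cdot w}\) after each step. The input vectors below are illustrative.

```python
# A sketch of Gram-Schmidt phrased with projection matrices, in plain Python.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mat_vec(M, v):
    return [dot(row, v) for row in M]

def mat_sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def gram_schmidt(vectors):
    n = len(vectors[0])
    P = identity(n)            # projector onto the complement of span so far
    basis = []
    for v in vectors:
        w = mat_vec(P, v)      # strip components along earlier basis vectors
        basis.append(w)
        ww = dot(w, w)
        # subtract the rank-one projector (w w^T) / (w . w) from P
        proj = [[x * y / ww for y in w] for x in w]
        P = mat_sub(P, proj)
    return basis

# Illustrative input: two independent vectors in R^3.
basis = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
print(basis)
```

Each pass replaces the usual sum of subtracted projections with a single matrix-vector product \(Pv\), which is the reformulation the problem asks for.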

4. Show that if \(v_{1}, \cdots , v_{k}\) are linearly independent, then the matrix \(M = (v_{1} \cdots v_{k})\) is not necessarily invertible but the matrix \(M^{T}M\) is invertible.
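A small numerical illustration of what the problem claims, using \(k = 2\) independent vectors in \(\mathbb{R}^{3}\) (chosen here as an assumption): \(M\) is \(3 \times 2\), so it is not even square and cannot be invertible, yet the \(2 \times 2\) matrix \(M^{T}M\), whose entries are the dot products \(v_{i} \cdot v_{j}\), has nonzero determinant.

```python
# Problem 4 illustration: M^T M (the Gram matrix) is invertible even
# though M itself is 3x2. The vectors v1, v2 are illustrative.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v1 = [1.0, 0.0, 2.0]
v2 = [0.0, 1.0, 1.0]   # linearly independent of v1

# M = (v1 v2) has these as columns, so M^T M has entries vi . vj.
G = [[dot(v1, v1), dot(v1, v2)],
     [dot(v2, v1), dot(v2, v2)]]

# A 2x2 matrix is invertible exactly when its determinant is nonzero.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
print(det)
```

Problem 2 points the way to a proof: if \(M^{T}Mx = 0\), then \(x^{T}M^{T}Mx = |Mx|^{2} = 0\), so \(Mx = 0\), and independence of the columns forces \(x = 0\).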

## Contributor

David Cherney, Tom Denton, and Andrew Waldron (UC Davis)