
17.2: Review Problems

1.    Let \(L \colon U \to V\) be a linear transformation.  Suppose \(v \in L(U)\) and you have found a vector \(u_{ps}\) that obeys \(L(u_{ps}) = v\).

Explain why you need to compute \(\ker L\) to describe the solution set of the linear system \(L(u) = v\).
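(For orientation, one computation that uses only the linearity of \(L\), offered as a reminder rather than the full explanation: if \(u_{h} \in \ker L\), then \(L(u_{ps} + u_{h}) = L(u_{ps}) + L(u_{h}) = v + 0 = v\), so \(u_{ps} + u_{h}\) also solves the system.)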

 

 

2.    Suppose that \(M\) is an \(m \times n\) matrix with trivial kernel.  Show that for any vectors \(u\) and \(v\) in \(\mathbb{R}^{n}\):

a)    \(u^{T}M^{T}Mv = v^{T}M^{T}Mu\).

b)    \(v^{T}M^{T}Mv \geq 0\).  In case you are concerned (you don't need to be), and for future reference, the notation \(v \geq 0\) means that each component \(v^{i} \geq 0\).

(\(\textit{Hint:}\) Think about the dot product in \(\mathbb{R}^{n}\).)
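A quick numerical sanity check of parts a) and b), written as a sketch in Python with numpy; the particular \(5 \times 3\) matrix and the two vectors are made-up illustrations, not part of the problem:

    import numpy as np

    # Hypothetical example: a random 5 x 3 matrix M (m = 5, n = 3), which has
    # trivial kernel with probability 1, and two random vectors u, v in R^3.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 3))
    u = rng.standard_normal(3)
    v = rng.standard_normal(3)

    A = M.T @ M                              # the n x n matrix M^T M

    # a) symmetry: u^T M^T M v = v^T M^T M u
    print(np.isclose(u @ A @ v, v @ A @ u))  # True

    # b) positivity: v^T M^T M v = (Mv) . (Mv) = |Mv|^2 >= 0
    print(v @ A @ v >= 0)                    # True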

 

 

3.    Rewrite the Gram-Schmidt algorithm in terms of projection matrices.
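For concreteness, here is a minimal Python/numpy sketch of one way the classical Gram-Schmidt procedure can be phrased with projection matrices; it is an illustration of the idea, not the required answer. At each step, the next input vector is multiplied by the projector onto the orthogonal complement of the vectors already produced.

    import numpy as np

    def gram_schmidt_projections(vectors):
        """Orthogonalize the columns of `vectors` (assumed linearly independent)."""
        n, k = vectors.shape
        P = np.eye(n)                  # projector onto the orthogonal complement so far
        basis = []
        for j in range(k):
            w = P @ vectors[:, j]      # strip off components along earlier vectors
            basis.append(w)
            P = P - np.outer(w, w) / (w @ w)   # fold the new direction into the projector
        return np.column_stack(basis)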

 

 

4.    Show that if \(v_{1}, \ldots, v_{k}\) are linearly independent, then the matrix \(M = (v_{1} \, \cdots \, v_{k})\) is not necessarily invertible, but the matrix \(M^{T}M\) is invertible.
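As a hypothetical numerical illustration (a concrete \(4 \times 2\) example, not a proof): with two linearly independent columns, \(M\) is not square, so it cannot be invertible, yet \(M^{T}M\) is an invertible \(2 \times 2\) matrix.

    import numpy as np

    v1 = np.array([1.0, 0.0, 1.0, 0.0])
    v2 = np.array([0.0, 1.0, 1.0, 0.0])   # v1, v2 are linearly independent
    M = np.column_stack([v1, v2])         # 4 x 2, so M has no inverse
    MtM = M.T @ M                         # 2 x 2
    print(np.linalg.det(MtM))             # nonzero (= 3), so M^T M is invertible
    print(np.linalg.inv(MtM))             # the inverse exists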

 

 
