$$\newcommand{\id}{\mathrm{id}}$$ $$\newcommand{\Span}{\mathrm{span}}$$ $$\newcommand{\kernel}{\mathrm{null}\,}$$ $$\newcommand{\range}{\mathrm{range}\,}$$ $$\newcommand{\RealPart}{\mathrm{Re}}$$ $$\newcommand{\ImaginaryPart}{\mathrm{Im}}$$ $$\newcommand{\Argument}{\mathrm{Arg}}$$ $$\newcommand{\norm}{\| #1 \|}$$ $$\newcommand{\inner}{\langle #1, #2 \rangle}$$ $$\newcommand{\Span}{\mathrm{span}}$$

# 16.2: Review Problems


1. Consider an arbitrary matrix $$M:\Re^{m}\to \Re^{n}.$$

a) Argue that $$Mx=0$$ if and only if $$x$$ is perpendicular to all columns of $$M^{T}$$.

b) Argue that $$Mx=0$$ if and only if $$x$$ is perpendicular to all linear combinations of the columns of $$M^{T}$$.

c) Argue that $$\ker M$$ is perpendicular to $${\rm ran} \,M^{T}$$.

d) Argue further that $$\Re^{m}=\ker M \oplus {\rm ran}\, M^{T}$$.

e) Argue analogously that $$\Re^{n}=\ker M^{T} \oplus {\rm ran}\, M$$.

The equations in the last two parts describe how a linear transformation $$M:\Re^{m}\to \Re^{n}$$ determines orthogonal decompositions of both its domain and target. This result sometimes goes by the humble name $$\textit{The Fundamental Theorem of Linear Algebra}$$.
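These decompositions can be checked numerically for any particular matrix. Below is a minimal sketch using NumPy's singular value decomposition; the sample $$3\times 4$$ matrix is our own arbitrary choice, not one taken from the problems.

```python
import numpy as np

# Sample matrix mapping R^4 -> R^3 (an arbitrary choice for illustration).
M = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

# The SVD gives orthonormal bases adapted to M: the first `rank` rows of
# Vt span ran M^T (the row space), and the remaining rows span ker M.
U, s, Vt = np.linalg.svd(M)
rank = int(np.sum(s > 1e-10))
row_space = Vt[:rank]    # basis for ran M^T
null_space = Vt[rank:]   # basis for ker M

# The dimensions add up to m = 4, and the two subspaces are orthogonal,
# as parts c) and d) assert.
print(rank + null_space.shape[0])                  # 4
print(np.allclose(row_space @ null_space.T, 0))    # True
```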

2. Let $$L \colon V\rightarrow W$$ be a linear transformation. Show that $$\ker L=\{0_{V}\}$$ if and only if $$L$$ is one-to-one:

a) $$\textit{(Trivial kernel} \Rightarrow \textit{injective.)}$$ Suppose that $$\ker L=\{0_V\}$$. Show that $$L$$ is one-to-one. Think about methods of proof: does a proof by contradiction, a proof by induction, or a direct proof seem most appropriate?

b) $$\textit{(Injective} \Rightarrow \textit{trivial kernel.)}$$ Now suppose that $$L$$ is one-to-one. Show that $$\ker L=\{0_{V}\}$$. That is, show that $$0_{V}$$ is in $$\ker L$$, and then show that there are no other vectors in $$\ker L$$.

3. Let $$\{v_{1}, \ldots, v_{n} \}$$ be a basis for $$V$$. Carefully explain why
$L(V)= \Span \{Lv_{1}, \ldots, Lv_{n} \}.$

4. Suppose $$L \colon \Re^{4}\rightarrow \Re^{3}$$ is the linear transformation whose matrix $$M$$ in the standard basis is row equivalent to the following matrix:
$\begin{pmatrix} 1&0&0&\!-1\\ 0&1&0&1\\ 0&0&1&1\\ \end{pmatrix}={\rm RREF}(M)\sim M.$

a) $$\textit{Explain}$$ why the first three columns of the original matrix $$M$$ form a basis for $$L(\Re^{4})$$.

b) $$\textit{Find and describe}$$ an algorithm ($$\textit{i.e.}$$, a general procedure) for computing a basis for $$L(\Re^{n})$$ when $$L \colon \Re^{n}\rightarrow \Re^{m}$$.

c) $$\textit{Use}$$ your algorithm to find a basis for $$L(\Re^{4})$$ when $$L \colon \Re^{4} \to \Re^{3}$$ is the linear transformation whose matrix $$M$$ in the standard basis is
$\begin{pmatrix} 2&1&1&4\\ 0&1&0&5\\ 4&1&1&6\\ \end{pmatrix}.$
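The algorithm from part b) can be carried out by a computer algebra system, which is a useful way to check your hand computation. A sketch with SymPy, applied to the matrix from part c): row reduce, read off the pivot columns, and keep the corresponding columns of the original matrix.

```python
from sympy import Matrix

# The matrix of L from part c).
M = Matrix([[2, 1, 1, 4],
            [0, 1, 0, 5],
            [4, 1, 1, 6]])

# rref() returns the reduced matrix together with the pivot column
# indices; the corresponding columns of the *original* M form a basis
# for L(R^4).
_, pivots = M.rref()
basis = [M.col(j) for j in pivots]
print(pivots)                       # (0, 1, 2)
print([list(v) for v in basis])     # [[2, 0, 4], [1, 1, 1], [1, 0, 1]]
```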

5. Claim:
If $$\{v_{1}, \ldots, v_{n} \}$$ is a basis for $$\ker L$$, where $$L \colon V\rightarrow W$$, then it is always possible to extend this set to a basis for $$V$$.
Choose some simple yet non-trivial linear transformations with non-trivial kernels and verify the above claim for those transformations.
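One way to verify the claim for a concrete example is to compute a kernel basis, append the standard basis vectors, and keep the pivot columns. A sketch with SymPy; the projection map below is our own simple choice of transformation.

```python
from sympy import Matrix, eye

# A simple transformation with non-trivial kernel (our choice):
# L : R^3 -> R^3, projection onto the x-axis.
M = Matrix([[1, 0, 0],
            [0, 0, 0],
            [0, 0, 0]])

ker_basis = M.nullspace()   # here: the span of e_2 and e_3

# Append the standard basis and keep the pivot columns: this extends
# the kernel basis to a basis for all of R^3, as the claim predicts.
cols = Matrix.hstack(*ker_basis, eye(3))
_, pivots = cols.rref()
extended = [cols.col(j) for j in pivots]
print(len(extended))        # 3, a full basis for R^3
```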

6. Let $$P_{n}(x)$$ be the space of polynomials in $$x$$ of degree less than or equal to $$n$$, and consider the derivative operator $$\frac{d}{dx}:P_{n}(x)\to P_{n}(x)\, .$$
Find the dimension of the kernel and image of this operator. What happens if the target space is changed to $$P_{n-1}(x)$$ or $$P_{n+1}(x)$$?
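As a check on the first part, one can write $$\frac{d}{dx}$$ as a matrix in the monomial basis of $$P_{n}(x)$$ and compute its rank; the choice $$n=3$$ below is just a sample, and the pattern is the same for any $$n$$.

```python
from sympy import symbols, diff, Matrix

x = symbols('x')
n = 3   # sample degree (our choice)

# Matrix of d/dx on P_n(x) in the monomial basis {1, x, ..., x^n}:
# entry (k, j) is the coefficient of x^k in d/dx applied to x^j.
basis = [x**k for k in range(n + 1)]
D = Matrix([[diff(b, x).coeff(x, k) for b in basis] for k in range(n + 1)])

# rank = dim of the image = n, so dim ker = (n + 1) - n = 1
# (the constant polynomials).
print(D.rank())   # 3
```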

Now consider $$P_{2}(x,y)$$, the space of polynomials of degree two or less in $$x$$ and $$y$$. (Recall how degree is counted; $$xy$$ is degree two, $$y$$ is degree one, and $$x^{2}y$$ is degree three, for example.) Let
$$L:=\frac{\partial}{\partial x}+\frac{\partial}{\partial y}:P_{2}(x,y) \to P_{2}(x,y).$$
(For example, $$L(xy)=\frac{\partial}{\partial x}(xy)+\frac{\partial}{\partial y}(xy) = y+x$$.) Find a basis for the kernel of $$L$$. Verify the dimension formula in this case.
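A candidate kernel basis (our guess, which your solution should justify) is $$\{1,\ x-y,\ (x-y)^{2}\}$$; SymPy can at least confirm that these polynomials lie in the kernel.

```python
from sympy import symbols, diff, simplify

x, y = symbols('x y')

# L = d/dx + d/dy acting on polynomials in x and y.
L = lambda p: diff(p, x) + diff(p, y)

# Candidate kernel elements (our assumption): polynomials in x - y.
candidates = [1, x - y, (x - y)**2]
print([simplify(L(p)) for p in candidates])   # [0, 0, 0]

# Since dim P_2(x, y) = 6 and these three are independent, the
# dimension formula predicts dim L(P_2) = 6 - 3 = 3.
```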

7.
Let's demonstrate some ways the dimension formula can break down if a vector space is infinite-dimensional:

a) Let $$\mathbb{R}[x]$$ be the vector space of all polynomials in the variable $$x$$ with real coefficients. Let $$D = \frac{d}{dx}$$ be the usual derivative operator. Show that the range of $$D$$ is $$\mathbb{R}[x]$$. What is $$\ker D$$? $$\textit{Hint: Use the basis}\ \{ x^{n} \mid n \in \mathbb{N} \}.$$

b) Let $$L \colon \mathbb{R}[x] \rightarrow \mathbb{R}[x]$$ be the linear map $$L(p(x)) = xp(x)\, .$$ What are the kernel and range of $$L$$?

c) Let $$V$$ be an infinite-dimensional vector space and let $$L \colon V \rightarrow V$$ be a linear operator. Suppose that $$\dim \ker L < \infty$$; show that $$\dim L(V)$$ is infinite. Also show that when $$\dim L(V) < \infty$$, $$\dim \ker L$$ is infinite.

8. This problem answers the question, "If I choose a bit vector $$\textit{at random}$$, what is the probability that it lies in the span of some other vectors?"

i. Given a collection $$S$$ of $$k$$ bit vectors in $$B^{3}$$, consider the bit matrix $$M$$ whose columns are the vectors in $$S$$. Show that $$S$$ is linearly independent if and only if the kernel of $$M$$ is trivial, namely the set $$\ker M=\{v\in B^{3} \mid Mv=0\}$$ contains only the zero vector.

ii. Give some method for choosing a random bit vector $$v$$ in $$B^{3}$$. Suppose $$S$$ is a collection of $$2$$ linearly independent bit vectors in $$B^{3}$$. How can we tell whether $$S\cup \{v\}$$ is linearly independent? Do you think it is likely or unlikely that $$S\cup \{v\}$$ is linearly independent? Explain your reasoning.
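For the likelihood question, brute-force enumeration settles the matter for one particular independent pair (the set $$S$$ below is our own choice); all arithmetic is mod 2.

```python
from itertools import product

# Two linearly independent bit vectors in B^3 (our choice of S).
S = [(1, 0, 0), (0, 1, 0)]

# The span of S: all linear combinations a*s + b*t with a, b in {0, 1}.
span_S = {tuple((a*s + b*t) % 2 for s, t in zip(*S))
          for a in (0, 1) for b in (0, 1)}

# A random v keeps S u {v} independent exactly when v lies outside span S.
outside = [v for v in product((0, 1), repeat=3) if v not in span_S]
print(len(outside), "of 8 choices keep S independent")   # 4 of 8
```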

iii. If $$P$$ is the characteristic polynomial of a $$3 \times 3$$ bit matrix, what must the degree of $$P$$ be? Given that each coefficient must be either 0 or 1, how many possibilities are there for $$P$$? How many of these possible characteristic polynomials have $$0$$ as a root? If $$M$$ is a $$3 \times 3$$ bit matrix chosen at random, what is the probability that it has 0 as an eigenvalue? (Assume that you are choosing a random matrix $$M$$ in such a way as to make each characteristic polynomial equally likely.) What is the probability that the columns of $$M$$ form a basis for $$B^{3}$$? $$(\textit{Hint: what is the relationship between the kernel of}\ M\ \textit{and its eigenvalues?})$$
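For comparison, one can also count the invertible $$3\times 3$$ bit matrices directly by brute force over all $$2^{9}=512$$ of them. Note that this weights *matrices* uniformly, whereas the problem asks you to weight characteristic polynomials uniformly, so the two probabilities need not agree.

```python
from itertools import product

def det_mod2(m):
    # Cofactor expansion of a 3x3 determinant, reduced mod 2;
    # m is a flat 9-tuple read row by row.
    a, b, c, d, e, f, g, h, i = m
    return (a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)) % 2

# A bit matrix is invertible over B exactly when its determinant is 1
# mod 2, so summing the determinants counts the invertible matrices.
invertible = sum(det_mod2(m) for m in product([0, 1], repeat=9))
print(invertible, "/ 512")   # 168 / 512
```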

Note: We could ask the same question for real vectors: if I choose a real vector at random, what is the probability that it lies in the span of some other vectors? In fact, once we write down a reasonable way of choosing a random real vector, the probability that a vector chosen at random in $$\Re^{n}$$ lies in the span of $$n-1$$ other real vectors is zero!