# 15.1: Review Problems

1. (On Reality of Eigenvalues)

a) Suppose \(z=x+iy\) where \(x,y \in \Re, i=\sqrt{-1}\), and \(\overline{z}=x-iy\). Compute \(z\overline{z}\) and \(\overline{z}z\) in terms of \(x\) and \(y\). What kind of numbers are \(z \overline{z}\) and \(\overline{z}z\)? (The complex number \(\overline{z}\) is called the \(\textit{complex conjugate}\) of \(z\)).

b) Suppose that \(\lambda=x+iy\) is a complex number with \(x,y \in \Re\), and that \(\lambda=\overline{\lambda}\). Does this determine the value of \(x\) or \(y\)? What kind of number must \(\lambda\) be?

c) Let \(x=\begin{pmatrix}z^{1}\\ \vdots \\ z^{n}\end{pmatrix}\in \mathbb{C}^{n}\). Let \(x^{\dagger}=\begin{pmatrix}\overline{z^{1}}& \cdots & \overline{z^{n}}\end{pmatrix} \in \mathbb{C}^{n}\) (a \(1 \times n\) complex matrix or a row vector). Compute \(x^{\dagger} x\). Using the result of part 1a, what can you say about the number \(x^{\dagger} x\)? (\(\textit{E.g.,}\) is it real, imaginary, positive, negative, etc.)
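The computation in this part can be sanity-checked numerically. A minimal sketch, assuming NumPy; the particular vector \(x\) is made up for illustration:

```python
import numpy as np

# Check that x†x is purely real and non-negative for a complex vector x.
x = np.array([1 + 2j, 3 - 1j, -2j])   # an arbitrary x in C^3 (assumed example)
x_dagger = x.conj().T                 # row of complex conjugates

inner = x_dagger @ x                  # sum of z̄^i z^i = sum of |z^i|^2
print(inner)                          # (19+0j): real part 5 + 10 + 4, imaginary part 0
```

This matches part 1a: each term \(\overline{z^{i}}z^{i}=|z^{i}|^{2}\) is a non-negative real number.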

d) Suppose \(M=M^{T}\) is an \(n\times n\) symmetric matrix with real entries. Let \(\lambda\) be an eigenvalue of \(M\) with eigenvector \(x\), so \(Mx=\lambda x\). Compute:

\[\frac{x^{\dagger} Mx}{x^{\dagger} x}\]

e) Suppose \(\Lambda\) is a \(1\times 1\) matrix. What is \(\Lambda^{T}\)?

f) What is the size of the matrix \(x^{\dagger} Mx\)?

g) For any matrix (or vector) \(N\), we can compute \(\overline{N}\) by applying complex conjugation to each entry of \(N\). Compute \(\overline{(x^{\dagger})^{T}}\). Then compute \(\overline{(x^{\dagger} M x)^{T}}\). Note that for matrices, \(\overline{AB + C} = \overline{A}\,\overline{B} + \overline{C}\).

h) Show that \(\lambda=\overline{\lambda}\). Using the result of a previous part of this problem, what does this say about \(\lambda\)?
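The chain of reasoning in parts (d)–(h) can be checked numerically: for a real symmetric \(M\) with eigenvector \(x\), the quotient \(x^{\dagger}Mx/x^{\dagger}x\) equals \(\lambda\) and is forced to be real. A sketch, assuming NumPy and a made-up example matrix:

```python
import numpy as np

# M = M^T with real entries (an assumed example, not from the problem).
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(M)
x = eigvecs[:, 0].astype(complex)     # an eigenvector of M

# The quotient from part (d); its imaginary part vanishes, so lambda is real.
ratio = (x.conj() @ M @ x) / (x.conj() @ x)
print(ratio)                          # real, and equal to an eigenvalue of M
```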

2. Let $$x_{1}=\begin{pmatrix}a\\ b \\ c\end{pmatrix}\, ,$$ where \(a^{2}+b^{2}+c^{2}=1\). Find vectors \(x_{2}\) and \(x_{3}\) such that \(\{x_{1},x_{2},x_{3}\}\) is an orthonormal basis for \(\Re^{3}\). What can you say about the matrix \(P\) whose columns are the vectors \(x_{1}\), \(x_{2}\) and \(x_{3}\) that you found?
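One way to carry out this construction numerically is Gram–Schmidt, which NumPy's QR factorization performs for us. A sketch with assumed values \(a=\tfrac13\), \(b=c=\tfrac23\):

```python
import numpy as np

# A unit vector x1 (assumed example with a^2 + b^2 + c^2 = 1).
x1 = np.array([1.0, 2.0, 2.0]) / 3.0

# Append the standard basis and orthonormalize; the first column of Q
# spans the same line as x1, and the columns of Q are orthonormal.
candidates = np.column_stack([x1, np.eye(3)])
Q, _ = np.linalg.qr(candidates)

P = Q[:, :3]                          # columns: x1 (up to sign), x2, x3
print(np.allclose(P.T @ P, np.eye(3)))   # True: P is an orthogonal matrix
```

The check \(P^{T}P=I\) is the observation the problem is after: a matrix whose columns are orthonormal is orthogonal.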

3. Let \(V\) be a vector space with \({\rm dim}\, V=n\), let \(v\in V\) with \(v\neq0\), and let \(L\colon V\stackrel{\rm linear}{\longrightarrow}V\).

a) Explain why the list of vectors \((v,Lv,L^{2}v,\ldots,L^{n} v)\) is linearly dependent.

b) Explain why there exist scalars \(\alpha_{i}\) not all zero such that

$$\alpha_{0} v + \alpha_{1} L v+\alpha_{2} L^{2} v+\cdots + \alpha_{n} L^{n} v=0\, .$$

c) Let \(m\) be the largest integer such that \(\alpha_{m}\neq0\) and $$p(z)=\alpha_{0}+ \alpha_{1} z + \alpha_{2} z^{2}+\cdots + \alpha_{m} z^{m}\, .$$

Explain why the polynomial \(p(z)\) can be written as

$$p(z)=\alpha_{m} (z-\lambda_{1})(z-\lambda_{2})\ldots(z-\lambda_{m})\, .$$

[Note that some of the roots \(\lambda_{i}\) could be complex.]

d) Why does the following equation hold

$$(L-\lambda_{1})(L-\lambda_{2})\ldots(L-\lambda_{m})v=0\, ?$$

e) Explain why one of the numbers \(\lambda_{i}\) (\(1\leq i\leq m\)) must be an eigenvalue of \(L\).
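The dependence in part (b) can be exhibited concretely: stack \(v, Lv, \ldots, L^{n}v\) as columns of an \(n\times(n+1)\) matrix, whose null space must be nontrivial. A sketch, assuming NumPy and a random example with \(n=2\):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2
L = rng.standard_normal((n, n))       # a random linear map (assumed example)
v = rng.standard_normal(n)            # a nonzero vector

# n + 1 vectors in an n-dimensional space: the columns must be dependent.
columns = np.column_stack([np.linalg.matrix_power(L, k) @ v
                           for k in range(n + 1)])

# A nonzero null vector of this matrix supplies the coefficients alpha_i;
# the last right-singular vector corresponds to a zero singular value.
_, s, Vt = np.linalg.svd(columns)
alpha = Vt[-1]                        # alpha_0, ..., alpha_n, not all zero

residual = columns @ alpha            # alpha_0 v + alpha_1 Lv + ... + alpha_n L^n v
print(np.allclose(residual, 0))       # True
```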

4. (Dimensions of Eigenspaces)

a) Let $$A=\begin{pmatrix}4 & 0 & 0 \\0 & 2 & -2 \\0 & -2 & 2 \\\end{pmatrix}\, .$$

Find all eigenvalues of \(A.\)

b) Find a basis for each eigenspace of \(A.\) What is the sum of the dimensions of the eigenspaces of \(A\)?

c) Based on your answer to the previous part, guess a formula for the sum of the dimensions of the eigenspaces of a real \(n \times n\) symmetric matrix. Explain why your formula must work for any real \(n \times n\) symmetric matrix.
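A numerical check of parts (a)–(b), assuming NumPy; `eigh` is NumPy's solver for symmetric matrices and returns eigenvalues in ascending order:

```python
import numpy as np

A = np.array([[4.0, 0.0, 0.0],
              [0.0, 2.0, -2.0],
              [0.0, -2.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(A)  # symmetric eigensolver
print(eigvals)                        # ascending: 0 once, 4 twice

# The columns of eigvecs form an orthonormal basis of eigenvectors,
# so the eigenspace dimensions sum to n = 3.
print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))   # True
```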

5. If \(M\) is not square then it cannot be symmetric. However, \(MM^{T}\) and \(M^{T}M\) are symmetric, and therefore diagonalizable.

a) Is it the case that all of the eigenvalues of \(MM^{T}\) must also be eigenvalues of \(M^{T}M\)?

b) Given an eigenvector of \(MM^{T}\) how can you obtain an eigenvector of \(M^{T}M\)?

c) Let $$M=\begin{pmatrix}1&2\\3&3\\2&1\end{pmatrix}\, .$$

Compute an orthonormal basis of eigenvectors for both \(MM^{T}\) and \(M^{T}M\). If any of the eigenvalues for these two matrices agree, choose an order for them and use it to help order your orthonormal bases. Finally, change the input and output bases for the matrix \(M\) to these ordered orthonormal bases. Comment on what you find. (\(\textit{Hint:}\) The result is called \(\textit{the Singular Value Decomposition Theorem}\).)
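The relationship the problem is driving at can be previewed numerically: the shared nonzero eigenvalues of \(MM^{T}\) and \(M^{T}M\) are the squared singular values of \(M\). A sketch, assuming NumPy:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 3.0],
              [2.0, 1.0]])

U, s, Vt = np.linalg.svd(M)           # M = U diag(s) V^T
print(s**2)                           # approximately [27, 1]: eigenvalues of M^T M

# MM^T has the same two nonzero eigenvalues, plus an extra 0
# (it is 3x3 but has rank 2).
print(np.linalg.eigvalsh(M @ M.T))    # approximately [0, 1, 27]
```

Note this is only a check against a library routine; the problem asks you to construct the decomposition by hand from the eigenvector bases.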

### Contributor

David Cherney, Tom Denton, and Andrew Waldron (UC Davis)