
8.6E: The Singular Value Decomposition Exercises



    If \(ACA=A\) show that \(B=CAC\) is a middle inverse for \(A\).
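    For reference, a one-line check using only the hypothesis \(ACA=A\):

    \[ABA=A(CAC)A=(ACA)CA=ACA=A, \nonumber \]

    so \(B=CAC\) satisfies the defining condition \(ABA=A\) of a middle inverse for \(A\).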

    For any matrix \(A\) show that

    \[\Sigma_{A^{T}}=(\Sigma_{A})^{T} \nonumber \]
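    A sketch of one approach: if \(A=U\Sigma _{A}V^{T}\) is an SVD for \(A\) (with \(U\) and \(V\) orthogonal), then transposing gives \(A^{T}=V\Sigma _{A}^{T}U^{T}\), which has the form of an SVD for \(A^{T}\); its middle factor is therefore \(\Sigma _{A^{T}}\), so \(\Sigma _{A^{T}}=(\Sigma _{A})^{T}\).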

    If \(A\) is \(m\times n\) with all singular values positive, what is \(\text{rank } A\)?

    If \(A\) has singular values \(\sigma_{1},\dots ,\sigma_{r}\), what are the singular values of:

    1. \(A^{T}\)
    2. \(tA\) where \(t>0\) is real
    3. \(A^{-1}\), assuming \(A\) is invertible.

    2. \(t\sigma _{1},\dots ,t\sigma _{r}.\)

    If \(A\) is square show that \(| \det A |\) is the product of the singular values of \(A\).
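    A short argument along these lines: the eigenvalues of \(A^{T}A\) are \(\sigma _{1}^{2},\dots ,\sigma _{n}^{2}\), so

    \[(\det A)^{2}=\det (A^{T}A)=\sigma _{1}^{2}\sigma _{2}^{2}\cdots \sigma _{n}^{2}, \nonumber \]

    and taking square roots gives \(|\det A|=\sigma _{1}\sigma _{2}\cdots \sigma _{n}\).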

    If \(A\) is square and real, show that \(A=0\) if and only if every eigenvalue of \(A^T A\) is \(0\).
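    One possible route: if every eigenvalue of \(A^{T}A\) is \(0\), then, since \(A^{T}A\) is symmetric and hence orthogonally diagonalizable, \(A^{T}A=0\). Then for every \(\mathbf{x}\),

    \[\|A\mathbf{x}\|^{2}=\mathbf{x}^{T}A^{T}A\mathbf{x}=0, \nonumber \]

    so \(A\mathbf{x}=\mathbf{0}\) for all \(\mathbf{x}\) and \(A=0\). The converse is immediate.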

    Given an SVD for an invertible matrix \(A\), find one for \(A^{-1}\). How are \(\Sigma_{A}\) and \(\Sigma_{A^{-1}}\) related?

    If \(A=U\Sigma V^{T}\) then \(\Sigma\) is invertible, so \(A^{-1}=V\Sigma^{-1}U^{T}\) is an SVD.
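    As for how \(\Sigma _{A}\) and \(\Sigma _{A^{-1}}\) are related, one way to phrase it: the diagonal entries of \(\Sigma ^{-1}\) are \(\frac{1}{\sigma _{1}},\dots ,\frac{1}{\sigma _{n}}\), so the singular values of \(A^{-1}\) are the reciprocals of those of \(A\); after reordering them into decreasing order, \(\Sigma _{A^{-1}}\) is \(\Sigma _{A}^{-1}\) with its diagonal reversed.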

    Let \(A^{-1}=A=A^{T}\) where \(A\) is \(n\times n\). Given any orthogonal \(n\times n\) matrix \(U\), find an orthogonal matrix \(V\) such that \(A=U\Sigma_{A}V^{T}\) is an SVD for \(A\).

    If \(A= \left[ \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right]\) do this for:

    1. \(U=\frac{1}{5} \left[ \begin{array}{rr} 3 & -4 \\ 4 & 3 \end{array} \right]\)
    2. \(U=\frac{1}{\sqrt{2}} \left[ \begin{array}{rr} 1 & -1 \\ 1 & 1 \end{array} \right]\)

    2. First \(A^{T}A=I_{n}\) so \(\Sigma _{A}=I_{n}\). For the \(U\) in part 2:

      \[\begin{aligned} A &= \frac{1}{\sqrt{2}}\left[ \begin{array}{rr} 1 & -1 \\ 1 & 1 \end{array} \right] \left[ \begin{array}{rr} 1 & 0 \\ 0 & 1 \end{array} \right] \frac{1}{\sqrt{2}} \left[ \begin{array}{rr} 1 & 1 \\ 1 & -1 \end{array} \right] \\ &= \frac{1}{2}\left[ \begin{array}{rr} 1 & -1 \\ 1 & 1 \end{array} \right] \left[ \begin{array}{rr} 1 & 1 \\ 1 & -1 \end{array} \right] \\ &= \left[ \begin{array}{rr} 0 & 1 \\ 1 & 0 \end{array} \right] \end{aligned} \nonumber \]
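    The third factor above comes from a general recipe worth recording: since \(\Sigma _{A}=I_{n}\), the equation \(A=U\Sigma _{A}V^{T}=UV^{T}\) forces \(V^{T}=U^{T}A\), that is \(V=A^{T}U=AU\), and \(V\) is orthogonal as a product of orthogonal matrices. As a sketch of part 1 by the same recipe:

    \[V=AU=\frac{1}{5}\left[ \begin{array}{rr} 4 & 3 \\ 3 & -4 \end{array} \right], \qquad A=U\,I_{2}\,V^{T}. \nonumber \]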

    Find an SVD for the following matrices:

    1. \(A= \left[ \begin{array}{rr} 1 & -1 \\ 0 & 1 \\ 1 & 0 \end{array} \right]\)
    2. \(A= \left[ \begin{array}{rrr} 1 & 1 & 1 \\ -1 & 0 & -2 \\ 1 & 2 & 0 \end{array} \right]\)

    1. \[\frac{1}{5}\left[ \begin{array}{rr} 3 & 4 \\ 4 & -3 \end{array} \right] \left[ \begin{array}{rrrr} 20 & 0 & 0 & 0 \\ 0 & 10 & 0 & 0 \end{array} \right] \frac{1}{2} \left[ \begin{array}{rrrr} 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 \end{array} \right] \nonumber \]
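    For hand computations like these, a numerical cross-check can be reassuring. A minimal sketch in Python with NumPy (assumed available), applied to the matrix in part 1; note that numpy.linalg.svd may return different, but equally valid, orthogonal factors than a hand computation:

        import numpy as np

        # matrix from part 1 of the exercise above
        A = np.array([[1.0, -1.0],
                      [0.0,  1.0],
                      [1.0,  0.0]])

        # reduced SVD: U is 3x2, s holds the singular values, Vt is 2x2
        U, s, Vt = np.linalg.svd(A, full_matrices=False)

        print(s)                                          # approximately [1.732, 1.0]
        print(np.allclose(U @ np.diag(s) @ Vt, A))        # prints True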

    Find an SVD for \(A= \left[ \begin{array}{rr} 0 & 1 \\ -1 & 0 \end{array} \right]\).

    If \(A=U\Sigma V^{T}\) is an SVD for \(A\), find an SVD for \(A^{T}\).

    Let \(A\) be a real, \(m\times n\) matrix with positive singular values \(\sigma_{1},\sigma_{2},\dots ,\sigma_{r}\), and write

    \[s(x)=(x-\sigma_{1}^{2})(x-\sigma_{2}^{2})\cdots (x-\sigma_{r}^{2}) \nonumber \]

    1. Show that \(c_{A^{T}A}(x)=s(x)x^{n-r}\) and \(c_{AA^{T}}(x)=s(x)x^{m-r}\).
    2. If \(m\leq n\), conclude that \(c_{A^{T}A}(x)=x^{n-m}c_{AA^{T}}(x)\).

    If \(G\) is positive show that:

    1. \(rG\) is positive if \(r\geq 0\)
    2. \(G+H\) is positive for any positive \(H\).

    2. If \(\mathbf{x}\in \mathbb{R}^{n}\) then \(\mathbf{x}^{T}(G+H)\mathbf{x}=\mathbf{x}^{T}G\mathbf{x}+\mathbf{x}^{T}H\mathbf{x}\geq 0+0=0\).

    If \(G\) is positive and \(\lambda\) is an eigenvalue, show that \(\lambda \geq 0\).
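    A sketch: if \(G\mathbf{x}=\lambda \mathbf{x}\) with \(\mathbf{x}\neq \mathbf{0}\), then

    \[0\leq \mathbf{x}^{T}G\mathbf{x}=\lambda \,\mathbf{x}^{T}\mathbf{x}=\lambda \|\mathbf{x}\|^{2}, \nonumber \]

    and since \(\|\mathbf{x}\|^{2}>0\), it follows that \(\lambda \geq 0\).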

    If \(G\) is positive show that \(G=H^{2}\) for some positive matrix \(H\). [Hint: Preceding exercise and Lemma 8.6.5.]
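    A sketch of the intended construction, assuming the spectral theorem: write \(G=PDP^{T}\) with \(P\) orthogonal and \(D=\text{diag}(\lambda _{1},\dots ,\lambda _{n})\); each \(\lambda _{i}\geq 0\) by the preceding exercise, so

    \[H=P\,\text{diag}\left(\sqrt{\lambda _{1}},\dots ,\sqrt{\lambda _{n}}\right)P^{T} \nonumber \]

    is symmetric and positive, and \(H^{2}=PDP^{T}=G\).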

    If \(A\) is \(n\times n\) show that \(AA^{T}\) and \(A^{T}A\) are similar. [Hint: Start with an SVD for \(A\).]
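    Following the hint, a sketch: if \(A=U\Sigma V^{T}\) with \(U,V\) orthogonal and \(\Sigma\) square (because \(A\) is \(n\times n\)), then

    \[AA^{T}=U\Sigma \Sigma ^{T}U^{T}\quad \text{and}\quad A^{T}A=V\Sigma ^{T}\Sigma V^{T}, \nonumber \]

    and \(\Sigma \Sigma ^{T}=\Sigma ^{T}\Sigma\) since \(\Sigma\) is square diagonal; hence \((VU^{T})(AA^{T})(VU^{T})^{T}=A^{T}A\), so the two matrices are similar.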

    Find \(A^{+}\) if:

    1. \(A= \left[ \begin{array}{rr} 1 & 2 \\ -1 & -2 \end{array} \right]\)
    2. \(A= \left[ \begin{array}{rr} 1 & -1 \\ 0 & 0 \\ 1 & -1 \end{array} \right]\)

    2. \(\left[ \begin{array}{rrr} \frac{1}{4} & 0 & \frac{1}{4} \\ -\frac{1}{4} & 0 & -\frac{1}{4} \end{array} \right]\)
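    As with the SVD computations, a quick numerical cross-check is possible. A minimal sketch in Python with NumPy (assumed available), comparing numpy.linalg.pinv against the answer to part 2 above:

        import numpy as np

        # matrix from part 2 of the exercise above
        A = np.array([[1.0, -1.0],
                      [0.0,  0.0],
                      [1.0, -1.0]])

        # answer quoted above
        expected = np.array([[ 0.25, 0.0,  0.25],
                             [-0.25, 0.0, -0.25]])

        print(np.allclose(np.linalg.pinv(A), expected))   # prints True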

    Show that \((A^{+})^{T}=(A^{T})^{+}\).
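    One possible outline: if \(A=U\Sigma V^{T}\), then \(A^{+}=V\Sigma ^{\prime}U^{T}\), where \(\Sigma ^{\prime}\) is \(\Sigma ^{T}\) with each nonzero \(\sigma _{i}\) replaced by \(\frac{1}{\sigma _{i}}\). Transposing gives

    \[(A^{+})^{T}=U(\Sigma ^{\prime})^{T}V^{T}, \nonumber \]

    which is precisely the pseudoinverse built in the same way from the SVD \(A^{T}=V\Sigma ^{T}U^{T}\); that is, \((A^{+})^{T}=(A^{T})^{+}\).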


    8.6E: The Singular Value Decomposition Exercises is shared under a not declared license and was authored, remixed, and/or curated by LibreTexts.
