14.5: QR Decomposition

In Chapter 7, Section 7.7, we learned how to solve linear systems by decomposing a matrix \(M\) into a product of lower and upper triangular matrices

    \[M=LU\, .\]

    The Gram-Schmidt procedure suggests another matrix decomposition,

    \[M=QR\, ,\]

where \(Q\) is an orthogonal matrix and \(R\) is an upper triangular matrix. So-called \(QR\) decompositions are useful for solving linear systems, eigenvalue problems, and least squares approximations. You can easily get the idea behind the \(QR\) decomposition by working through a simple example.
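
Before working the example, it is worth pausing to see why this helps with linear systems. Since an orthogonal matrix satisfies \(Q^{-1}=Q^{T}\), the system \(Mx=b\) becomes

\[
Rx=Q^{T}b\, ,
\]

where the right hand side is just a matrix product and the remaining upper triangular system can be solved by back substitution, just as the \(U\) part of an \(LU\) decomposition is handled.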

    Example \(\PageIndex{1}\):

    Find the \(QR\) decomposition of

    \[M=\begin{pmatrix}2&-1&1\\1&3&-2\\0&1&-2\end{pmatrix}\, .\]

We will think of the columns of \(M\) as three 3-vectors and use Gram-Schmidt to build an orthonormal basis from them; these orthonormal vectors become the columns of the orthogonal matrix \(Q\). We will use the matrix \(R\) to record the steps of the Gram-Schmidt procedure in such a way that the product \(QR\) equals \(M\).

    To begin with we write

    \[
    M=\begin{pmatrix}2&-\frac{7}{5}&1\\1&\frac{14}{5}&-2\\0&1&-2\end{pmatrix}
    \begin{pmatrix}1&\frac{1}{5}&0\\0&1&0\\0&0&1\end{pmatrix}\, .
    \]

In the first matrix, the first two columns are orthogonal because we simply replaced the second column of \(M\) by the vector that the Gram-Schmidt procedure produces from the first two columns of \(M\), namely

    \[
    \begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}=\begin{pmatrix}-1\\3\\1\end{pmatrix}-\frac{1}{5}
    \begin{pmatrix} 2 \\1\\0\end{pmatrix}\, .
    \]
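
In case the coefficient \(\frac{1}{5}\) looks mysterious, it is the usual Gram-Schmidt ratio of dot products,

\[
\frac{\begin{pmatrix}2\\1\\0\end{pmatrix}\cdot\begin{pmatrix}-1\\3\\1\end{pmatrix}}{\begin{pmatrix}2\\1\\0\end{pmatrix}\cdot\begin{pmatrix}2\\1\\0\end{pmatrix}}
=\frac{-2+3+0}{4+1+0}=\frac{1}{5}\, .
\]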

The matrix on the right is almost the identity matrix, save for the \(+\frac{1}{5}\) in the second entry of the first row, whose effect upon multiplying the two matrices precisely undoes what we did to the second column of the first matrix. For the third column of \(M\) we use Gram-Schmidt to deduce the third orthogonal vector

\[\begin{pmatrix}-\frac{1}{6}\\\frac{1}{3}\\-\frac{7}{6}\end{pmatrix}=\begin{pmatrix}1\\-2\\-2\end{pmatrix}-0\begin{pmatrix}2\\1\\0\end{pmatrix}-\frac{-9}{\frac{54}{5}}\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}\, .\]
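
The coefficients here are again ratios of dot products: the first column of \(M\) is orthogonal to the third, while

\[
\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}\cdot\begin{pmatrix}1\\-2\\-2\end{pmatrix}=-\frac{7}{5}-\frac{28}{5}-2=-9
\qquad\mbox{and}\qquad
\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}\cdot\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}=\frac{49}{25}+\frac{196}{25}+1=\frac{54}{5}\, ,
\]

so the projection coefficient onto the second orthogonal vector is \(-\frac{9}{54/5}=-\frac{5}{6}\), the very entry that appears in the upper triangular factor below.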

Therefore, using exactly the same procedure as before, we write

    \[
    M=\begin{pmatrix}2&-\frac{7}{5}&-\frac{1}{6}\\1&\frac{14}{5}&\frac{1}{3}\\0&1&-\frac{7}{6}\end{pmatrix}
    \begin{pmatrix}1&\frac{1}{5}&0\\0&1&-\frac{5}{6}\\0&0&1\end{pmatrix}\, .
    \]

This is not quite the answer: the first matrix now has mutually orthogonal columns, but a \(\textit{bona fide}\) orthogonal matrix must have orthonormal columns. To achieve that we divide each column of the first matrix by its length and multiply the corresponding row of the second matrix by the same amount.
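
The three column lengths are

\[
\left\|\begin{pmatrix}2\\1\\0\end{pmatrix}\right\|=\sqrt{5}\, ,\qquad
\left\|\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}\right\|=\sqrt{\frac{54}{5}}=\frac{3\sqrt{30}}{5}\, ,\qquad
\left\|\begin{pmatrix}-\frac{1}{6}\\\frac{1}{3}\\-\frac{7}{6}\end{pmatrix}\right\|=\sqrt{\frac{3}{2}}=\frac{\sqrt{6}}{2}\, ,
\]

and they become the diagonal entries of the upper triangular factor: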

    \[
    M=\begin{pmatrix}\frac{2\sqrt{5}}{5}&-\frac{7\sqrt{30}}{90}&-\frac{\sqrt{6}}{18}\\
    \frac{\sqrt{5}}{5}&\frac{7\sqrt{30}}{45}&\frac{\sqrt{6}}{9}\\
    0&\frac{\sqrt{30}}{18}&-\frac{7\sqrt{6}}{18}\end{pmatrix}
    \begin{pmatrix}\sqrt{5}&\frac{\sqrt{5}}{5}&0\\
    0&\frac{3\sqrt{30}}{5}&-\frac{\sqrt{30}}{2}\\
    0&0&\frac{\sqrt{6}}{2}\end{pmatrix}=QR\, .
    \]

A nice check of this result is to verify that entry \((i,j)\) of the matrix \(R\) equals the dot product of the \(i\)-th column of \(Q\) with the \(j\)-th column of \(M\). (Some people memorize this fact and use it as a recipe for computing \(QR\) decompositions.) \(\textit{A good test of your own understanding is to work out why this is true!}\)
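
For readers who like to check such computations numerically, here is a minimal sketch using NumPy (any numerical environment would do just as well). It runs the same Gram-Schmidt procedure on the columns of \(M\), normalizing as it goes, and verifies that \(QR=M\), that \(Q^{T}Q=I\), and that entry \((i,j)\) of \(R\) is the dot product of the \(i\)-th column of \(Q\) with the \(j\)-th column of \(M\).

```python
# Numerical check of the worked example: classical Gram-Schmidt on the
# columns of M, recording the projection coefficients and lengths in R.
import numpy as np

M = np.array([[2., -1.,  1.],
              [1.,  3., -2.],
              [0.,  1., -2.]])

n = M.shape[1]
Q = np.zeros_like(M)
R = np.zeros((n, n))
for j in range(n):
    v = M[:, j].copy()
    for i in range(j):
        R[i, j] = Q[:, i] @ M[:, j]   # dot product of column i of Q with column j of M
        v -= R[i, j] * Q[:, i]        # subtract the projection onto the earlier orthonormal columns
    R[j, j] = np.linalg.norm(v)       # the length that normalizes the new column
    Q[:, j] = v / R[j, j]

assert np.allclose(Q @ R, M)            # M = QR
assert np.allclose(Q.T @ Q, np.eye(n))  # Q is orthogonal
assert np.allclose(Q.T @ M, R)          # entry (i,j) of R is (column i of Q) . (column j of M)
print(np.round(R, 4))
```

The printed matrix agrees with the \(R\) found above: its diagonal entries are \(\sqrt{5}\approx 2.2361\), \(\frac{3\sqrt{30}}{5}\approx 3.2863\), and \(\frac{\sqrt{6}}{2}\approx 1.2247\).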

    Contributor

    This page titled 14.5: QR Decomposition is shared under a not declared license and was authored, remixed, and/or curated by David Cherney, Tom Denton, & Andrew Waldron.