Mathematics LibreTexts

14.5: \(QR\) Decomposition

In Chapter 7, Section 7.7, we learned how to solve linear systems by decomposing a matrix \(M\) into a product of lower and upper triangular matrices

\[M=LU\, .\]

The Gram--Schmidt procedure suggests another matrix decomposition,

\[M=QR\, ,\]

where \(Q\) is an orthogonal matrix and \(R\) is an upper triangular matrix. So-called \(QR\) decompositions are useful for solving linear systems, eigenvalue problems, and least squares approximations. You can easily get the idea behind the \(QR\) decomposition by working through a simple example.
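To see one of these uses in action, here is a minimal numpy sketch that solves a linear system via a \(QR\) decomposition. (The right-hand side \(b\) is illustrative, not from the text, and numpy's routine uses Householder reflections rather than the Gram–Schmidt procedure described below, though the resulting factorization serves the same purpose.)

```python
import numpy as np

# The matrix from Example 125 below, and an illustrative right-hand side b.
M = np.array([[2., -1.,  1.],
              [1.,  3., -2.],
              [0.,  1., -2.]])
b = np.array([1., 2., 3.])

Q, R = np.linalg.qr(M)

# To solve Mx = b: since Q is orthogonal, Q^T Q = I, so QRx = b becomes
# Rx = Q^T b, and R is upper triangular, so back-substitution finishes it.
x = np.linalg.solve(R, Q.T @ b)

print(np.allclose(M @ x, b))  # True
```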


Example 125

Find the \(QR\) decomposition of

\[M=\begin{pmatrix}2&-1&1\\1&3&-2\\0&1&-2\end{pmatrix}\, .\]

What we will do is to think of the columns of \(M\) as three 3-vectors and use Gram--Schmidt to build an orthonormal basis from these that will become the columns of the orthogonal matrix \(Q\).  We will use the matrix \(R\) to record the steps of the Gram--Schmidt procedure in such a way that the product \(QR\) equals \(M\). 
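The bookkeeping we are about to do by hand can be sketched in exact arithmetic with Python's fractions module. This is classical Gram–Schmidt without the final normalization step, so the upper triangular record it returns is the intermediate matrix with ones on the diagonal that appears in the steps below, not yet \(R\) itself:

```python
from fractions import Fraction

def gram_schmidt(cols):
    """Classical Gram-Schmidt on a list of column vectors, in exact arithmetic.

    Returns the orthogonal (not yet normalized) vectors v_i, and an upper
    triangular record U with ones on the diagonal such that
    [v_1 ... v_n] U reproduces the original columns.
    """
    n = len(cols)
    vs, U = [], [[Fraction(0)] * n for _ in range(n)]
    for j, m in enumerate(cols):
        v = list(m)
        for i, u in enumerate(vs):
            c = sum(a * b for a, b in zip(m, u)) / sum(a * a for a in u)
            U[i][j] = c                       # record the projection coefficient
            v = [a - c * b for a, b in zip(v, u)]  # subtract the projection
        U[j][j] = Fraction(1)
        vs.append(v)
    return vs, U

# Columns of M from Example 125.
cols = [[Fraction(2), Fraction(1), Fraction(0)],
        [Fraction(-1), Fraction(3), Fraction(1)],
        [Fraction(1), Fraction(-2), Fraction(-2)]]
vs, U = gram_schmidt(cols)
print([str(x) for x in vs[1]])  # ['-7/5', '14/5', '1']
print([str(x) for x in vs[2]])  # ['-1/6', '1/3', '-7/6']
print(U[0][1], U[1][2])         # 1/5 -5/6
```

These are exactly the vectors and coefficients that appear in the hand computation that follows.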

To begin with we write

\[M=\begin{pmatrix}2&-\frac{7}{5}&1\\1&\frac{14}{5}&-2\\0&1&-2\end{pmatrix}\begin{pmatrix}1&\frac{1}{5}&0\\0&1&0\\0&0&1\end{pmatrix}\, .\]

In the first matrix the first two columns are orthogonal because we simply replaced the second column of \(M\) by the vector that the Gram--Schmidt procedure produces from the first two columns of \(M\), namely

\[\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}=\begin{pmatrix}-1\\3\\1\end{pmatrix}-\frac{1}{5}\begin{pmatrix}2\\1\\0\end{pmatrix}\, .\]

The matrix on the right is almost the identity matrix, save the \(+\frac{1}{5}\) in the second entry of the first row, whose effect upon multiplying the two matrices precisely undoes what we did to the second column of the first matrix. For the third column of \(M\) we use Gram--Schmidt to deduce the third orthogonal vector

$$\begin{pmatrix}-\frac{1}{6}\\\frac{1}{3}\\-\frac{7}{6}\end{pmatrix}=\begin{pmatrix}1\\-2\\-2\end{pmatrix}-0\begin{pmatrix}2\\1\\0\end{pmatrix}-\frac{-9}{\frac{54}{5}}\begin{pmatrix}-\frac{7}{5}\\\frac{14}{5}\\1\end{pmatrix}\, $$

and therefore, using exactly the same procedure write

\[M=\begin{pmatrix}2&-\frac{7}{5}&-\frac{1}{6}\\1&\frac{14}{5}&\frac{1}{3}\\0&1&-\frac{7}{6}\end{pmatrix}\begin{pmatrix}1&\frac{1}{5}&0\\0&1&-\frac{5}{6}\\0&0&1\end{pmatrix}\, .\]

This is not quite the answer: the first matrix now has mutually orthogonal columns, but a \(\textit{bona fide}\) orthogonal matrix is composed of \(\textit{orthonormal}\) vectors. To achieve that we divide each column of the first matrix by its length and multiply the corresponding row of the second matrix by the same amount:

\[M=\begin{pmatrix}\frac{2}{\sqrt{5}}&-\frac{7}{\sqrt{270}}&-\frac{1}{\sqrt{54}}\\\frac{1}{\sqrt{5}}&\frac{14}{\sqrt{270}}&\frac{2}{\sqrt{54}}\\0&\frac{5}{\sqrt{270}}&-\frac{7}{\sqrt{54}}\end{pmatrix}\begin{pmatrix}\sqrt{5}&\frac{1}{\sqrt{5}}&0\\0&\sqrt{\frac{54}{5}}&-\frac{\sqrt{30}}{2}\\0&0&\frac{\sqrt{6}}{2}\end{pmatrix}=QR\, .\]
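As a numerical sanity check (a numpy sketch, not part of the original text), we can type in \(Q\) and \(R\) and confirm that \(Q\) has orthonormal columns and that \(QR=M\):

```python
import numpy as np

s5, s270, s54, s30, s6 = map(np.sqrt, (5., 270., 54., 30., 6.))

# Q and R entered entry by entry from the decomposition above.
Q = np.array([[2/s5, -7/s270, -1/s54],
              [1/s5, 14/s270,  2/s54],
              [0.,    5/s270, -7/s54]])
R = np.array([[s5, 1/s5,            0.],
              [0., np.sqrt(54/5), -s30/2],
              [0., 0.,              s6/2]])
M = np.array([[2., -1.,  1.],
              [1.,  3., -2.],
              [0.,  1., -2.]])

print(np.allclose(Q.T @ Q, np.eye(3)))  # True: the columns are orthonormal
print(np.allclose(Q @ R, M))            # True: the decomposition checks out
```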

A nice check of this result is to verify that entry \((i,j)\) of the matrix \(R\) equals the dot product of the \(i\)-th column of \(Q\) with the \(j\)-th column of \(M\). (Some people memorize this fact and use it as a recipe for computing \(QR\) decompositions.) \(\textit{A good test of your own understanding is to work out why this is true!}\)
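This fact is easy to confirm numerically. The sketch below uses numpy's built-in \(QR\) routine, whose sign conventions may differ from the hand computation above, but the identity \(R = Q^{T}M\) holds for any such factorization, since \(M = QR\) and \(Q^{T}Q = I\):

```python
import numpy as np

M = np.array([[2., -1.,  1.],
              [1.,  3., -2.],
              [0.,  1., -2.]])
Q, R = np.linalg.qr(M)

# Entry (i, j) of R is the dot product of the i-th column of Q with the
# j-th column of M; in matrix form, R = Q^T M.
print(np.allclose(Q.T @ M, R))  # True
```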