7.4: Singular Value Decompositions
The Spectral Theorem has animated the past few sections. In particular, we applied the fact that symmetric matrices can be orthogonally diagonalized to simplify quadratic forms, which enabled us to use principal component analysis to reduce the dimension of a dataset.
But what can we do with matrices that are not symmetric or even square? For instance, the following matrices are not diagonalizable, much less orthogonally so:
In this section, we will develop a description of matrices called the singular value decomposition that is, in many ways, analogous to an orthogonal diagonalization. For example, we have seen that any symmetric matrix can be written in the form \(A = QDQ^T\), where \(Q\) is an orthogonal matrix and \(D\) is diagonal.
Preview Activity 7.4.1.
Let's review orthogonal diagonalizations and quadratic forms as our understanding of singular value decompositions will rely on them.
- Suppose that \(A\)
is any matrix. Explain why the matrix \(G = A^TA\) is symmetric. - Suppose that
Find the matrix and write out the quadratic form as a function of and - What is the maximum value of
and in which direction does it occur? - What is the minimum value of
and in which direction does it occur? - What is the geometric relationship between the directions in which the maximum and minimum values occur?
Finding singular value decompositions
We will begin by explaining what a singular value decomposition is and how we can find one for a given matrix
Recall how the orthogonal diagonalization of a symmetric matrix is formed: if \(A\) is symmetric, we write \(A = QDQ^T\), where the columns of \(Q\) are orthonormal eigenvectors of \(A\) and the diagonal entries of \(D\) are the associated eigenvalues.
A general matrix, particularly a matrix that is not square, may not have eigenvalues and eigenvectors, but we can discover analogous features, called singular values and singular vectors, by studying a function somewhat similar to a quadratic form. More specifically, any matrix \(A\) defines the function \(\|A\mathbf x\|\),
which measures the length of \(A\mathbf x\).
While \(\|A\mathbf x\|\) is not itself a quadratic form, its square is: \(\|A\mathbf x\|^2 = (A\mathbf x)\cdot(A\mathbf x) = \mathbf x\cdot(A^TA\,\mathbf x)\).
We call \(G = A^TA\) the Gram matrix associated to \(A\).
This is important in the next activity, which introduces singular values and singular vectors.
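Before working through the activity, it may help to see this maximization numerically. The following is a minimal NumPy sketch, with a hypothetical \(2\times 2\) matrix chosen only for illustration: sampling many unit vectors shows that the largest value of \(\|A\mathbf x\|\) agrees with the square root of the largest eigenvalue of the Gram matrix.

```python
import numpy as np

# A hypothetical 2x2 matrix used for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Sample many unit vectors x and record the lengths ||Ax||.
theta = np.linspace(0, 2 * np.pi, 1000)
X = np.vstack([np.cos(theta), np.sin(theta)])   # columns are unit vectors
lengths = np.linalg.norm(A @ X, axis=0)

# The largest value of ||Ax|| should match the square root of the
# largest eigenvalue of the Gram matrix G = A^T A.
G = A.T @ A
print(round(lengths.max(), 4),
      round(np.sqrt(np.linalg.eigvalsh(G).max()), 4))   # 3.0 3.0
```

The sampled maximum is only approximate, but with 1000 sample directions it agrees with the eigenvalue computation to the displayed precision.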
Activity 7.4.2.
The following interactive figure will help us explore singular values and vectors geometrically before we begin a more algebraic approach. This figure is also available at gvsu.edu/s/0YE.
The four sliders at the top of this figure enable us to choose a
Select the matrix
- The first singular value
is the maximum value of \(\|A\mathbf x\|\) over all unit vectors \(\mathbf x\), and an associated right singular vector \(\mathbf v_1\) is a unit vector describing a direction in which this maximum occurs. Use the diagram to find the first singular value
and an associated right singular vector - The second singular value
is the minimum value of \(\|A\mathbf x\|\) over all unit vectors \(\mathbf x\), and an associated right singular vector \(\mathbf v_2\) is a unit vector describing a direction in which this minimum occurs. Use the diagram to find the second singular value
and an associated right singular vector - Here's how we can find the right singular values and vectors without using the diagram. Remember that
where \(G = A^TA\) is the Gram matrix associated to \(A\). Since \(G\) is symmetric, it is orthogonally diagonalizable. Find \(G\) and an orthogonal diagonalization of it. What is the maximum value of the quadratic form \(\mathbf x\cdot(G\mathbf x)\) among all unit vectors, and in which direction does it occur? What is the minimum value, and in which direction does it occur? - Because
the first singular value will be the square root of the maximum value of the quadratic form, and the second singular value the square root of its minimum. Verify that the singular values that you found from the diagram are the square roots of the maximum and minimum values of the quadratic form. - Verify that the right singular vectors
and that you found from the diagram are the directions in which the maximum and minimum values occur. - Finally, we introduce the left singular vectors
and by requiring that and Find the two left singular vectors. - Form the matrices
\(U\), \(\Sigma\), and \(V\), and explain why \(AV = U\Sigma\).
- Finally, explain why \(A = U\Sigma V^T\)
and verify that this relationship holds for this specific example.
As this activity shows, the singular values of
We will find a singular value decomposition of the matrix
We begin by constructing the Gram matrix \(G = A^TA\).
We now know that the maximum value of the quadratic form
In the same way, we also know that the second singular value
The first left singular vector \(\mathbf u_1\) is defined by the relationship \(A\mathbf v_1 = \sigma_1\mathbf u_1\).
In the same way, the second left singular vector is defined by \(A\mathbf v_2 = \sigma_2\mathbf u_2\).
We then construct
We now have
Because the right singular vectors, the columns of \(V\), form an orthonormal basis, \(V\) is an orthogonal matrix, meaning that \(V^{-1} = V^T\) and hence \(A = U\Sigma V^T\).
To summarize, we find a singular value decomposition of a matrix
- Construct the Gram matrix \(G = A^TA\)
and find an orthogonal diagonalization to obtain eigenvalues and an orthonormal basis of eigenvectors. - The singular values of
are the square roots of the eigenvalues of \(G\); that is, \(\sigma_i = \sqrt{\lambda_i}\). For reasons we'll see in the next section, the singular values are listed in decreasing order: \(\sigma_1 \geq \sigma_2 \geq \cdots \geq 0\). The right singular vectors \(\mathbf v_i\) are the associated eigenvectors of \(G\). - The left singular vectors
are found by \(\mathbf u_i = \frac{1}{\sigma_i}A\mathbf v_i\). Because \(\|A\mathbf v_i\| = \sigma_i\), we know that \(\mathbf u_i\) will be a unit vector. In fact, the left singular vectors will also form an orthonormal basis. To see this, suppose that the associated singular values are nonzero. We then have:
\(\mathbf u_i\cdot\mathbf u_j = \frac{1}{\sigma_i\sigma_j}(A\mathbf v_i)\cdot(A\mathbf v_j) = \frac{1}{\sigma_i\sigma_j}\,\mathbf v_i\cdot(G\mathbf v_j) = \frac{\sigma_j}{\sigma_i}\,\mathbf v_i\cdot\mathbf v_j = 0\) when \(i \neq j\), since the right singular vectors are orthogonal.
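The procedure above translates directly into NumPy. This is a minimal sketch with a hypothetical \(2\times 2\) matrix, not one of the text's examples; `numpy.linalg.eigh` carries out the orthogonal diagonalization of the symmetric Gram matrix.

```python
import numpy as np

# A hypothetical 2x2 matrix chosen for illustration.
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

# Step 1: form the symmetric Gram matrix G = A^T A.
G = A.T @ A

# Step 2: orthogonally diagonalize G.  eigh returns eigenvalues in
# increasing order, so reverse to list them in decreasing order.
eigvals, eigvecs = np.linalg.eigh(G)
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], eigvecs[:, order]

# Step 3: the singular values are the square roots of the eigenvalues of G.
sigma = np.sqrt(eigvals)
Sigma = np.diag(sigma)

# Step 4: each left singular vector satisfies u_i = (1/sigma_i) A v_i.
U = A @ V / sigma

# Verify the factorization A = U Sigma V^T.
print(np.allclose(A, U @ Sigma @ V.T))   # True
```

The division `A @ V / sigma` scales the \(i\)th column of \(AV\) by \(1/\sigma_i\), which is exactly the relationship \(\mathbf u_i = \frac{1}{\sigma_i}A\mathbf v_i\); this only works when every singular value is nonzero, a point the next section returns to.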
Let's find a singular value decomposition for the symmetric matrix
which has an orthogonal diagonalization with
This gives singular values and vectors
and the singular value decomposition
This example is special because
Activity 7.4.3.
In this activity, we will construct the singular value decomposition of
- Construct the Gram matrix \(G = A^TA\)
and find an orthogonal diagonalization of it. - Identify the singular values of
and the right singular vectors \(\mathbf v_1\) and \(\mathbf v_2\). What is the dimension of these vectors? How many nonzero singular values are there? - Find the left singular vectors
and \(\mathbf u_2\) using the fact that \(A\mathbf v_i = \sigma_i\mathbf u_i\). What is the dimension of these vectors? What happens if you try to find a third left singular vector in this way? - As before, form the orthogonal matrices
\(U\) and \(V\) from the left and right singular vectors. What are the dimensions of \(U\) and \(V\)? How do these dimensions relate to the number of rows and columns of \(A\)? - Now form \(\Sigma\)
so that it has the same shape as \(A\), and verify that \(A = U\Sigma V^T\).
- How can you use this singular value decomposition of \(A\)
to easily find a singular value decomposition of \(A^T\)?
We will find a singular value decomposition of the matrix
Finding an orthogonal diagonalization of
which gives singular values
We now find
Notice that it's not possible to find a third left singular vector since
which gives the singular value decomposition
Notice that
As we'll see in the next section, some additional work may be needed to construct the left singular vectors
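For a non-square matrix like the one in this example, the shapes of the three factors can be checked numerically. The sketch below uses a hypothetical \(3\times 2\) matrix (not the matrix of the example) and `numpy.linalg.svd`; it also illustrates the observation from the activity that transposing the factorization yields a singular value decomposition of \(A^T\).

```python
import numpy as np

# A hypothetical 3x2 matrix used only for illustration.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# With full_matrices=True, U is 3x3 and Vt is 2x2, while Sigma must be
# built with the same shape as A, singular values on its diagonal.
U, sigma, Vt = np.linalg.svd(A, full_matrices=True)
Sigma = np.zeros(A.shape)
Sigma[:len(sigma), :len(sigma)] = np.diag(sigma)

print(U.shape, Sigma.shape, Vt.shape)          # (3, 3) (3, 2) (2, 2)
print(np.allclose(A, U @ Sigma @ Vt))          # True

# Transposing the factorization gives a singular value decomposition of
# A^T for free: A^T = V Sigma^T U^T.
print(np.allclose(A.T, Vt.T @ Sigma.T @ U.T))  # True
```

Note that NumPy returns \(V^T\) rather than \(V\), and only the diagonal of \(\Sigma\) as a vector, so \(\Sigma\) must be assembled by hand in the same shape as \(A\).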
Theorem 7.4.5. The singular value decomposition.
Any matrix has a singular value decomposition. Notice that a singular value decomposition of
If
As we said earlier, the singular value decomposition should be thought of as a generalization of an orthogonal diagonalization. For instance, the Spectral Theorem tells us that a symmetric matrix can be written as \(A = QDQ^T\).
The structure of singular value decompositions
Now that we have an understanding of what a singular value decomposition is and how to construct it, let's explore the ways in which a singular value decomposition reveals the underlying structure of the matrix. As we'll see, the matrices
Activity 7.4.4.
Let's suppose that a matrix
- What are the dimensions of
that is, how many rows and columns does have? - Suppose we write a three-dimensional vector
as a linear combination of right singular vectors: \(\mathbf x = c_1\mathbf v_1 + c_2\mathbf v_2 + c_3\mathbf v_3\). We would like to find an expression for \(A\mathbf x\).
To begin,
Now
And finally,
To summarize, we have
What condition on
and must be satisfied if is a solution to the equation Is there a unique solution or infinitely many? - Remembering that
and are linearly independent, what condition on and must be satisfied if - How do the right singular vectors
provide a basis for the subspace of solutions to the equation - Remember that
is in if the equation is consistent, which means thatfor some coefficients
and How do the left singular vectors provide an orthonormal basis for - Remember that
is the dimension of the column space. What is \(\operatorname{rank}(A)\), and how does the number of nonzero singular values determine \(\operatorname{rank}(A)\)?
This activity shows how a singular value decomposition of a matrix encodes important information about its null and column spaces. This is, in fact, the key observation that makes singular value decompositions so useful: the left and right singular vectors provide orthonormal bases for the four fundamental subspaces \(\operatorname{Col}(A)\), \(\operatorname{Nul}(A)\), \(\operatorname{Col}(A^T)\), and \(\operatorname{Nul}(A^T)\).
Suppose we have a singular value decomposition
As in the activity, if
If
which says that
Remembering that
Moreover, if
which implies that
More generally, if
Generally speaking, if the rank of an
The first
and the last
In fact, we can say more. Remember that Proposition 7.4.6 says that
For any matrix
If we have a singular value decomposition of an
This reflects the familiar fact that
In the same way,
Considered altogether, the subspaces
Theorem 7.4.9.
Suppose \(A = U\Sigma V^T\) is a singular value decomposition of an \(m\times n\) matrix, where \(r\)
is the number of nonzero singular values.- The columns \(\mathbf u_1, \ldots, \mathbf u_r\)
form an orthonormal basis for \(\operatorname{Col}(A)\). - The columns \(\mathbf u_{r+1}, \ldots, \mathbf u_m\)
form an orthonormal basis for \(\operatorname{Nul}(A^T)\). - The columns \(\mathbf v_1, \ldots, \mathbf v_r\)
form an orthonormal basis for \(\operatorname{Col}(A^T)\). - The columns \(\mathbf v_{r+1}, \ldots, \mathbf v_n\)
form an orthonormal basis for \(\operatorname{Nul}(A)\).
When we previously outlined a procedure for finding a singular value decomposition of an
We won't worry about this issue too much, however, as we will frequently use software to find singular value decompositions for us.
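Software does exactly this for us. As a sketch of how Theorem 7.4.9 looks in practice, the following uses `numpy.linalg.svd` on a hypothetical rank-one matrix (chosen only for illustration) to read off the rank and the orthonormal bases for the fundamental subspaces.

```python
import numpy as np

# A hypothetical rank-one 2x3 matrix: the second row is twice the first.
A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 2.0]])

U, sigma, Vt = np.linalg.svd(A)

# The rank r is the number of nonzero singular values (up to roundoff).
r = int(np.sum(sigma > 1e-10))
print(r)                              # 1

# The first r columns of U span Col(A); the rest span Nul(A^T).
col_A  = U[:, :r]
nul_At = U[:, r:]

# The first r columns of V span Col(A^T); the rest span Nul(A).
V = Vt.T
row_A = V[:, :r]
nul_A = V[:, r:]

# Check: A sends every basis vector of Nul(A) to zero.
print(np.allclose(A @ nul_A, 0))      # True
```

The tolerance `1e-10` is an arbitrary cutoff for this sketch; in floating-point arithmetic the "zero" singular values come out as tiny nonzero numbers, which is precisely the numerical subtlety mentioned above.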
Reduced singular value decompositions
As we'll see in the next section, there are times when it is helpful to express a singular value decomposition in a slightly different form.
Activity 7.4.5.
Suppose we have a singular value decomposition
- What are the dimensions of
What is - Identify bases for
and - Explain why
- Explain why
- If
explain why \(A = U_r\Sigma_r V_r^T\), where the columns of \(U_r\) are an orthonormal basis for \(\operatorname{Col}(A)\), \(\Sigma_r\) is a diagonal, invertible matrix, and the columns of \(V_r\) form an orthonormal basis for \(\operatorname{Col}(A^T)\).
We call this a reduced singular value decomposition.
If \(A = U\Sigma V^T\) is a singular value decomposition of a rank \(r\) matrix \(A\), then \(A = U_r\Sigma_r V_r^T\), where \(U_r\)
is an \(m\times r\) matrix whose columns form an orthonormal basis for \(\operatorname{Col}(A)\), \(\Sigma_r\) is an \(r\times r\) diagonal, invertible matrix, and \(V_r\) is an \(n\times r\) matrix whose columns form an orthonormal basis for \(\operatorname{Col}(A^T)\).
In Example 7.4.4, we found the singular value decomposition
Since there are two nonzero singular values,
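A reduced decomposition like this can be computed directly. In NumPy, the `full_matrices=False` option of `numpy.linalg.svd` produces the smaller factorization; the sketch below uses a hypothetical full-rank \(3\times 2\) matrix, and when the rank is smaller than the number of columns, trailing zero singular values would still need to be trimmed by hand.

```python
import numpy as np

# A hypothetical 3x2 matrix whose rank equals 2, its number of columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# full_matrices=False keeps only min(m, n) columns of U and rows of Vt,
# which is exactly the reduced form when the matrix has full rank.
U, sigma, Vt = np.linalg.svd(A, full_matrices=False)

print(U.shape, sigma.shape, Vt.shape)            # (3, 2) (2,) (2, 2)
print(np.allclose(A, U @ np.diag(sigma) @ Vt))   # True
```

Compared to the full decomposition, the \(3\times 3\) matrix \(U\) has shrunk to \(3\times 2\), and the diagonal factor is square and invertible.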
Summary
This section has explored singular value decompositions, how to find them, and how they organize important information about a matrix.
- A singular value decomposition of a matrix
is a factorization \(A = U\Sigma V^T\). The matrix \(\Sigma\) has the same shape as \(A\), and its only nonzero entries are the singular values of \(A\), which appear in decreasing order on the diagonal. The matrices \(U\) and \(V\) are orthogonal and contain the left and right singular vectors, respectively. - To find a singular value decomposition of a matrix, we construct the Gram matrix
\(G = A^TA\), which is symmetric. The singular values of \(A\) are the square roots of the eigenvalues of \(G\), and the right singular vectors are the associated eigenvectors of \(G\). The left singular vectors are determined from the relationship \(A\mathbf v_i = \sigma_i\mathbf u_i\). - A singular value decomposition organizes fundamental information about a matrix. For instance, the number of nonzero singular values is the rank
of the matrix. The first \(r\) left singular vectors form an orthonormal basis for \(\operatorname{Col}(A)\), with the remaining left singular vectors forming an orthonormal basis of \(\operatorname{Nul}(A^T)\). The first \(r\) right singular vectors form an orthonormal basis for \(\operatorname{Col}(A^T)\), while the remaining right singular vectors form an orthonormal basis of \(\operatorname{Nul}(A)\). - If
is a rank \(r\) matrix, we can write a reduced singular value decomposition as \(A = U_r\Sigma_r V_r^T\), where the columns of \(U_r\) form an orthonormal basis for \(\operatorname{Col}(A)\), the columns of \(V_r\) form an orthonormal basis for \(\operatorname{Col}(A^T)\), and \(\Sigma_r\) is an \(r\times r\) diagonal, invertible matrix.
Exercises
Consider the matrix
- Find the Gram matrix
and use it to find the singular values and right singular vectors of - Find the left singular vectors.
- Form the matrices
and and verify that - What is
and what does this say about - Determine an orthonormal basis for
Find singular value decompositions for the following matrices:
Consider the matrix
- Find a singular value decomposition of
and verify that it is also an orthogonal diagonalization of - If
is a symmetric, positive semidefinite matrix, explain why a singular value decomposition of is an orthogonal diagonalization of
Suppose that the matrix
- What are the dimensions of
- What is
- Find orthonormal bases for
and - Find the orthogonal projection of
onto
Consider the matrix
- Construct the Gram matrix \(G = A^TA\)
and use it to find the singular values and right singular vectors and of What are the matrices and in a singular value decomposition? - What is
- Find as many left singular vectors \(\mathbf u_i\)
as you can using the relationship \(A\mathbf v_i = \sigma_i\mathbf u_i\). - Find an orthonormal basis for
and use it to construct the matrix so that - State an orthonormal basis for
and an orthonormal basis for
Consider the matrix
- Use your result from Exercise 7.4.5.1 to find a singular value decomposition of
- What is
Determine a basis for and - Suppose that
Use the bases you found in the previous part of this exercise to write where is in and is in - Find the least squares approximate solution to the equation
Suppose that
- If
is invertible, find a singular value decomposition of - What condition on the singular values must hold for
to be invertible? - How are the singular values of
and the singular values of related to one another? - How are the right and left singular vectors of
related to the right and left singular vectors of
- If
is an orthogonal matrix, remember that Explain why - If
is a singular value decomposition of a square matrix explain why is the product of the singular values of - What does this say about the singular values of
if is invertible?
If
- For a general matrix
explain why the eigenvalues of \(A^TA\) are nonnegative. - Given a symmetric matrix
having an eigenvalue explain why is an eigenvalue of - If
is symmetric, explain why the singular values of \(A\) equal the absolute values of its eigenvalues: \(\sigma_i = |\lambda_i|\).
Determine whether the following statements are true or false and explain your reasoning.
- If
is a singular value decomposition of then is an orthogonal diagonalization of its Gram matrix. - If
is a singular value decomposition of a rank 2 matrix then and form an orthonormal basis for the column space - If
is a diagonalizable matrix, then its set of singular values is the same as its set of eigenvalues. - If
is a matrix and then the columns of are linearly independent. - The Gram matrix is always orthogonally diagonalizable.
Suppose that
- If you know that the columns of
are linearly independent, what more can you say about the form of - If you know that the columns of
span what more can you say about the form of - If you know that the columns of
are linearly independent and span what more can you say about the form of


