7.1: Linear Transformations and Matrices
Ordered bases for finite-dimensional vector spaces allow us to express linear operators as matrices.
Basis Notation
A basis allows us to efficiently label arbitrary vectors in terms of column vectors. Here is an example.
Example
Let
Given a particular vector and a basis, your job is to write that vector as a sum of multiples of basis elements. Here an arbitrary vector
The coefficients
The column vector
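For a small parallel illustration (the space, basis, and vector here are chosen for the illustration and are not necessarily those of the example above): take the vector space of polynomials of degree at most one with ordered basis \(B=(1,x)\). Writing the vector \(v=3+2x\) as a sum of multiples of the basis elements,
\[
v \;=\; 3\cdot 1 \;+\; 2\cdot x\, ,
\]
the coefficients \(3\) and \(2\) are collected into the column vector
\[
\begin{pmatrix}3\\2\end{pmatrix}_{B},
\]
where the subscript records which ordered basis was used.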
Next, let's consider a tautological example showing how to label column vectors in terms of column vectors:
Example
The vectors
are called the standard basis vectors of
It is natural to assign these the order:
To emphasize that we are using the standard basis, we define the list (or ordered set)
and write
You should read this equation by saying:
"The column vector of the vector
Again, the first notation, a column vector with a subscript \(E\), refers to the vector obtained by multiplying each basis vector by the corresponding scalar listed in the column and then summing these.
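For instance (taking \(\Re^{2}\) for concreteness), the standard basis vectors are \(e_{1}=\begin{pmatrix}1\\0\end{pmatrix}\) and \(e_{2}=\begin{pmatrix}0\\1\end{pmatrix}\). With the ordered set \(E=(e_{1},e_{2})\), multiplying each basis vector by the corresponding scalar in the column and summing gives
\[
\begin{pmatrix}x\\y\end{pmatrix}_{E}
\;=\;
x\begin{pmatrix}1\\0\end{pmatrix}+y\begin{pmatrix}0\\1\end{pmatrix}
\;=\;
\begin{pmatrix}x\\y\end{pmatrix}.
\]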
You should now try to write down the standard basis vectors for
The last example probably seems pedantic because column vectors are already just ordered lists of numbers and the basis notation has simply allowed us to "re-express" these as lists of numbers. Of course, this objection does not apply to more complicated vector spaces like our first matrix example. Moreover, as we saw earlier, there are infinitely many other pairs of vectors in
Example
As functions of
Notice something important: there is no reason to say that
be the ordered basis. Note that for an unordered set we use the curly-bracket notation \(\{\ \}\), whereas an ordered set (a list) is written with round parentheses \((\ )\).
As before, we define
You might think that the numbers
Thus, to contrast, we have
Only in the standard basis
Based on the above example, you might think that our aim would be to find the "standard basis" for any problem. In fact, this is far from the truth. Notice, for example, that the vector
written in the standard basis
which was easy to calculate. But in the basis
which is actually a simpler column vector! The fact that there are many bases for any given vector space allows us to choose a basis in which our computation is easiest. In any case, the standard basis only makes sense for
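As a small illustration of this (with the vector and bases chosen here for the purpose): in \(\Re^{2}\) let \(B=\left(\begin{pmatrix}1\\1\end{pmatrix},\begin{pmatrix}1\\-1\end{pmatrix}\right)\). The vector \(w=\begin{pmatrix}1\\1\end{pmatrix}\) satisfies
\[
w=\begin{pmatrix}1\\1\end{pmatrix}_{E}
\qquad\text{but also}\qquad
w=1\cdot\begin{pmatrix}1\\1\end{pmatrix}+0\cdot\begin{pmatrix}1\\-1\end{pmatrix}
=\begin{pmatrix}1\\0\end{pmatrix}_{B},
\]
so the same vector can have a noticeably simpler column vector in a well-chosen basis.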
Example
Let's again consider the hyperplane
One possible choice of ordered basis is
With this choice
With the other choice of order
We see that the order of basis elements matters.
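A minimal illustration of this dependence on ordering (with coefficients chosen here for the example): if \(v=2b_{1}+3b_{2}\) for basis vectors \(b_{1},b_{2}\), then
\[
v=\begin{pmatrix}2\\3\end{pmatrix}_{(b_{1},b_{2})}
\qquad\text{while}\qquad
v=\begin{pmatrix}3\\2\end{pmatrix}_{(b_{2},b_{1})},
\]
so reordering the basis permutes the entries of the column vector.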
Finding the column vector of a given vector in a given basis usually amounts to a linear systems problem:
Example
Let
be the vector space of trace-free complex-valued matrices (over
These three matrices are the famous Pauli matrices, which appear throughout quantum mechanics.
Let
Find the column vector of
For this we must solve the equation
This gives three equations,
with solution
Hence
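For a worked computation of this type (the particular trace-free matrix below is chosen for illustration and need not be the one in the example), recall the Pauli matrices
\[
\sigma_{x}=\begin{pmatrix}0&1\\1&0\end{pmatrix},\qquad
\sigma_{y}=\begin{pmatrix}0&-i\\i&0\end{pmatrix},\qquad
\sigma_{z}=\begin{pmatrix}1&0\\0&-1\end{pmatrix}.
\]
To find the column vector of \(M=\begin{pmatrix}3&1-i\\1+i&-3\end{pmatrix}\) in the ordered basis \((\sigma_{x},\sigma_{y},\sigma_{z})\), solve
\[
a\,\sigma_{x}+b\,\sigma_{y}+c\,\sigma_{z}
=\begin{pmatrix}c&a-ib\\a+ib&-c\end{pmatrix}
=\begin{pmatrix}3&1-i\\1+i&-3\end{pmatrix},
\]
which forces \(c=3\), \(a=1\), and \(b=1\). Hence the column vector of \(M\) in this basis is \(\begin{pmatrix}1\\1\\3\end{pmatrix}\).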
To summarize, the
is defined by solving the linear systems problem
The numbers
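In general terms (the coefficient names \(c^{1},\ldots,c^{n}\) are chosen here just to state the recipe): given an ordered basis \(B=(b_{1},\ldots,b_{n})\) for \(V\) and a vector \(v\in V\), one solves
\[
v \;=\; c^{1}b_{1}+c^{2}b_{2}+\cdots+c^{n}b_{n}
\]
for the scalars \(c^{1},\ldots,c^{n}\), and these solutions are exactly the entries of the column vector
\[
\begin{pmatrix}c^{1}\\ \vdots \\ c^{n}\end{pmatrix}_{B}.
\]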
From Linear Operators to Matrices
Chapter 6 showed that linear functions are very special kinds of functions; they are fully specified by their values on any basis for their domain. A matrix records how a linear operator maps an element of the basis to a sum of multiples in the target space basis.
More carefully, if \(L\colon V\to W\) is a linear transformation and ordered bases are chosen for \(V\) and \(W\), then expanding the image of each input basis vector in terms of the output basis produces an array of numbers: the matrix of \(L\) with respect to those bases.
Remark
To calculate the matrix of a linear transformation you must compute what the linear transformation does to every input basis vector and then write the answers in terms of the output basis vectors:
Example
Consider
By linearity this specifies the action of
We had trouble expressing this linear operator as a matrix. Let's take input basis
and output basis
Then
or
The matrix on the right is the matrix of
and thus see that
Hence
given input and output bases, the linear operator is now encoded by a matrix.
This is the general rule for this chapter: a linear operator, together with a choice of input and output bases, determines a matrix.
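In symbols (the basis labels and the entries \(M^{i}{}_{j}\) below are named here to state the rule): if \(L\colon V\to W\) is linear, \(B=(b_{1},\ldots,b_{n})\) is an ordered basis for \(V\), and \(B'=(\beta_{1},\ldots,\beta_{m})\) is an ordered basis for \(W\), then the numbers \(M^{i}{}_{j}\) determined by
\[
L(b_{j})\;=\;\sum_{i=1}^{m}\beta_{i}\,M^{i}{}_{j}\,,
\qquad j=1,\ldots,n,
\]
assemble into the \(m\times n\) matrix of \(L\) with respect to these bases; its \(j\)-th column is the column vector of \(L(b_{j})\) in the output basis \(B'\).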
Example
Let's compute a matrix for the derivative operator acting on the vector space of polynomials of degree 2 or less:
In the ordered basis
and
In the ordered basis
Notice this last equation makes no sense without explaining which bases we are using!
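To make the role of the bases explicit (the two ordered bases below are illustrative choices): the derivative sends \(1\mapsto 0\), \(x\mapsto 1\), and \(x^{2}\mapsto 2x\). Taking both input and output basis to be \((1,x,x^{2})\), and writing each image as a column vector in that basis, the columns assemble into
\[
\frac{d}{dx}
\;\leftrightarrow\;
\begin{pmatrix}0&1&0\\0&0&2\\0&0&0\end{pmatrix},
\]
whereas with the reversed ordering \((x^{2},x,1)\) for both input and output the same operator is encoded by
\[
\frac{d}{dx}
\;\leftrightarrow\;
\begin{pmatrix}0&0&0\\2&0&0\\0&1&0\end{pmatrix}.
\]
Neither array means anything until the input and output bases are specified.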
Contributor
David Cherney, Tom Denton, and Andrew Waldron (UC Davis)


