
5.6: Isomorphisms


Outcomes

  1. Determine if a linear transformation is an isomorphism.
  2. Determine if two subspaces of $\mathbb{R}^n$ are isomorphic.

Recall the definition of a linear transformation. Let $V$ and $W$ be two subspaces of $\mathbb{R}^n$ and $\mathbb{R}^m$ respectively. A mapping $T: V \mapsto W$ is called a linear transformation or linear map if it preserves the algebraic operations of addition and scalar multiplication. Specifically, if $a, b$ are scalars and $\vec{x}, \vec{y}$ are vectors,

\[ T(a\vec{x} + b\vec{y}) = aT(\vec{x}) + bT(\vec{y}) \]

Consider the following important definition.

Definition 5.6.1: Isomorphism

A linear map T is called an isomorphism if the following two conditions are satisfied.

  • $T$ is one to one. That is, if $T(\vec{x}) = T(\vec{y})$, then $\vec{x} = \vec{y}$.
  • $T$ is onto. That is, if $\vec{w} \in W$, there exists $\vec{v} \in V$ such that $T(\vec{v}) = \vec{w}$.

Two such subspaces which have an isomorphism as described above are said to be isomorphic.

Consider the following example of an isomorphism.

Example 5.6.1: Isomorphism

Let $T: \mathbb{R}^2 \mapsto \mathbb{R}^2$ be defined by
\[ T\left[ \begin{array}{c} x \\ y \end{array} \right] = \left[ \begin{array}{c} x + y \\ x - y \end{array} \right] \]
Show that $T$ is an isomorphism.

Solution

To prove that T is an isomorphism we must show

  1. T is a linear transformation;
  2. T is one to one;
  3. T is onto.

We proceed as follows.

  1. T is a linear transformation:

    Let $k, p$ be scalars. Then
    \[ \begin{aligned}
    T\left( k\left[ \begin{array}{c} x_1 \\ y_1 \end{array} \right] + p\left[ \begin{array}{c} x_2 \\ y_2 \end{array} \right] \right)
    &= T\left( \left[ \begin{array}{c} kx_1 \\ ky_1 \end{array} \right] + \left[ \begin{array}{c} px_2 \\ py_2 \end{array} \right] \right)
    = T\left( \left[ \begin{array}{c} kx_1 + px_2 \\ ky_1 + py_2 \end{array} \right] \right) \\
    &= \left[ \begin{array}{c} (kx_1 + px_2) + (ky_1 + py_2) \\ (kx_1 + px_2) - (ky_1 + py_2) \end{array} \right]
    = \left[ \begin{array}{c} (kx_1 + ky_1) + (px_2 + py_2) \\ (kx_1 - ky_1) + (px_2 - py_2) \end{array} \right] \\
    &= \left[ \begin{array}{c} kx_1 + ky_1 \\ kx_1 - ky_1 \end{array} \right] + \left[ \begin{array}{c} px_2 + py_2 \\ px_2 - py_2 \end{array} \right]
    = k\left[ \begin{array}{c} x_1 + y_1 \\ x_1 - y_1 \end{array} \right] + p\left[ \begin{array}{c} x_2 + y_2 \\ x_2 - y_2 \end{array} \right] \\
    &= kT\left( \left[ \begin{array}{c} x_1 \\ y_1 \end{array} \right] \right) + pT\left( \left[ \begin{array}{c} x_2 \\ y_2 \end{array} \right] \right)
    \end{aligned} \]

    Therefore T is linear.

  2. T is one to one:

    We need to show that if $T(\vec{x}) = \vec{0}$ for a vector $\vec{x} \in \mathbb{R}^2$, then it follows that $\vec{x} = \vec{0}$. (Recall that for a linear transformation this condition is equivalent to being one to one.) Let $\vec{x} = \left[ \begin{array}{c} x \\ y \end{array} \right]$. Then
    \[ T\left( \left[ \begin{array}{c} x \\ y \end{array} \right] \right) = \left[ \begin{array}{c} x + y \\ x - y \end{array} \right] = \left[ \begin{array}{c} 0 \\ 0 \end{array} \right] \]
    This provides a system of equations given by
    \[ \begin{aligned} x + y &= 0 \\ x - y &= 0 \end{aligned} \]
    You can verify that the solution to this system is $x = y = 0$. Therefore $\vec{x} = \left[ \begin{array}{c} x \\ y \end{array} \right] = \left[ \begin{array}{c} 0 \\ 0 \end{array} \right]$ and $T$ is one to one.

  3. T is onto:

    Let $a, b$ be scalars. We want to check if there is always a solution to
    \[ T\left( \left[ \begin{array}{c} x \\ y \end{array} \right] \right) = \left[ \begin{array}{c} x + y \\ x - y \end{array} \right] = \left[ \begin{array}{c} a \\ b \end{array} \right] \]

    This can be represented as the system of equations
    \[ \begin{aligned} x + y &= a \\ x - y &= b \end{aligned} \]

    Setting up the augmented matrix and row reducing gives
    \[ \left[ \begin{array}{rr|r} 1 & 1 & a \\ 1 & -1 & b \end{array} \right] \rightarrow \cdots \rightarrow \left[ \begin{array}{rr|r} 1 & 0 & \frac{a+b}{2} \\ 0 & 1 & \frac{a-b}{2} \end{array} \right] \]
    This has a solution for all $a, b$ and therefore $T$ is onto.

Therefore T is an isomorphism.
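As a quick numerical check of this example, note that $T$ is the matrix map $\vec{x} \mapsto A\vec{x}$ with $A = \left[ \begin{array}{rr} 1 & 1 \\ 1 & -1 \end{array} \right]$. The following NumPy sketch (the target vector $(3, 7)$ is an arbitrary illustrative choice, not from the text) confirms that the system above always has a unique solution:

    # Sanity check of Example 5.6.1: T(x, y) = (x + y, x - y) has matrix A.
    import numpy as np

    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]])

    # One to one: the only solution of A x = 0 is x = 0, i.e. A is invertible.
    print(np.linalg.det(A))            # -2.0 (up to rounding): nonzero, so A is invertible

    # Onto: for any target (a, b) we can solve A x = (a, b).
    a, b = 3.0, 7.0                    # arbitrary target chosen for illustration
    x = np.linalg.solve(A, np.array([a, b]))
    print(x)                           # [(a + b)/2, (a - b)/2] = [5., -2.]
    print(A @ x)                       # recovers [3., 7.]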


An important property of isomorphisms is that the inverse of an isomorphism is also an isomorphism.

Proposition 5.6.1: Inverse of an Isomorphism

Let $T: V \mapsto W$ be an isomorphism, where $V, W$ are subspaces of $\mathbb{R}^n$. Then $T^{-1}: W \mapsto V$ is also an isomorphism.

Proof

Let $T$ be an isomorphism. Since $T$ is onto, a typical vector in $W$ is of the form $T(\vec{v})$ where $\vec{v} \in V$. Consider then, for $a, b$ scalars, $T^{-1}\left( aT(\vec{v}_1) + bT(\vec{v}_2) \right)$ where $\vec{v}_1, \vec{v}_2 \in V$. Is this equal to $aT^{-1}\left( T(\vec{v}_1) \right) + bT^{-1}\left( T(\vec{v}_2) \right) = a\vec{v}_1 + b\vec{v}_2$? Since $T$ is one to one, this will be so if $T\left( a\vec{v}_1 + b\vec{v}_2 \right) = T\left( T^{-1}\left( aT(\vec{v}_1) + bT(\vec{v}_2) \right) \right) = aT(\vec{v}_1) + bT(\vec{v}_2)$. However, this last statement is just the condition that $T$ is a linear map. Thus $T^{-1}$ is indeed a linear map. If $\vec{v} \in V$ is given, then $\vec{v} = T^{-1}\left( T(\vec{v}) \right)$ and so $T^{-1}$ is onto. If $T^{-1}(\vec{w}) = \vec{0}$ for some $\vec{w} \in W$, then $\vec{w} = T\left( T^{-1}(\vec{w}) \right) = T(\vec{0}) = \vec{0}$ and so $T^{-1}$ is one to one.
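A small numerical illustration of this proposition, assuming we take the isomorphism of Example 5.6.1 with matrix $A$ (so that $T^{-1}$ is given by $A^{-1}$), might look as follows; the test vectors and scalars are arbitrary illustrative choices:

    # Illustration of Proposition 5.6.1 using the matrix of Example 5.6.1.
    import numpy as np

    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]])
    A_inv = np.linalg.inv(A)           # matrix of T^{-1}

    v = np.array([2.0, -3.0])
    print(A_inv @ (A @ v))             # T^{-1}(T(v)) = v
    print(A @ (A_inv @ v))             # T(T^{-1}(v)) = v

    # Linearity of T^{-1}: T^{-1}(a w1 + b w2) = a T^{-1}(w1) + b T^{-1}(w2)
    w1, w2 = np.array([1.0, 4.0]), np.array([0.0, 2.0])
    a, b = 2.0, -5.0
    print(np.allclose(A_inv @ (a*w1 + b*w2), a*(A_inv @ w1) + b*(A_inv @ w2)))  # True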

Another important result is that the composition of multiple isomorphisms is also an isomorphism.

Proposition 5.6.2: Composition of Isomorphisms

Let $T: V \mapsto W$ and $S: W \mapsto Z$ be isomorphisms, where $V, W, Z$ are subspaces of $\mathbb{R}^n$. Then $S \circ T$ defined by $(S \circ T)(\vec{v}) = S(T(\vec{v}))$ is also an isomorphism.

Proof

Suppose $T: V \mapsto W$ and $S: W \mapsto Z$ are isomorphisms. Why is $S \circ T$ a linear map? For $a, b$ scalars,
\[ (S \circ T)\left( a\vec{v}_1 + b\vec{v}_2 \right) = S\left( T\left( a\vec{v}_1 + b\vec{v}_2 \right) \right) = S\left( aT\vec{v}_1 + bT\vec{v}_2 \right) = aS\left( T\vec{v}_1 \right) + bS\left( T\vec{v}_2 \right) = a(S \circ T)(\vec{v}_1) + b(S \circ T)(\vec{v}_2) \]

Hence $S \circ T$ is a linear map. If $(S \circ T)(\vec{v}) = \vec{0}$, then $S(T(\vec{v})) = \vec{0}$; since $S$ is one to one it follows that $T(\vec{v}) = \vec{0}$, and since $T$ is one to one, $\vec{v} = \vec{0}$. Thus $S \circ T$ is one to one. It remains to verify that it is onto. Let $\vec{z} \in Z$. Then since $S$ is onto, there exists $\vec{w} \in W$ such that $S(\vec{w}) = \vec{z}$. Also, since $T$ is onto, there exists $\vec{v} \in V$ such that $T(\vec{v}) = \vec{w}$. It follows that $S(T(\vec{v})) = \vec{z}$ and so $S \circ T$ is also onto.
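When $T$ and $S$ are matrix maps with matrices $A$ and $B$, the composition $S \circ T$ is the matrix map given by the product $BA$. The following sketch (with arbitrary illustrative invertible matrices, not taken from the text) checks the proposition numerically:

    # Illustration of Proposition 5.6.2 for matrix maps on R^2.
    import numpy as np

    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]])       # matrix of T (from Example 5.6.1)
    B = np.array([[2.0, 0.0],
                  [1.0, 3.0]])        # matrix of S (any invertible matrix works)

    BA = B @ A                        # matrix of the composition S o T
    print(np.linalg.det(BA))          # nonzero, so S o T is one to one and onto

    v = np.array([4.0, -1.0])
    print(np.allclose(B @ (A @ v), BA @ v))   # S(T(v)) agrees with (S o T)(v): True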

Consider two subspaces $V$ and $W$, and suppose there exists an isomorphism mapping one to the other. In this way the two subspaces are related, which we can write as $V \sim W$. Then the previous two propositions together claim that $\sim$ is an equivalence relation. That is, $\sim$ satisfies the following conditions:

  • $V \sim V$
  • If $V \sim W$, it follows that $W \sim V$
  • If $V \sim W$ and $W \sim Z$, then $V \sim Z$

We leave the verification of these conditions as an exercise.

Consider the following example.

Example 5.6.2: Matrix Isomorphism

Let $T: \mathbb{R}^n \mapsto \mathbb{R}^n$ be defined by $T(\vec{x}) = A\vec{x}$ where $A$ is an invertible $n \times n$ matrix. Then $T$ is an isomorphism.

Solution

The reason for this is that, since $A$ is invertible, the only vector it sends to $\vec{0}$ is the zero vector. Hence if $A\vec{x} = A\vec{y}$, then $A\left( \vec{x} - \vec{y} \right) = \vec{0}$ and so $\vec{x} = \vec{y}$. It is onto because if $\vec{y} \in \mathbb{R}^n$, then
\[ A\left( A^{-1}\vec{y} \right) = \left( AA^{-1} \right)\vec{y} = \vec{y} \]

In fact, all isomorphisms from $\mathbb{R}^n$ to $\mathbb{R}^n$ can be expressed as $T(\vec{x}) = A\vec{x}$ where $A$ is an invertible $n \times n$ matrix. One simply considers the matrix whose $i^{th}$ column is $T\vec{e}_i$.
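The following sketch illustrates this remark: it builds the matrix of a map $T$ on $\mathbb{R}^3$ by evaluating $T$ on the standard basis, then checks that the matrix reproduces $T$ and is invertible. The map used below is a hypothetical example chosen only for illustration:

    # Recover the matrix of T: R^3 -> R^3 from its values on the standard basis.
    import numpy as np

    def T(x):
        # hypothetical isomorphism of R^3 used only for illustration
        x1, x2, x3 = x
        return np.array([x1 + x3, 2*x1 + x2, x2 - x3])

    n = 3
    A = np.column_stack([T(e) for e in np.eye(n)])   # i-th column is T(e_i)
    print(A)
    print(np.linalg.det(A))                          # nonzero, so T is an isomorphism

    x = np.array([1.0, -2.0, 5.0])
    print(np.allclose(A @ x, T(x)))                  # A x reproduces T(x): True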

Recall that a basis of a subspace V is a set of linearly independent vectors which span V. The following fundamental lemma describes the relation between bases and isomorphisms.

Lemma 5.6.1: Mapping Bases

Let $T: V \mapsto W$ be a linear transformation where $V, W$ are subspaces of $\mathbb{R}^n$. If $T$ is one to one, then it has the property that if $\{ \vec{u}_1, \ldots, \vec{u}_k \}$ is linearly independent, so is $\{ T(\vec{u}_1), \ldots, T(\vec{u}_k) \}$.

More generally, $T$ is an isomorphism if and only if whenever $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis for $V$, it follows that $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ is a basis for $W$.

Proof

First suppose that $T$ is a linear transformation and is one to one and $\{ \vec{u}_1, \ldots, \vec{u}_k \}$ is linearly independent. It is required to show that $\{ T(\vec{u}_1), \ldots, T(\vec{u}_k) \}$ is also linearly independent. Suppose then that
\[ \sum_{i=1}^{k} c_i T(\vec{u}_i) = \vec{0} \]
Then, since $T$ is linear,
\[ T\left( \sum_{i=1}^{k} c_i \vec{u}_i \right) = \vec{0} \]
Since $T$ is one to one, it follows that
\[ \sum_{i=1}^{k} c_i \vec{u}_i = \vec{0} \]
Now the fact that $\{ \vec{u}_1, \ldots, \vec{u}_k \}$ is linearly independent implies that each $c_i = 0$. Hence $\{ T(\vec{u}_1), \ldots, T(\vec{u}_k) \}$ is linearly independent.

Now suppose that $T$ is an isomorphism and $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis for $V$. It was just shown that $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ is linearly independent. It remains to verify that $\mathrm{span}\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \} = W$. If $\vec{w} \in W$, then since $T$ is onto there exists $\vec{v} \in V$ such that $T(\vec{v}) = \vec{w}$. Since $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis, there exist scalars $\{ c_i \}_{i=1}^{n}$ such that $\sum_{i=1}^{n} c_i \vec{v}_i = \vec{v}$. Hence,
\[ \vec{w} = T(\vec{v}) = T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) = \sum_{i=1}^{n} c_i T(\vec{v}_i) \]
It follows that $\mathrm{span}\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \} = W$, showing that this set of vectors is a basis for $W$.

Next suppose that $T$ is a linear transformation which takes a basis to a basis. This means that if $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis for $V$, it follows that $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ is a basis for $W$. Then if $\vec{w} \in W$, there exist scalars $c_i$ such that
\[ \vec{w} = \sum_{i=1}^{n} c_i T(\vec{v}_i) = T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) \]
showing that $T$ is onto. If
\[ T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) = \vec{0} \]
then
\[ \sum_{i=1}^{n} c_i T(\vec{v}_i) = \vec{0} \]
and since the vectors $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ are linearly independent, it follows that each $c_i = 0$. Since $\sum_{i=1}^{n} c_i \vec{v}_i$ is a typical vector in $V$, this has shown that if $T(\vec{v}) = \vec{0}$ then $\vec{v} = \vec{0}$, and so $T$ is also one to one. Thus $T$ is an isomorphism.
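The first part of the lemma can be illustrated numerically: a one to one matrix map sends a linearly independent set to a set of the same rank. The matrix and vectors below are hypothetical choices used only for illustration:

    # Numerical check: a one-to-one matrix map preserves linear independence.
    import numpy as np

    A = np.array([[1.0, 0.0],
                  [2.0, 1.0],
                  [0.0, 1.0]])        # one to one: its columns are independent (rank 2)

    u1 = np.array([1.0,  1.0])
    u2 = np.array([1.0, -1.0])        # {u1, u2} is linearly independent
    U = np.column_stack([u1, u2])

    print(np.linalg.matrix_rank(U))       # 2: the u_i are independent
    print(np.linalg.matrix_rank(A @ U))   # 2: so are T(u1), T(u2)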

The following theorem illustrates a very useful idea for defining an isomorphism. Basically, if you know what it does to a basis, then you can construct the isomorphism.

Theorem 5.6.1: Isomorphic Subspaces

Suppose $V$ and $W$ are two subspaces of $\mathbb{R}^n$. Then the two subspaces are isomorphic if and only if they have the same dimension. In the case that the two subspaces have the same dimension, then for a linear map $T: V \mapsto W$, the following are equivalent.

  1. T is one to one.
  2. T is onto.
  3. T is an isomorphism.
Proof

Suppose first that these two subspaces have the same dimension. Let a basis for $V$ be $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ and let a basis for $W$ be $\{ \vec{w}_1, \ldots, \vec{w}_n \}$. Now define $T$ as follows. $T(\vec{v}_i) = \vec{w}_i$, and for $\sum_{i=1}^{n} c_i \vec{v}_i$ an arbitrary vector of $V$,
\[ T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) = \sum_{i=1}^{n} c_i T(\vec{v}_i) = \sum_{i=1}^{n} c_i \vec{w}_i \]
It is necessary to verify that this is well defined. Suppose then that
\[ \sum_{i=1}^{n} c_i \vec{v}_i = \sum_{i=1}^{n} \hat{c}_i \vec{v}_i \]
Then
\[ \sum_{i=1}^{n} \left( c_i - \hat{c}_i \right) \vec{v}_i = \vec{0} \]
and since $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis, $c_i = \hat{c}_i$ for each $i$. Hence
\[ \sum_{i=1}^{n} c_i \vec{w}_i = \sum_{i=1}^{n} \hat{c}_i \vec{w}_i \]
and so the mapping is well defined. Also if $a, b$ are scalars,
\[ \begin{aligned}
T\left( a\sum_{i=1}^{n} c_i \vec{v}_i + b\sum_{i=1}^{n} \hat{c}_i \vec{v}_i \right) &= T\left( \sum_{i=1}^{n} \left( ac_i + b\hat{c}_i \right) \vec{v}_i \right) = \sum_{i=1}^{n} \left( ac_i + b\hat{c}_i \right) \vec{w}_i \\
&= a\sum_{i=1}^{n} c_i \vec{w}_i + b\sum_{i=1}^{n} \hat{c}_i \vec{w}_i = aT\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) + bT\left( \sum_{i=1}^{n} \hat{c}_i \vec{v}_i \right)
\end{aligned} \]
Thus $T$ is a linear transformation.

Now if
\[ T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) = \sum_{i=1}^{n} c_i \vec{w}_i = \vec{0} \]
then, since the $\{ \vec{w}_1, \ldots, \vec{w}_n \}$ are independent, each $c_i = 0$ and so $\sum_{i=1}^{n} c_i \vec{v}_i = \vec{0}$ also. Hence $T$ is one to one. If $\sum_{i=1}^{n} c_i \vec{w}_i$ is a vector in $W$, then it equals
\[ \sum_{i=1}^{n} c_i T(\vec{v}_i) = T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) \]
showing that $T$ is also onto. Hence $T$ is an isomorphism and so $V$ and $W$ are isomorphic.

Next suppose $T: V \mapsto W$ is an isomorphism, so these two subspaces are isomorphic. Then for $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ a basis for $V$, it follows that a basis for $W$ is $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$, showing that the two subspaces have the same dimension.

Now suppose the two subspaces have the same dimension. Consider the three claimed equivalences.

First consider the claim that 1.) $\Rightarrow$ 2.). If $T$ is one to one and if $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis for $V$, then $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ is linearly independent. If it is not a basis, then it must fail to span $W$. But then there would exist $\vec{w} \notin \mathrm{span}\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$, and it follows that $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n), \vec{w} \}$ would be linearly independent, which is impossible because there exists a basis for $W$ of $n$ vectors.

Hence $\mathrm{span}\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \} = W$ and so $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ is a basis. If $\vec{w} \in W$, there exist scalars $c_i$ such that
\[ \vec{w} = \sum_{i=1}^{n} c_i T(\vec{v}_i) = T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) \]
showing that $T$ is onto. This shows that 1.) $\Rightarrow$ 2.).

Next consider the claim that 2.) $\Rightarrow$ 3.). Since 2.) holds, it follows that $T$ is onto. It remains to verify that $T$ is one to one. Since $T$ is onto, there exists a basis of the form $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$. Then it follows that $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is linearly independent. Suppose
\[ \sum_{i=1}^{n} c_i \vec{v}_i = \vec{0} \]
Then
\[ \sum_{i=1}^{n} c_i T(\vec{v}_i) = \vec{0} \]
Hence each $c_i = 0$ and so $\{ \vec{v}_1, \ldots, \vec{v}_n \}$ is a basis for $V$. Now it follows that a typical vector in $V$ is of the form $\sum_{i=1}^{n} c_i \vec{v}_i$. If
\[ T\left( \sum_{i=1}^{n} c_i \vec{v}_i \right) = \vec{0} \]
it follows that
\[ \sum_{i=1}^{n} c_i T(\vec{v}_i) = \vec{0} \]
and so, since $\{ T(\vec{v}_1), \ldots, T(\vec{v}_n) \}$ is independent, each $c_i = 0$ and hence $\sum_{i=1}^{n} c_i \vec{v}_i = \vec{0}$. Thus $T$ is one to one as well as onto and so it is an isomorphism.

If T is an isomorphism, it is both one to one and onto by definition so 3.) implies both 1.) and 2.).

Note the interesting way of defining a linear transformation in the first part of the argument by describing what it does to a basis and then “extending it linearly” to the entire subspace.

Example 5.6.4: Isomorphic Subspaces

Let $V = \mathbb{R}^3$ and let $W$ denote
\[ \mathrm{span}\left\{ \left[ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right], \left[ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right], \left[ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right] \right\} \]
Show that $V$ and $W$ are isomorphic.

Solution

First observe that these subspaces are both of dimension 3 and so they are isomorphic by Theorem 5.6.1. The three vectors which span W are easily seen to be linearly independent by making them the columns of a matrix and row reducing to the reduced row-echelon form.

You can exhibit an isomorphism of these two spaces as follows. Define $T$ by
\[ T(\vec{e}_1) = \left[ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right],\; T(\vec{e}_2) = \left[ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right],\; T(\vec{e}_3) = \left[ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right] \]
and extend linearly. Recall that the matrix of this linear transformation is just the matrix having these vectors as columns. Thus the matrix of this isomorphism is
\[ \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 2 & 1 & 1 \\ 1 & 0 & 2 \\ 1 & 1 & 0 \end{array} \right] \]
You should check that multiplication on the left by this matrix does reproduce the claimed effect resulting from an application of $T$.
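A short NumPy check confirms that this matrix sends $\vec{e}_1, \vec{e}_2, \vec{e}_3$ to the three spanning vectors of $W$ and that it is one to one (its rank is 3):

    # Check of Example 5.6.4: the matrix of T maps the standard basis onto
    # the spanning vectors of W, and its columns are linearly independent.
    import numpy as np

    A = np.array([[1, 0, 1],
                  [2, 1, 1],
                  [1, 0, 2],
                  [1, 1, 0]], dtype=float)   # matrix of the isomorphism T

    e1, e2, e3 = np.eye(3)
    print(A @ e1, A @ e2, A @ e3)             # the three spanning vectors of W
    print(np.linalg.matrix_rank(A))           # 3, so the spanning set is independent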

Consider the following example.

Example 5.6.5: Finding the Matrix of an Isomorphism

Let $V = \mathbb{R}^3$ and let $W$ denote
\[ \mathrm{span}\left\{ \left[ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right], \left[ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right], \left[ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right] \right\} \]
Let $T: V \mapsto W$ be defined as follows.
\[ T\left[ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right] = \left[ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right],\; T\left[ \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \right] = \left[ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right],\; T\left[ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right] = \left[ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right] \]
Find the matrix of this isomorphism $T$.

Solution

First note that the vectors
\[ \left[ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right],\; \left[ \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \right],\; \left[ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right] \]
are indeed a basis for $\mathbb{R}^3$, as can be seen by making them the columns of a matrix and using the reduced row-echelon form.

Now recall the matrix of $T$ is a $4 \times 3$ matrix $A$ which gives the same effect as $T$. Thus, from the way we multiply matrices,
\[ A\left[ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right] = \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 2 & 1 & 1 \\ 1 & 0 & 2 \\ 1 & 1 & 0 \end{array} \right] \]
Hence,
\[ A = \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 2 & 1 & 1 \\ 1 & 0 & 2 \\ 1 & 1 & 0 \end{array} \right] \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right]^{-1} = \left[ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 2 & -1 \\ 2 & -1 & 1 \\ -1 & 2 & -1 \end{array} \right] \]

Note how the span of the columns of this new matrix must be the same as the span of the vectors defining W.
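The computation of $A$ above can be reproduced numerically as a sketch of the same calculation:

    # Reproducing the computation in Example 5.6.5.
    import numpy as np

    M = np.array([[1, 0, 1],
                  [2, 1, 1],
                  [1, 0, 2],
                  [1, 1, 0]], dtype=float)   # the images T(v_i), as columns
    B = np.array([[1, 0, 1],
                  [1, 1, 1],
                  [0, 1, 1]], dtype=float)   # the basis vectors v_i, as columns

    A = M @ np.linalg.inv(B)                 # matrix of T, since A B = M
    print(np.round(A))
    # expected output (up to rounding):
    # [[ 1.  0.  0.]
    #  [ 0.  2. -1.]
    #  [ 2. -1.  1.]
    #  [-1.  2. -1.]]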

This idea of defining a linear transformation by what it does on a basis works for linear maps which are not necessarily isomorphisms.

Example 5.6.6: Finding the Matrix of a Linear Transformation

Let $V = \mathbb{R}^{3}$ and let $W$ denote
\[ \mathrm{span}\left\{ \left[ \begin{array}{c} 1 \\ 0 \\ 1 \\ 1 \end{array} \right], \left[ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right], \left[ \begin{array}{c} 1 \\ 1 \\ 1 \\ 2 \end{array} \right] \right\} \]
Let $T: V \mapsto W$ be defined as follows.
\[ T\left[ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right] = \left[ \begin{array}{c} 1 \\ 0 \\ 1 \\ 1 \end{array} \right],\; T\left[ \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \right] = \left[ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right],\; T\left[ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right] = \left[ \begin{array}{c} 1 \\ 1 \\ 1 \\ 2 \end{array} \right] \]
Find the matrix of this linear transformation.

Solution

Note that in this case, the three vectors which span $W$ are not linearly independent. Nevertheless the above procedure will still work. The reasoning is the same as before. If $A$ is this matrix, then
\[ A\left[ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right] = \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 2 \end{array} \right] \]
and so
\[ A = \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 2 \end{array} \right] \left[ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right]^{-1} = \left[ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \\ 1 & 0 & 1 \end{array} \right] \]

The columns of this last matrix are obviously not linearly independent.
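As in the previous example, the computation can be checked numerically; the rank computation at the end confirms the remark about the columns:

    # Reproducing Example 5.6.6 and confirming that the columns of A are dependent.
    import numpy as np

    M = np.array([[1, 0, 1],
                  [0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 2]], dtype=float)   # images of the basis vectors, as columns
    B = np.array([[1, 0, 1],
                  [1, 1, 1],
                  [0, 1, 1]], dtype=float)   # the basis of R^3, as columns

    A = M @ np.linalg.inv(B)
    print(np.round(A))
    # expected output (up to rounding):
    # [[1. 0. 0.]
    #  [0. 0. 1.]
    #  [1. 0. 0.]
    #  [1. 0. 1.]]
    print(np.linalg.matrix_rank(A))          # 2: the columns are not independent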


 


This page titled 5.6: Isomorphisms is shared under a CC BY 4.0 license and was authored, remixed, and/or curated by Ken Kuttler (Lyryx).
