9.7: Isomorphisms
- Apply the concepts of one to one and onto to transformations of vector spaces.
- Determine if a linear transformation of vector spaces is an isomorphism.
- Determine if two vector spaces are isomorphic.
One to One and Onto Transformations
Recall the following definitions, given here in terms of vector spaces.
Let \(V, W\) be vector spaces with \(\vec{v}_1, \vec{v}_2\) vectors in \(V\). Then a linear transformation \(T: V \mapsto W\) is called one to one if whenever \(\vec{v}_1 \neq \vec{v}_2\) it follows that \[T(\vec{v}_1) \neq T (\vec{v}_2)\nonumber \]
Let \(V, W\) be vector spaces. Then a linear transformation \(T: V \mapsto W\) is called onto if for all \(\vec{w} \in W\) there exists \(\vec{v} \in V\) such that \(T(\vec{v}) = \vec{w}\).
Recall that every linear transformation \(T\) has the property that \(T(\vec{0})=\vec{0}\). This will be necessary to prove the following useful lemma.
The assertion that a linear transformation \(T\) is one to one is equivalent to saying that if \(T(\vec{v})=\vec{0},\) then \(\vec{v}=\vec{0}.\)
- Proof
Suppose first that \(T\) is one to one.
\[T(\vec{0})=T\left( \vec{0}+\vec{0}\right) =T(\vec{0})+T(\vec{0})\nonumber \] and so, adding the additive inverse of \(T(\vec{0})\) to both sides, one sees that \(T(\vec{0})=\vec{0}\). Therefore, if \(T(\vec{v})=\vec{0},\) then \(T(\vec{v})=T(\vec{0})\), and since \(T\) is one to one it must be the case that \(\vec{v}=\vec{0}\).
Now suppose that \(T(\vec{v})=\vec{0}\) implies \(\vec{v}=\vec{0}.\) If \(T(\vec{v})=T(\vec{u}),\) then \(T(\vec{v})-T(\vec{u})=T\left( \vec{v}-\vec{u}\right) =\vec{0},\) which shows that \(\vec{v}-\vec{u}=\vec{0}\) or, in other words, \(\vec{v}=\vec{u}\). Hence \(T\) is one to one.
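For a transformation given by a matrix, the lemma is easy to test numerically: the kernel is trivial exactly when the matrix has rank equal to its number of columns. The following is a minimal NumPy sketch with an arbitrarily chosen matrix (not one appearing in this section).

```python
import numpy as np

# An illustrative matrix transformation T(x) = Mx from R^3 to R^4.
M = np.array([[1., 0., 2.],
              [0., 1., 1.],
              [1., 1., 0.],
              [2., 1., 3.]])

# By the lemma, T is one to one exactly when T(x) = 0 forces x = 0,
# i.e. when the null space of M is trivial (rank equals column count).
print("one to one:", np.linalg.matrix_rank(M) == M.shape[1])
```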
Consider the following example.
Let \(S:\mathbb{P}_2\to\mathbb{M}_{22}\) be a linear transformation defined by \[S(ax^2+bx+c) = \left [\begin{array}{cc} a+b & a+c \\ b-c & b+c \end{array}\right ] \nonumber\] for all \(ax^2+bx+c\in \mathbb{P}_2.\)
Prove that \(S\) is one to one but not onto.
Solution
By definition, \[\ker(S)=\{ax^2+bx+c\in \mathbb{P}_2 ~|~ a+b=0, a+c=0, b-c=0, b+c=0\}. \nonumber\]
Suppose \(p(x)=ax^2+bx+c\in\ker(S)\). This leads to a homogeneous system of four equations in three variables. Putting the augmented matrix in reduced row-echelon form:
\[\left [\begin{array}{rrr|c} 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 1 & 1 & 0 \end{array}\right ] \rightarrow \cdots \rightarrow \left [\begin{array}{ccc|c} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right ].\nonumber\]
The only solution is \(a=b=c=0\). This tells us that if \(S(p(x)) = 0\), then \(p(x) = ax^2+bx+c = 0x^2 + 0x + 0 = 0\), so \(\ker(S)=\{0\}\). By Lemma \(\PageIndex{1}\), \(S\) is one to one.
To show that \(S\) is not onto, find a matrix \(A\in\mathbb{M}_{22}\) such that for every \(p(x)\in \mathbb{P}_2\), \(S(p(x))\neq A\). Let \[A=\left [\begin{array}{cc} 0 & 1 \\ 0 & 2 \end{array}\right ], \nonumber\] and suppose \(p(x)=ax^2+bx+c\in \mathbb{P}_2\) is such that \(S(p(x))=A\). Then \[\begin{array}{ll} a+b=0 & a+c=1 \\ b-c=0 & b+c=2 \end{array}\nonumber \] Solving this system \[\left [\begin{array}{ccc|c} 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 1 \\ 0 & 1 & -1 & 0 \\ 0 & 1 & 1 & 2 \end{array}\right ] \rightarrow \left [\begin{array}{rrr|r} 1 & 1 & 0 & 0 \\ 0 & -1 & 1 & 1 \\ 0 & 1 & -1 & 0 \\ 0 & 1 & 1 & 2 \end{array}\right ]. \nonumber\]
Adding the second and third rows of the last matrix produces the row \(\left[\begin{array}{ccc|c} 0 & 0 & 0 & 1 \end{array}\right]\), so the system is inconsistent. Hence there is no \(p(x)\in \mathbb{P}_2\) such that \(S(p(x))=A\), and therefore \(S\) is not onto.
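For readers who want to verify the two row reductions numerically, here is a minimal NumPy sketch. It identifies \(S\) with the \(4\times 3\) coefficient matrix acting on the coordinate vector \((a,b,c)\); the rank computations below are an illustration, not a replacement for the argument above.

```python
import numpy as np

# Coefficient matrix of S with respect to the coordinates (a, b, c):
# its rows are the entries a+b, a+c, b-c, b+c of S(ax^2 + bx + c).
S = np.array([[1., 1.,  0.],
              [1., 0.,  1.],
              [0., 1., -1.],
              [0., 1.,  1.]])

# One to one: the homogeneous system has only the trivial solution
# exactly when the rank equals the number of unknowns, 3.
print("one to one:", np.linalg.matrix_rank(S) == 3)            # True

# Not onto: the system for A = [[0, 1], [0, 2]] is inconsistent,
# which shows up as the augmented matrix having a larger rank.
target = np.array([0., 1., 0., 2.])
augmented = np.column_stack([S, target])
print("consistent:",
      np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(S))  # False
```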
Let \(T:\mathbb{M}_{22}\to\mathbb{R}^2\) be a linear transformation defined by \[T\left [\begin{array}{cc} a & b \\ c & d \end{array}\right ] = \left [\begin{array}{c} a+d \\ b+c \end{array}\right ] \mbox{ for all } \left [\begin{array}{cc} a & b \\ c & d \end{array}\right ] \in\mathbb{M}_{22}.\nonumber\]
Prove that \(T\) is onto but not one to one.
Solution
Let \(\left [\begin{array}{c} x \\ y \end{array}\right ]\) be an arbitrary vector in \(\mathbb{R}^2\). Since \(T\left [\begin{array}{cc} x & y \\ 0 & 0 \end{array}\right ] =\left [\begin{array}{c} x \\ y \end{array}\right ]\), \(T\) is onto.
By Lemma \(\PageIndex{1}\), \(T\) is one to one if and only if \(T(A) = \vec{0}\) implies that \(A\) is the zero matrix. Observe that \[T \left( \left [\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array}\right ] \right) = \left [ \begin{array}{c} 1 + (-1) \\ 0 + 0 \end{array} \right ] = \left [ \begin{array}{c} 0 \\ 0 \end{array} \right ]\nonumber\]
Thus there exists a nonzero matrix \(A\) such that \(T(A) = \vec{0}\), and it follows that \(T\) is not one to one.
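In the coordinates \((a,b,c,d)\) of \(\left[\begin{array}{cc} a & b \\ c & d\end{array}\right]\), this \(T\) is represented by a \(2\times 4\) matrix, and both conclusions can be checked numerically. The sketch below is illustrative only.

```python
import numpy as np

# Matrix of T with respect to the coordinates (a, b, c, d):
# T sends them to (a + d, b + c).
T = np.array([[1., 0., 0., 1.],
              [0., 1., 1., 0.]])

# Onto: the rank equals dim(R^2) = 2, so every vector is an image.
print("onto:", np.linalg.matrix_rank(T) == 2)   # True

# Not one to one: the nonzero matrix [[1, 0], [0, -1]] has
# coordinates (1, 0, 0, -1) and is sent to the zero vector.
A = np.array([1., 0., 0., -1.])
print("T(A):", T @ A)                           # [0. 0.]
```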
The following example demonstrates that a one to one transformation preserves linear independence.
Let \(V\) and \(W\) be vector spaces and \(T: V \mapsto W\) a linear transformation. Prove that if \(T\) is one to one and \(\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_k\}\) is an independent subset of \(V\), then \(\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}\) is an independent subset of \(W\).
Solution
Let \(\vec{0}_V\) and \(\vec{0}_W\) denote the zero vectors of \(V\) and \(W\), respectively. Suppose that \[a_1T(\vec{v}_1) + a_2T(\vec{v}_2) +\cdots +a_kT(\vec{v}_k) =\vec{0}_W \nonumber\] for some \(a_1, a_2, \ldots, a_k\in\mathbb{R}\). Since linear transformations preserve linear combinations (addition and scalar multiplication), \[T(a_1\vec{v}_1 + a_2\vec{v}_2 +\cdots +a_k\vec{v}_k) =\vec{0}_W. \nonumber\]
Now, since \(T\) is one to one, \(\ker(T)=\{\vec{0}_V\}\), and thus \[a_1\vec{v}_1 + a_2\vec{v}_2 +\cdots +a_k\vec{v}_k =\vec{0}_V. \nonumber\]
However, \(\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_k\}\) is independent so \(a_1=a_2=\cdots=a_k=0\). Therefore, \(\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}\) is independent.
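Here is a small numerical illustration of the same phenomenon, with an arbitrarily chosen one to one matrix map (a sketch, not an example from the text): independence of the inputs and of their images can both be read off as ranks.

```python
import numpy as np

# An illustrative one to one map from R^2 to R^3 (its rank is 2).
M = np.array([[1., 0.],
              [1., 1.],
              [0., 2.]])

# Two independent vectors of R^2, stored as the columns of V.
V = np.array([[1., 1.],
              [0., 1.]])

# A set of vectors is independent exactly when the matrix having
# those vectors as columns has rank equal to the number of vectors.
print("inputs independent: ", np.linalg.matrix_rank(V) == 2)      # True
print("images independent:", np.linalg.matrix_rank(M @ V) == 2)   # True
```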
A similar claim can be made regarding onto transformations. In this case, an onto transformation preserves a spanning set.
Let \(V\) and \(W\) be vector spaces and \(T:V\to W\) a linear transformation. Prove that if \(T\) is onto and \(V=span\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_k\}\), then \[W=span\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}.\nonumber\]
Solution
Suppose that \(T\) is onto and let \(\vec{w}\in W\). Then there exists \(\vec{v}\in V\) such that \(T(\vec{v})=\vec{w}\). Since \(V=span\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_k\}\), there exist \(a_1, a_2, \ldots a_k\in\mathbb{R}\) such that \(\vec{v} = a_1\vec{v}_1 + a_2\vec{v}_2 + \cdots + a_k\vec{v}_k\). Using the fact that \(T\) is a linear transformation, \[\begin{aligned} \vec{w} =T(\vec{v}) & = T(a_1\vec{v}_1 + a_2\vec{v}_2 + \cdots + a_k\vec{v}_k) \\ & = a_1T(\vec{v}_1) + a_2T(\vec{v}_2) + \cdots + a_kT(\vec{v}_k),\end{aligned}\] i.e., \(\vec{w}\in span\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}\), and thus \[W\subseteq span\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}. \nonumber\]
Since \(T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\in W\), it follows that \(span\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}\subseteq W\), and therefore \(W=span\{T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_k)\}\).
Isomorphisms
The focus of this section is on linear transformations which are both one to one and onto. When this is the case, we call the transformation an isomorphism.
Let \(V\) and \(W\) be two vector spaces and let \(T: V \mapsto W\) be a linear transformation. Then \(T\) is called an isomorphism if the following two conditions are satisfied.
- \(T\) is one to one.
- \(T\) is onto.
Let \(V\) and \(W\) be two vector spaces and let \(T: V \mapsto W\) be a linear transformation. Then if \(T\) is an isomorphism, we say that \(V\) and \(W\) are isomorphic.
Consider the following example of an isomorphism.
Let \(T:\mathbb{M}_{22}\to\mathbb{R}^4\) be defined by \[T \left( \begin{array}{cc} a & b \\ c & d \end{array} \right) = \left[ \begin{array}{c} a\\ b\\ c \\ d \end{array} \right] \mbox{ for all } \left[ \begin{array}{cc} a & b \\ c & d \end{array} \right] \in\mathbb{M}_{22}.\nonumber \] Show that \(T\) is an isomorphism.
Solution
Notice that if we can prove \(T\) is an isomorphism, it will mean that \(\mathbb{M}_{22}\) and \(\mathbb{R}^4\) are isomorphic. To do so, we must show that
- \(T\) is a linear transformation;
- \(T\) is one-to-one;
- \(T\) is onto.
\(T\) is linear: Let \(k,p\) be scalars.
\[\begin{aligned} T \left( k \left[\begin{array}{cc} a_1 & b_1 \\ c_1 & d_1 \end{array}\right] + p \left[\begin{array}{cc} a_2 & b_2 \\ c_2 & d_2 \end{array}\right] \right) &= T \left( \left[\begin{array}{cc} k a_1 & k b_1 \\ k c_1 & k d_1 \end{array}\right] + \left[\begin{array}{cc} p a_2 & p b_2 \\ p c_2 & p d_2 \end{array}\right] \right) \\ &= T \left( \left[\begin{array}{cc} k a_1 + p a_2 & k b_1 + p b_2 \\ k c_1 + p c_2& k d_1 + p d_2 \end{array}\right] \right) \\ &= \left[ \begin{array}{c} k a_1 + p a_2 \\ k b_1 + p b_2 \\ k c_1 + p c_2 \\ k d_1 + p d_2 \end{array}\right] \\ &= \left[ \begin{array}{c} k a_1 \\ k b_1 \\ k c_1 \\ k d_1 \end{array} \right] + \left[ \begin{array}{c} p a_2 \\ p b_2 \\ p c_2 \\ p d_2 \end{array} \right] \\ &= k \left[ \begin{array}{c} a_1 \\ b_1 \\ c_1 \\ d_1 \end{array} \right] + p \left[ \begin{array}{c} a_2 \\ b_2 \\ c_2 \\ d_2 \end{array} \right] \\ &= k T \left(\left[\begin{array}{cc} a_1 & b_1 \\ c_1 & d_1 \end{array}\right] \right) + p T \left(\left[\begin{array}{cc} a_2 & b_2 \\ c_2 & d_2 \end{array}\right] \right)\end{aligned}\]
Therefore \(T\) is linear.
\(T\) is one-to-one: By Lemma \(\PageIndex{1}\), it suffices to show that if \(T(A) = \vec{0}\) for a matrix \(A \in \mathbb{M}_{22}\), then \(A\) is the zero matrix. Suppose \[T\left[\begin{array}{cc} a & b \\ c & d \end{array}\right] = \left[\begin{array}{c} a\\ b\\ c \\ d \end{array}\right] = \left[\begin{array}{c} 0 \\ 0 \\ 0 \\ 0 \end{array}\right]\nonumber \]
This clearly only occurs when \(a=b=c=d=0\) which means that \[A = \left[\begin{array}{cc} a & b \\ c & d \end{array}\right] = \left[\begin{array}{cc} 0 & 0 \\ 0 & 0 \end{array}\right] = 0\nonumber \]
Hence \(T\) is one-to-one.
\(T\) is onto: Let
\[\vec{x}=\left[\begin{array}{c} x_1\\x_2\\x_3\\x_4 \end{array}\right]\in\mathbb{R}^4, \nonumber\] and define matrix \(A\in\mathbb{M}_{22}\) as follows: \[A=\left[\begin{array}{cc} x_1 & x_2 \\ x_3 & x_4 \end{array}\right]. \nonumber\]
Then \(T(A)=\vec{x}\), and therefore \(T\) is onto.
Since \(T\) is a linear transformation which is one-to-one and onto, \(T\) is an isomorphism. Hence \(\mathbb{M}_{22}\) and \(\mathbb{R}^4\) are isomorphic.
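Because \(T\) simply lists the entries of the matrix, it is easy to realize both \(T\) and its inverse in code. The following is a minimal NumPy sketch (reshaping an array plays the role of \(T^{-1}\)):

```python
import numpy as np

# T lists the entries of a 2x2 matrix as a vector in R^4;
# reshaping the vector back into a 2x2 array inverts it.
def T(A):
    return A.reshape(4)

def T_inverse(x):
    return x.reshape(2, 2)

A = np.array([[1., 2.],
              [3., 4.]])
x = T(A)
print(x)                                 # [1. 2. 3. 4.]
print(np.array_equal(T_inverse(x), A))   # True: T is invertible, an isomorphism
```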
An important property of isomorphisms is that the inverse of an isomorphism is itself an isomorphism and the composition of isomorphisms is an isomorphism. We first recall the definition of composition.
Let \(V, W, Z\) be vector spaces and suppose \(T: V \mapsto W\) and \(S: W \mapsto Z\) are linear transformations. Then the composite of \(S\) and \(T\) is \[S \circ T: V \mapsto Z\nonumber \] and is defined by \[(S \circ T) (\vec{v}) = S(T(\vec{v})) \mbox{ for all } \vec{v} \in V\nonumber \]
Consider now the following proposition.
Let \(T:V\rightarrow W\) be an isomorphism. Then \(T^{-1}:W\rightarrow V\) is also an isomorphism. Also if \(T:V\rightarrow W\) is an isomorphism and if \(S:W\rightarrow Z\) is an isomorphism for the vector spaces \(V,W,Z,\) then \(S\circ T\) defined by \(\left( S\circ T\right) \left( \vec{v}\right) = S\left( T\left( \vec{v}\right) \right)\) is also an isomorphism.
- Proof
Consider the first claim. Since \(T\) is onto, a typical vector in \(W\) is of the form \(T(\vec{v})\) where \(\vec{v} \in V\). To see that \(T^{-1}\) is linear, let \(a,b\) be scalars and \(\vec{v}_{1}, \vec{v}_2 \in V\). We must show that \[T^{-1}\left( aT(\vec{v}_{1})+bT(\vec{v}_{2})\right) =aT^{-1}\left( T(\vec{v}_{1})\right) +bT^{-1}\left( T(\vec{v}_{2})\right) =a\vec{v}_{1}+b\vec{v}_{2}.\nonumber \] Since \(T\) is one to one, this holds precisely when \[T\left( a\vec{v}_{1}+b\vec{v}_{2}\right) =T\left( T^{-1}\left( aT(\vec{v}_{1})+bT(\vec{v}_{2})\right) \right) =aT(\vec{v}_{1})+bT(\vec{v}_{2}),\nonumber \] and this last equality is just the condition that \(T\) is a linear map. Thus \(T^{-1}\) is indeed a linear map. If \(\vec{v} \in V\) is given, then \(\vec{v}=T^{-1}\left( T(\vec{v})\right)\) and so \(T^{-1}\) is onto. If \(T^{-1}(\vec{w})=\vec{0}\) for some \(\vec{w}\in W,\) then \[\vec{w}=T\left( T^{-1}(\vec{w})\right) =T(\vec{0})=\vec{0}\nonumber \] and so \(T^{-1}\) is one to one.
Next suppose \(T\) and \(S\) are as described. To see that \(S\circ T\) is a linear map, let \(a,b\) be scalars. Then \[\begin{aligned} S\circ T\left( a\vec{v}_{1}+b\vec{v}_{2}\right) &= S\left( T\left( a\vec{v}_{1}+b\vec{v}_{2}\right) \right) =S\left( aT(\vec{v}_{1})+bT(\vec{v}_{2})\right) \\ &=aS\left( T(\vec{v}_{1})\right) +bS\left( T(\vec{v}_{2})\right) = a\left( S\circ T\right) \left( \vec{v}_{1}\right) +b\left( S\circ T\right) \left( \vec{v}_{2}\right)\end{aligned}\] Hence \(S\circ T\) is a linear map. If \(\left( S\circ T\right) \left( \vec{v}\right) =\vec{0},\) then \(S\left( T\left( \vec{v} \right) \right) =\vec{0}\); since \(S\) is one to one, Lemma \(\PageIndex{1}\) gives \(T(\vec{v})=\vec{0}\), and applying the lemma again to \(T\) gives \(\vec{v}=\vec{0}\). Thus \(S\circ T\) is one to one. It remains to verify that it is onto. Let \(\vec{z}\in Z\). Then since \(S\) is onto, there exists \(\vec{w}\in W\) such that \(S(\vec{w})=\vec{z}.\) Also, since \(T\) is onto, there exists \(\vec{v}\in V\) such that \(T(\vec{v})=\vec{w}.\) It follows that \(S\left( T\left( \vec{v}\right) \right) =\vec{z}\) and so \(S\circ T\) is also onto.
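When \(V\), \(W\), \(Z\) are coordinate spaces and the isomorphisms are given by invertible matrices, the proposition reduces to familiar matrix facts: the inverse of an invertible matrix is invertible, and a product of invertible matrices is invertible. A brief NumPy illustration with arbitrarily chosen matrices:

```python
import numpy as np

# Two invertible 2x2 matrices standing in for the isomorphisms T and S.
T = np.array([[2., 1.],
              [1., 1.]])
S = np.array([[0., 1.],
              [1., 3.]])

# T^{-1} is again invertible, and S o T corresponds to the product S @ T.
T_inv = np.linalg.inv(T)
composition = S @ T

print("T inverse is an isomorphism:", np.linalg.matrix_rank(T_inv) == 2)       # True
print("S o T is an isomorphism:", np.linalg.matrix_rank(composition) == 2)     # True
```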
Suppose we say that two vector spaces \(V\) and \(W\) are related if there exists an isomorphism of one to the other, written as \(V\sim W\). Then the above proposition suggests that \(\sim\) is an equivalence relation. That is: \(\sim\) satisfies the following conditions:
- \(V\sim V\)
- If \(V\sim W,\) it follows that \(W\sim V\)
- If \(V\sim W\) and \(W\sim Z,\) then \(V\sim Z\)
We leave the proof of these to the reader.
The following fundamental lemma describes the relation between bases and isomorphisms.
Let \(T:V\rightarrow W\) be a linear map where \(V,W\) are vector spaces. If \(T\) is one to one, then \(\left\{ T(\vec{u}_{1}),\cdots ,T(\vec{u}_{k})\right\}\) is linearly independent whenever \(\left\{ \vec{u}_{1},\cdots ,\vec{u}_{k}\right\}\) is linearly independent. More generally, \(T\) is an isomorphism if and only if whenever \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V,\) it follows that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis for \(W\).
- Proof
First suppose that \(T\) is a linear map which is one to one and \(\left\{ \vec{u}_{1},\cdots ,\vec{u}_{k}\right\}\) is linearly independent. It is required to show that \(\left\{ T(\vec{u}_{1}),\cdots ,T(\vec{u}_{k})\right\}\) is also linearly independent. Suppose then that \[\sum_{i=1}^{k}c_{i}T(\vec{u}_{i})=\vec{0}\nonumber \] Then, since \(T\) is linear, \[T\left( \sum_{i=1}^{k}c_{i}\vec{u}_{i}\right) =\vec{0}\nonumber \] Since \(T\) is one to one, it follows that \[\sum_{i=1}^{k}c_{i}\vec{u}_{i}=\vec{0}\nonumber \] Now the fact that \(\left\{ \vec{u}_{1},\cdots ,\vec{u}_{k}\right\}\) is linearly independent implies that each \(c_{i}=0\). Hence \(\left\{ T(\vec{u}_{1}),\cdots ,T(\vec{u}_{k})\right\}\) is linearly independent.
Now suppose that \(T\) is an isomorphism and \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V\). It was just shown that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is linearly independent. It remains to verify that the span of \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is all of \(W\). This is where the fact that \(T\) is onto is used. If \(\vec{w}\in W,\) there exists \(\vec{v}\in V\) such that \(T(\vec{v})=\vec{w}\). Since \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis, there exist scalars \(\left\{ c_{i}\right\} _{i=1}^{n}\) such that \[\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{v}.\nonumber \] Hence, \[\vec{w}=T(\vec{v})=T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})\nonumber \] which shows that the span of the vectors \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is all of \(W\), so this set of vectors is a basis for \(W\).
Next suppose that \(T\) is a linear map which takes a basis to a basis. Then for \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) a basis for \(V,\) it follows that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis for \(W.\) If \(\vec{w}\in W,\) there exist scalars \(c_{i}\) such that \(\vec{w}=\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right),\) showing that \(T\) is onto. If \(T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\vec{0},\) then \(\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=\vec{0}\) and since the vectors \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) are linearly independent, it follows that each \(c_{i}=0.\) Since \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}\) is a typical vector in \(V\), this shows that if \(T(\vec{v})=\vec{0}\) then \(\vec{v}=\vec{0}\), and so \(T\) is also one to one. Thus \(T\) is an isomorphism.
The following theorem illustrates a very useful idea for defining an isomorphism. Basically, if you know what it does to a basis, then you can construct the isomorphism.
Suppose \(V\) and \(W\) are two vector spaces. Then the two vector spaces are isomorphic if and only if they have the same dimension. In the case that the two vector spaces have the same dimension, then for a linear transformation \(T:V\rightarrow W\), the following are equivalent.
- \(T\) is one to one.
- \(T\) is onto.
- \(T\) is an isomorphism.
- Proof
Suppose first these two vector spaces have the same dimension. Let a basis for \(V\) be \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) and let a basis for \(W\) be \(\left\{ \vec{w}_{1},\cdots ,\vec{w}_{n}\right\}\). Now define \(T\) as follows. \[T(\vec{v}_{i})=\vec{w}_{i}\nonumber \] for \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}\) an arbitrary vector of \(V,\) \[T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) = \sum_{i=1}^{n}c_{i}T (\vec{v}_{i})=\sum_{i=1}^{n}c_{i}\vec{w}_{i}.\nonumber \] It is necessary to verify that this is well defined. Suppose then that \[\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\sum_{i=1}^{n}\hat{c}_{i}\vec{v}_{i}\nonumber \] Then \[\sum_{i=1}^{n}\left( c_{i}-\hat{c}_{i}\right) \vec{v}_{i}=0\nonumber \] and since \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis, \(c_{i}=\hat{c}_{i}\) for each \(i\). Hence \[\sum_{i=1}^{n}c_{i}\vec{w}_{i}=\sum_{i=1}^{n}\hat{c}_{i}\vec{w}_{i}\nonumber \] and so the mapping is well defined. Also if \(a,b\) are scalars, \[\begin{aligned} T\left( a\sum_{i=1}^{n}c_{i}\vec{v}_{i}+b\sum_{i=1}^{n}\hat{c}_{i}\vec{v} _{i}\right) &=T\left( \sum_{i=1}^{n}\left( ac_{i}+b\hat{c}_{i}\right) \vec{v }_{i}\right) =\sum_{i=1}^{n}\left( ac_{i}+b\hat{c}_{i}\right) \vec{w}_{i} \\ &=a\sum_{i=1}^{n}c_{i}\vec{w}_{i}+b\sum_{i=1}^{n}\hat{c}_{i}\vec{w}_{i} \\ &=aT\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) +bT\left( \sum_{i=1}^{n} \hat{c}_{i}\vec{v}_{i}\right)\end{aligned}\] Thus \(T\) is a linear map.
Now if \[T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\sum_{i=1}^{n}c_{i}\vec{w} _{i}=\vec{0},\nonumber \] then since the \(\left\{ \vec{w}_{1},\cdots ,\vec{w}_{n}\right\}\) are independent, each \(c_{i}=0\) and so \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{0}\) also. Hence \(T\) is one to one. If \(\sum_{i=1}^{n}c_{i}\vec{w}_{i}\) is a vector in \(W,\) then it equals \[\sum_{i=1}^{n}c_{i}T\vec{v}_{i}=T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right)\nonumber \] showing that \(T\) is also onto. Hence \(T\) is an isomorphism and so \(V\) and \(W\) are isomorphic.
Next suppose these two vector spaces are isomorphic. Let \(T\) be the name of the isomorphism. Then for \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) a basis for \(V\), it follows that a basis for \(W\) is \(\left\{ T\vec{v}_{1},\cdots ,T\vec{v}_{n}\right\}\) showing that the two vector spaces have the same dimension.
Now suppose the two vector spaces have the same dimension.
First consider the claim that \(1.)\Rightarrow 2.).\) If \(T\) is one to one, then if \(\left\{ \vec{v}_{1},\cdots ,\vec{v} _{n}\right\}\) is a basis for \(V,\) then \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v }_{n})\right\}\) is linearly independent. If it is not a basis, then it must fail to span \(W\). But then there would exist \(\vec{w}\notin span \left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) and it follows that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n}),\vec{w}\right\}\) would be linearly independent which is impossible because there exists a basis for \(W\) of \(n\) vectors. Hence \[span\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v} _{n})\right\} =W\nonumber \] and so \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis. Hence, if \(\vec{w}\in W,\) there exist scalars \(c_{i}\) such that \[\vec{w}=\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=T\left( \sum_{i=1}^{n}c_{i}\vec{v} _{i}\right)\nonumber \] showing that \(T\) is onto. This shows that \(1.)\Rightarrow 2.).\)
Next consider the claim that \(2.)\Rightarrow 3.).\) Since \(2.)\) holds, it follows that \(T\) is onto. It remains to verify that \(T\) is one to one. Since \(T\) is onto, each vector in a basis of \(W\) has a preimage under \(T\), so there exist vectors \(\vec{v}_{1},\cdots ,\vec{v}_{n}\in V\) such that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis for \(W\). If \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is linearly independent, then this set of vectors must also be a basis for \(V\) because if not, there would exist \(\vec{u}\notin span\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) so \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n},\vec{u}\right\}\) would be a linearly independent set, which is impossible because by assumption there exists a basis for \(V\) which has \(n\) vectors. So why is \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) linearly independent? Suppose \[\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{0}\nonumber \] Then \[\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=\vec{0}\nonumber \] Since \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis, each \(c_{i}=0\) and so, as just discussed, \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V\). Now it follows that a typical vector in \(V\) is of the form \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}\). If \(T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\vec{0},\) it follows that \[\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=\vec{0}\nonumber \] and so, since \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is independent, it follows that each \(c_{i}=0\) and hence \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{0}\). Thus \(T\) is one to one as well as onto and so it is an isomorphism.
If \(T\) is an isomorphism, it is both one to one and onto by definition so \(3.)\) implies both \(1.)\) and \(2.)\).
Note the interesting way of defining a linear transformation in the first part of the argument by describing what it does to a basis and then “extending it linearly”.
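When the bases are written as the columns of matrices, this "extend linearly" construction is concrete: if \(B_V\) and \(B_W\) hold the two bases as columns, the map with \(T(\vec{v}_i)=\vec{w}_i\) has matrix \(B_W B_V^{-1}\). The following NumPy sketch uses arbitrarily chosen bases of \(\mathbb{R}^2\) for illustration.

```python
import numpy as np

# Bases of V = R^2 and W = R^2, written as the columns of two matrices.
B_V = np.array([[1., 1.],
                [0., 1.]])
B_W = np.array([[2., 0.],
                [1., 1.]])

# "Extend linearly": the unique linear map with T(v_i) = w_i has the
# matrix B_W @ inv(B_V): express a vector in the v-basis, then replace
# each basis vector by its assigned image.
M = B_W @ np.linalg.inv(B_V)

print(np.allclose(M @ B_V, B_W))       # True: each v_i is sent to w_i
print(np.linalg.matrix_rank(M) == 2)   # True: the map is an isomorphism
```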
Consider the following example.
Let \(V=\mathbb{R}^{3}\) and let \(W=\mathbb{P}_{2}\), the vector space of polynomials of degree at most \(2\). Show that these two vector spaces are isomorphic.
Solution
First, observe that a basis for \(W\) is \(\left\{ 1,x,x^{2}\right\}\) and a basis for \(V\) is \(\left\{ \vec{e}_{1},\vec{e}_{2},\vec{e}_{3}\right\} .\) Since both spaces have dimension \(3\), they are isomorphic. An example of an isomorphism is this:
\[T(\vec{e}_{1})=1,T(\vec{e}_{2})=x,T(\vec{e}_{3})=x^{2}\nonumber \] and extend \(T\) linearly as in the above proof. Thus \[T\left( a,b,c\right) =a+bx+cx^{2}\nonumber \]
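This isomorphism just reads a coordinate vector as a list of polynomial coefficients. The sketch below makes \(T\) and \(T^{-1}\) explicit using NumPy's polynomial class; it is an illustration of the map above, with the padding in \(T^{-1}\) added to handle polynomials of lower degree.

```python
from numpy.polynomial import Polynomial

# T sends (a, b, c) in R^3 to the polynomial a + b*x + c*x^2.
def T(v):
    a, b, c = v
    return Polynomial([a, b, c])

# T^{-1} reads the coefficients back off (padding with zeros if the
# polynomial happens to have degree less than 2).
def T_inverse(p):
    coef = list(p.coef) + [0.0] * (3 - len(p.coef))
    return tuple(float(t) for t in coef[:3])

p = T((2.0, -1.0, 3.0))   # the polynomial 2 - x + 3x^2
print(T_inverse(p))       # (2.0, -1.0, 3.0)
```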