
5.6: Isomorphisms


    Outcomes

    1. Determine if a linear transformation is an isomorphism.
    2. Determine if two subspaces of \(\mathbb{R}^n\) are isomorphic.

    Recall the definition of a linear transformation. Let \(V\) and \(W\) be two subspaces of \(\mathbb{R}^{n}\) and \(\mathbb{R}^{m}\) respectively. A mapping \(T:V\rightarrow W\) is called a linear transformation or linear map if it preserves the algebraic operations of addition and scalar multiplication. Specifically, if \(a,b\) are scalars and \(\vec{x},\vec{y}\) are vectors in \(V\), then

    \[T\left( a\vec{x}+b\vec{y}\right) =aT(\vec{x})+bT(\vec{y})\nonumber \]

    Consider the following important definition.

    Definition \(\PageIndex{1}\): Isomorphism

    A linear map \(T\) is called an isomorphism if the following two conditions are satisfied.

    • \(T\) is one to one. That is, if \(T(\vec{x})=T(\vec{y}),\) then \(\vec{x}=\vec{y}.\)
    • \(T\) is onto. That is, if \(\vec{w}\in W,\) there exists \(\vec{v}\in V\) such that \(T(\vec{v})=\vec{w}\).

    Two subspaces \(V\) and \(W\) for which there exists such an isomorphism are said to be isomorphic.

    Consider the following example of an isomorphism.

    Example \(\PageIndex{1}\): Isomorphism

    Let \(T: \mathbb{R}^2 \rightarrow \mathbb{R}^2\) be defined by \[T \left [ \begin{array}{c} x \\ y \end{array} \right ] = \left [ \begin{array}{c} x + y \\ x - y \end{array} \right ] \nonumber\] Show that \(T\) is an isomorphism.

    Solution

    To prove that \(T\) is an isomorphism we must show

    1. \(T\) is a linear transformation;
    2. \(T\) is one to one;
    3. \(T\) is onto.

    We proceed as follows.

    1. \(T\) is a linear transformation:

      Let \(k, p\) be scalars. \[\begin{aligned} T \left( k \left [ \begin{array}{c} x_1 \\ y_1 \end{array} \right ] + p \left [ \begin{array}{c} x_2 \\ y_2 \end{array} \right ] \right) &= T \left( \left [ \begin{array}{c} kx_1 \\ ky_1 \end{array} \right ] + \left [ \begin{array}{c} px_2 \\ py_2 \end{array} \right ] \right) \\ &= T \left( \left [ \begin{array}{c} kx_1 + px_2 \\ ky_1 + py_2 \end{array} \right ] \right) \\ &= \left [ \begin{array}{c} (kx_1 + px_2) + (ky_1 + py_2) \\ (kx_1 + px_2) - (ky_1 + py_2) \end{array} \right ] \\ &= \left [ \begin{array}{c} (kx_1 + ky_1) + (px_2 + py_2) \\ (kx_1 - ky_1) + (px_2 - py_2) \end{array} \right ] \\ &= \left [ \begin{array}{c} kx_1 + ky_1 \\ kx_1 - ky_1 \end{array} \right ] + \left [ \begin{array}{c} px_2 + py_2 \\ px_2 - py_2 \end{array} \right ] \\ &= k \left [ \begin{array}{c} x_1 + y_1 \\ x_1 - y_1 \end{array} \right ] + p \left [ \begin{array}{c} x_2 + y_2 \\ x_2 - y_2 \end{array} \right ] \\ &= k T \left( \left [ \begin{array}{c} x_1 \\ y_1 \end{array} \right ] \right) + p T \left( \left [ \begin{array}{c} x_2 \\ y_2 \end{array} \right ] \right)\end{aligned}\]

      Therefore \(T\) is linear.

    2. \(T\) is one to one:

      Recall that a linear transformation \(T\) is one to one if and only if \(T(\vec{x})=\vec{0}\) implies \(\vec{x}=\vec{0}\). Hence we need to show that if \(T (\vec{x}) = \vec{0}\) for a vector \(\vec{x} \in \mathbb{R}^2\), then it follows that \(\vec{x} = \vec{0}\). Let \(\vec{x} = \left [ \begin{array}{c} x \\ y \end{array} \right ]\).

      \[T \left( \left [ \begin{array}{c} x \\ y \end{array} \right ] \right) = \left [ \begin{array}{c} x + y\\ x - y \end{array} \right ] = \left [ \begin{array}{c} 0 \\ 0 \end{array} \right ]\nonumber \] This provides a system of equations given by \[\begin{aligned} x + y &= 0\\ x - y &= 0\end{aligned}\] You can verify that the solution to this system is \(x = y =0\). Therefore \[\vec{x} = \left [ \begin{array}{c} x \\ y \end{array} \right ] = \left [ \begin{array}{c} 0 \\ 0 \end{array} \right ]\nonumber \] and \(T\) is one to one.

    3. \(T\) is onto:

      Let \(a,b\) be arbitrary real numbers. We must show that there is always a solution to \[T \left( \left [ \begin{array}{c} x \\ y \end{array} \right ] \right) = \left [ \begin{array}{c} x + y\\ x - y \end{array} \right ] = \left [ \begin{array}{c} a \\ b \end{array} \right ]\nonumber\]

      This can be represented as the system of equations \[\begin{aligned} x + y &= a\\ x - y &= b\end{aligned}\]

      Setting up the augmented matrix and row reducing gives \[\left [ \begin{array}{cc|c} 1 & 1 & a \\ 1 & -1 & b \end{array} \right ] \rightarrow \cdots \rightarrow \left [ \begin{array}{cc|c} 1 & 0 & \frac{a+b}{2} \\ 0 & 1 & \frac{a-b}{2} \end{array} \right ]\nonumber\] This has a solution for all \(a,b\) and therefore \(T\) is onto.

    Therefore \(T\) is an isomorphism.
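
    Because \(T\) is multiplication by the matrix \(\left [ \begin{array}{rr} 1 & 1 \\ 1 & -1 \end{array} \right ]\), all three conditions can also be verified numerically. Here is a minimal NumPy sketch (the code and variable names are ours, for illustration only):

    ```python
    import numpy as np

    # T(x, y) = (x + y, x - y) is multiplication by this matrix.
    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]])

    # For a square matrix, one to one and onto both amount to invertibility,
    # detected here by a nonzero determinant.
    print(np.linalg.det(A))                    # -2.0, so A is invertible

    # Onto: solve T(x, y) = (a, b) for an arbitrary right-hand side.
    a, b = 3.0, 1.0
    x, y = np.linalg.solve(A, np.array([a, b]))
    print(x, y)                                # x = (a+b)/2 = 2.0, y = (a-b)/2 = 1.0
    ```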

    An important property of isomorphisms is that the inverse of an isomorphism is itself an isomorphism.

    Proposition \(\PageIndex{1}\): Inverse of an Isomorphism

    Let \(V,W\) be subspaces of \(\mathbb{R}^n\) and let \(T:V\rightarrow W\) be an isomorphism. Then \(T^{-1}:W\rightarrow V\) is also an isomorphism.

    Proof

    Let \(T\) be an isomorphism. Since \(T\) is onto, a typical vector in \(W\) is of the form \(T(\vec{v})\) where \(\vec{v} \in V\). Consider then, for \(a,b\) scalars and \(\vec{v}_{1}, \vec{v}_2 \in V\), \[T^{-1}\left( aT(\vec{v}_{1})+bT(\vec{v}_{2})\right)\nonumber\] Is this equal to \[aT^{-1}\left( T (\vec{v}_{1})\right) +bT^{-1}\left( T(\vec{v}_{2})\right) =a\vec{v}_{1}+b\vec{v}_{2}?\nonumber\] Since \(T\) is one to one, this will be so if \[T\left( a\vec{v}_{1}+b\vec{v}_{2}\right) =T\left( T^{-1}\left( aT(\vec{v}_{1})+bT(\vec{v}_{2})\right) \right) =aT(\vec{v}_{1})+bT(\vec{v}_{2}).\nonumber\] However, this last equality is just the condition that \(T\) is a linear map. Thus \(T^{-1}\) is indeed a linear map. If \(\vec{v} \in V\) is given, then \(\vec{v}=T^{-1}\left( T(\vec{v})\right)\) and so \(T^{-1}\) is onto. If \(T^{-1} (\vec{w})=\vec{0}\) for some \(\vec{w} \in W,\) then \[\vec{w}=T\left( T^{-1}(\vec{w})\right) =T(\vec{0})=\vec{0}\nonumber\] and so \(T^{-1}\) is one to one.
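
    For maps given by matrices, the proposition says that the inverse matrix is again the matrix of an isomorphism. A small numerical illustration (not a proof), using the isomorphism of Example \(\PageIndex{1}\):

    ```python
    import numpy as np

    # Matrix of the isomorphism T from Example 1; its inverse is the matrix of T^{-1}.
    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]])
    A_inv = np.linalg.inv(A)

    # T^{-1} is linear: T^{-1}(a u + b v) = a T^{-1}(u) + b T^{-1}(v).
    u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
    a, b = 2.0, -5.0
    print(np.allclose(A_inv @ (a * u + b * v),
                      a * (A_inv @ u) + b * (A_inv @ v)))   # True

    # T^{-1} is itself invertible, with inverse T.
    print(np.allclose(np.linalg.inv(A_inv), A))             # True
    ```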

    Another important result is that the composition of multiple isomorphisms is also an isomorphism.

    Proposition \(\PageIndex{2}\): Composition of Isomorphisms

    Let \(T:V\rightarrow W\) and \(S:W\rightarrow Z\) be isomorphisms where \(V,W,Z\) are subspaces of \(\mathbb{R}^n\). Then \(S\circ T\) defined by \(\left( S\circ T\right) \left( \vec{v} \right) = S\left( T\left( \vec{v} \right) \right)\) is also an isomorphism.

    Proof

    Suppose \(T:V\rightarrow W\) and \(S:W\rightarrow Z\) are isomorphisms. Why is \(S\circ T\) a linear map? For \(a,b\) scalars,

    \[\begin{aligned} \left( S\circ T\right) \left( a\vec{v}_{1}+b\vec{v}_{2}\right) &= S\left( T\left(a\vec{v}_{1}+b\vec{v}_{2}\right) \right) =S\left( aT(\vec{v}_{1})+bT(\vec{v}_{2})\right) \\ &=aS\left( T(\vec{v}_{1})\right) +bS\left( T(\vec{v}_{2})\right) = a\left( S\circ T\right) \left( \vec{v}_{1}\right) +b\left( S\circ T\right) \left( \vec{v}_{2}\right)\end{aligned}\nonumber\]

    Hence \(S\circ T\) is a linear map. If \(\left( S\circ T\right) \left( \vec{v} \right) =\vec{0},\) then \(S\left( T\left( \vec{v} \right) \right) =\vec{0}\). Since \(S\) is one to one, it follows that \(T(\vec{v})=\vec{0}\), and since \(T\) is one to one, \(\vec{v}=\vec{0}\). Thus \(S\circ T\) is one to one. It remains to verify that it is onto. Let \(\vec{z} \in Z\). Then since \(S\) is onto, there exists \(\vec{w} \in W\) such that \(S(\vec{w})=\vec{z}.\) Also, since \(T\) is onto, there exists \(\vec{v}\in V\) such that \(T(\vec{v})=\vec{w}.\) It follows that \(S\left( T\left( \vec{v}\right) \right) =\vec{z}\) and so \(S\circ T\) is also onto.
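
    In matrix terms, the proposition reflects the fact that a product of invertible matrices is invertible. A quick numerical illustration (random matrices stand in for arbitrary isomorphisms; the code is ours):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Matrices of two isomorphisms of R^3 (a random matrix is invertible
    # with probability 1, but we check the determinants anyway).
    T = rng.standard_normal((3, 3))
    S = rng.standard_normal((3, 3))
    assert np.linalg.det(T) != 0 and np.linalg.det(S) != 0

    # The matrix of S o T is the product S T, and det(S T) = det(S) det(T) != 0,
    # so the composition is again an isomorphism.
    print(np.isclose(np.linalg.det(S @ T),
                     np.linalg.det(S) * np.linalg.det(T)))  # True
    ```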

    Consider two subspaces \(V\) and \(W\), and suppose there exists an isomorphism mapping one to the other. In this way the two subspaces are related, which we can write as \(V \sim W\). Since the identity map is an isomorphism, the previous two propositions then show that \(\sim\) is an equivalence relation. That is, \(\sim\) satisfies the following conditions:

    • \(V\sim V\)
    • If \(V\sim W,\) it follows that \(W\sim V\)
    • If \(V\sim W\) and \(W\sim Z,\) then \(V\sim Z\)

    We leave the verification of these conditions as an exercise.

    Consider the following example.

    Example \(\PageIndex{2}\): Matrix Isomorphism

    Let \(T:\mathbb{R}^{n}\rightarrow \mathbb{R}^{n}\) be defined by \(T(\vec{x}) = A(\vec{x})\) where \(A\) is an invertible \(n\times n\) matrix. Then \(T\) is an isomorphism.

    Solution

    Note first that \(T\) is a linear transformation, since matrix multiplication is linear. Since \(A\) is invertible, the only vector it sends to \(\vec{0}\) is the zero vector. Hence if \(A(\vec{x})=A(\vec{y}),\) then \(A\left( \vec{x}-\vec{y}\right) =\vec{0}\) and so \(\vec{x}=\vec{y}\), which shows \(T\) is one to one. It is onto because if \(\vec{y}\in \mathbb{R}^{n}\), then

    \[A\left( A^{-1} (\vec{y})\right) =\left( AA^{-1}\right) (\vec{y}) =\vec{y}. \nonumber\]

    In fact, all isomorphisms from \(\mathbb{R}^{n}\) to \(\mathbb{R}^{n}\) can be expressed as \(T(\vec{x}) = A(\vec{x})\) where \(A\) is an invertible \(n \times n\) matrix. One simply considers the matrix whose \(i^{th}\) column is \(T\vec{e}_{i}\).
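
    This construction is easy to carry out in code: evaluate \(T\) on the standard basis and use the results as columns. A short sketch (names ours) using the map from Example \(\PageIndex{1}\):

    ```python
    import numpy as np

    # Recover the matrix of T from Example 1 by evaluating T on the standard
    # basis: the i-th column of A is T(e_i).
    def T(v):
        x, y = v
        return np.array([x + y, x - y])

    A = np.column_stack([T(e) for e in np.eye(2)])
    print(A)                           # [[ 1.  1.] [ 1. -1.]]
    print(np.linalg.det(A) != 0)       # True, so this T is an isomorphism
    ```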

    Recall that a basis of a subspace \(V\) is a set of linearly independent vectors which span \(V\). The following fundamental lemma describes the relation between bases and isomorphisms.

    Lemma \(\PageIndex{1}\): Mapping Bases

    Let \(T:V\rightarrow W\) be a linear transformation where \(V,W\) are subspaces of \(\mathbb{R}^n\). If \(T\) is one to one, then it has the property that if \(\left\{ \vec{u}_{1},\cdots ,\vec{u}_{k}\right\}\) is linearly independent, so is \(\left\{ T(\vec{u}_{1}),\cdots ,T(\vec{u}_{k})\right\}\).

    More generally, \(T\) is an isomorphism if and only if whenever \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V,\) it follows that \(\left\{ T (\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis for \(W\).

    Proof

    First suppose that \(T\) is a linear transformation and is one to one and \(\left\{ \vec{u}_{1},\cdots ,\vec{u}_{k}\right\}\) is linearly independent. It is required to show that \(\left\{ T(\vec{u}_{1}),\cdots ,T(\vec{u}_{k})\right\}\) is also linearly independent. Suppose then that \[\sum_{i=1}^{k}c_{i}T(\vec{u}_{i})=\vec{0}\nonumber\] Then, since \(T\) is linear, \[T\left( \sum_{i=1}^{k}c_{i}\vec{u}_{i}\right) =\vec{0}\nonumber\] Since \(T\) is one to one and \(T(\vec{0})=\vec{0}\), it follows that \[\sum_{i=1}^{k}c_{i}\vec{u}_{i}=\vec{0}\nonumber\] Now the fact that \(\left\{ \vec{u}_{1},\cdots ,\vec{u}_{k}\right\}\) is linearly independent implies that each \(c_{i}=0\). Hence \(\left\{ T(\vec{u}_{1}),\cdots ,T(\vec{u}_{k})\right\}\) is linearly independent.

    Now suppose that \(T\) is an isomorphism and \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V\). It was just shown that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is linearly independent. It remains to verify that \(\mathrm{span}\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}=W\). If \(\vec{w}\in W,\) then since \(T\) is onto there exists \(\vec{v}\in V\) such that \(T(\vec{v})=\vec{w}\). Since \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis, there exist scalars \(\left\{ c_{i}\right\} _{i=1}^{n}\) such that \[\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{v}.\nonumber \] Hence, \[\vec{w}=T(\vec{v})=T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})\nonumber \] It follows that \(\mathrm{span}\left\{ T(\vec{v}_{1}),\cdots , T(\vec{v}_{n})\right\} =W\), showing that this set of vectors is a basis for \(W\).

    Next suppose that \(T\) is a linear transformation which takes a basis to a basis. This means that if \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V,\) it follows \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis for \(W.\) Then if \(\vec{w}\in W,\) there exist scalars \(c_{i}\) such that \(\vec{w}=\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right),\) showing that \(T\) is onto. If \(T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\vec{0},\) then \(\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=\vec{0}\) and since the vectors \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) are linearly independent, it follows that each \(c_{i}=0.\) Since \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}\) is a typical vector in \(V\), this shows that if \(T(\vec{v})=\vec{0}\) then \(\vec{v}=\vec{0}\) and so \(T\) is also one to one. Thus \(T\) is an isomorphism.
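
    Linear independence of the image vectors can be tested with a rank computation. A minimal numerical check of the lemma for the isomorphism of Example \(\PageIndex{1}\), with an arbitrarily chosen basis (code and names ours):

    ```python
    import numpy as np

    # Matrix of the one to one map T from Example 1.
    A = np.array([[1.0,  1.0],
                  [1.0, -1.0]])

    # A basis of R^2, stored as the columns of B.
    B = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    print(np.linalg.matrix_rank(B))        # 2: the columns are independent

    # The images T(u_i) are the columns of A B; rank 2 means they are
    # again linearly independent, as the lemma predicts.
    print(np.linalg.matrix_rank(A @ B))    # 2
    ```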

    The following theorem illustrates a very useful idea for defining an isomorphism. Basically, if you know what it does to a basis, then you can construct the isomorphism.

    Theorem \(\PageIndex{1}\): Isomorphic Subspaces

    Suppose \(V\) and \(W\) are two subspaces of \(\mathbb{R}^n\). Then the two subspaces are isomorphic if and only if they have the same dimension. In the case that the two subspaces have the same dimension, the following are equivalent for a linear map \(T:V\rightarrow W\).

    1. \(T\) is one to one.
    2. \(T\) is onto.
    3. \(T\) is an isomorphism.
    Proof

    Suppose first that these two subspaces have the same dimension \(n\). Let a basis for \(V\) be \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) and let a basis for \(W\) be \(\left\{ \vec{w}_{1},\cdots ,\vec{w}_{n}\right\}\). Now define \(T\) on the basis by \[T(\vec{v}_{i})=\vec{w}_{i}\nonumber\] and extend linearly: for \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}\) an arbitrary vector of \(V,\) \[T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) = \sum_{i=1}^{n}c_{i}T (\vec{v}_{i})=\sum_{i=1}^{n}c_{i}\vec{w}_{i}.\nonumber\] It is necessary to verify that this is well defined. Suppose then that \[\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\sum_{i=1}^{n}\hat{c}_{i}\vec{v}_{i}\nonumber\] Then \[\sum_{i=1}^{n}\left( c_{i}-\hat{c}_{i}\right) \vec{v}_{i}=\vec{0}\nonumber\] and since \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis, \(c_{i}=\hat{c}_{i}\) for each \(i\). Hence \[\sum_{i=1}^{n}c_{i}\vec{w}_{i}=\sum_{i=1}^{n}\hat{c}_{i}\vec{w}_{i}\nonumber\] and so the mapping is well defined. Also, if \(a,b\) are scalars, \[\begin{aligned} T\left( a\sum_{i=1}^{n}c_{i}\vec{v}_{i}+b\sum_{i=1}^{n}\hat{c}_{i}\vec{v}_{i}\right) &=T\left( \sum_{i=1}^{n}\left( ac_{i}+b\hat{c}_{i}\right) \vec{v}_{i}\right) =\sum_{i=1}^{n}\left( ac_{i}+b\hat{c}_{i}\right) \vec{w}_{i} \\ &=a\sum_{i=1}^{n}c_{i}\vec{w}_{i}+b\sum_{i=1}^{n}\hat{c}_{i}\vec{w}_{i} \\ &=aT\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) +bT\left( \sum_{i=1}^{n} \hat{c}_{i}\vec{v}_{i}\right)\end{aligned}\] Thus \(T\) is a linear transformation.

    Now if \[T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\sum_{i=1}^{n}c_{i}\vec{w}_{i}=\vec{0},\nonumber \] then since the \(\left\{ \vec{w}_{1},\cdots ,\vec{w}_{n}\right\}\) are independent, each \(c_{i}=0\) and so \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{0}\) also. Hence \(T\) is one to one. If \(\sum_{i=1}^{n}c_{i}\vec{w}_{i}\) is a vector in \(W,\) then it equals \[\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right)\nonumber \] showing that \(T\) is also onto. Hence \(T\) is an isomorphism and so \(V\) and \(W\) are isomorphic.

    Next suppose \(T:V \rightarrow W\) is an isomorphism, so these two subspaces are isomorphic. Then for \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) a basis for \(V\), it follows from Lemma \(\PageIndex{1}\) that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis for \(W\), showing that the two subspaces have the same dimension.

    Now suppose the two subspaces have the same dimension. Consider the three claimed equivalences.

    First consider the claim that \(1.)\Rightarrow 2.).\) If \(T\) is one to one and if \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V,\) then \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is linearly independent by Lemma \(\PageIndex{1}\). If it is not a basis, then it must fail to span \(W\). But then there would exist \(\vec{w}\notin \mathrm{span} \left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\), and it follows that \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n}),\vec{w} \right\}\) would be linearly independent, which is impossible because \(W\) has a basis of \(n\) vectors.

    Hence \(\mathrm{span}\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\} =W\) and so \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is a basis. If \(\vec{w}\in W,\) there exist scalars \(c_{i}\) such that \[\vec{w}=\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=T\left( \sum_{i=1}^{n}c_{i}\vec{v} _{i}\right)\nonumber \] showing that \(T\) is onto. This shows that \(1.)\Rightarrow 2.).\)

    Next consider the claim that \(2.)\Rightarrow 3.).\) Since \(2.)\) holds, \(T\) is onto. It remains to verify that \(T\) is one to one. Since \(T\) is onto, there exists a basis of \(W\) of the form \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\); indeed, each vector in a basis of \(W\) can be written as \(T(\vec{v}_{i})\) for some \(\vec{v}_{i}\in V\). Then it follows that \(\left\{ \vec{v}_{1},\cdots , \vec{v}_{n}\right\}\) is linearly independent. Suppose \[\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{0}\nonumber\] Then \[\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=\vec{0}\nonumber\] Hence each \(c_{i}=0\) and so \(\left\{ \vec{v}_{1},\cdots ,\vec{v}_{n}\right\}\) is a basis for \(V\), since \(V\) has dimension \(n\). Now a typical vector in \(V\) is of the form \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}\). If \(T\left( \sum_{i=1}^{n}c_{i}\vec{v}_{i}\right) =\vec{0},\) it follows that \[\sum_{i=1}^{n}c_{i}T(\vec{v}_{i})=\vec{0}\nonumber\] and so, since \(\left\{ T(\vec{v}_{1}),\cdots ,T(\vec{v}_{n})\right\}\) is independent, each \(c_{i}=0\) and hence \(\sum_{i=1}^{n}c_{i}\vec{v}_{i}=\vec{0}\). Thus \(T\) is one to one as well as onto, and so it is an isomorphism.

    If \(T\) is an isomorphism, it is both one to one and onto by definition so \(3.)\) implies both \(1.)\) and \(2.)\).

    Note the interesting way of defining a linear transformation in the first part of the argument by describing what it does to a basis and then “extending it linearly” to the entire subspace.

    Example \(\PageIndex{4}\): Isomorphic Subspaces

    Let \(V=\mathbb{R}^{3}\) and let \(W\) denote \[\mathrm{span}\left\{ \left [ \begin{array}{r} 1 \\ 2 \\ 1 \\ 1 \end{array} \right ] ,\left [ \begin{array}{r} 0 \\ 1 \\ 0 \\ 1 \end{array} \right ] ,\left [ \begin{array}{r} 1 \\ 1 \\ 2 \\ 0 \end{array} \right ] \right\}\nonumber \] Show that \(V\) and \(W\) are isomorphic.

    Solution

    First observe that these subspaces are both of dimension 3 and so they are isomorphic by Theorem \(\PageIndex{1}\). The three vectors which span \(W\) are easily seen to be linearly independent by making them the columns of a matrix and row reducing to the reduced row-echelon form.

    You can exhibit an isomorphism of these two spaces as follows. \[T(\vec{e}_{1})=\left [ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right ], T(\vec{e}_{2})=\left [ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right ], T(\vec{e}_{3})=\left [ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right ]\nonumber \] and extend linearly. Recall that the matrix of this linear transformation is just the matrix having these vectors as columns. Thus the matrix of this isomorphism is \[\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 2 & 1 & 1 \\ 1 & 0 & 2 \\ 1 & 1 & 0 \end{array} \right ]\nonumber \] You should check that multiplication on the left by this matrix reproduces the claimed action of \(T\) on the standard basis vectors.
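
    The suggested check amounts to two small matrix computations, sketched here in NumPy (code ours):

    ```python
    import numpy as np

    # Columns are the chosen images T(e_1), T(e_2), T(e_3).
    A = np.array([[1.0, 0.0, 1.0],
                  [2.0, 1.0, 1.0],
                  [1.0, 0.0, 2.0],
                  [1.0, 1.0, 0.0]])

    # Multiplication on the left by A reproduces T on the standard basis ...
    print(A @ np.array([1.0, 0.0, 0.0]))   # [1. 2. 1. 1.], i.e. T(e_1)

    # ... and rank 3 confirms the three spanning vectors of W are independent,
    # so T carries the standard basis of R^3 to a basis of W.
    print(np.linalg.matrix_rank(A))        # 3
    ```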

    Consider the following example.

    Example \(\PageIndex{5}\): Finding the Matrix of an Isomorphism

    Let \(V=\mathbb{R}^{3}\) and let \(W\) denote

    \[\mathrm{span}\left\{ \left [ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right ] ,\left [ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right ] ,\left [ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right ] \right\}\nonumber \]

    Let \(T: V \rightarrow W\) be defined as follows. \[T\left [ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right ] =\left [ \begin{array}{c} 1 \\ 2 \\ 1 \\ 1 \end{array} \right ] ,T\left [ \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \right ] =\left [ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right ] ,T\left [ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right ] =\left [ \begin{array}{c} 1 \\ 1 \\ 2 \\ 0 \end{array} \right ]\nonumber \] Find the matrix of this isomorphism \(T\).

    Solution

    First note that the vectors \[\left [ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right ] ,\left [ \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \right ] ,\left [ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right ]\nonumber \] are indeed a basis for \(\mathbb{R}^{3}\) as can be seen by making them the columns of a matrix and using the reduced row-echelon form.

    Now recall the matrix of \(T\) is a \(4\times 3\) matrix \(A\) which gives the same effect as \(T.\) Thus, from the way we multiply matrices, \[A\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right ] =\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 2 & 1 & 1 \\ 1 & 0 & 2 \\ 1 & 1 & 0 \end{array} \right ]\nonumber \] Hence, \[A=\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 2 & 1 & 1 \\ 1 & 0 & 2 \\ 1 & 1 & 0 \end{array} \right ] \left [ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right ] ^{-1}=\left [ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 2 & -1 \\ 2 & -1 & 1 \\ -1 & 2 & -1 \end{array} \right ]\nonumber \]

    Note how the span of the columns of this new matrix must be the same as the span of the vectors defining \(W\).
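
    The computation \(A = MB^{-1}\), where \(B\) holds the given basis as columns and \(M\) holds the prescribed images, is easily reproduced numerically. A minimal sketch (the names \(B\) and \(M\) are ours):

    ```python
    import numpy as np

    # Columns: the basis of R^3 on which T is specified.
    B = np.array([[1.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0],
                  [0.0, 1.0, 1.0]])

    # Columns: the prescribed images in R^4.
    M = np.array([[1.0, 0.0, 1.0],
                  [2.0, 1.0, 1.0],
                  [1.0, 0.0, 2.0],
                  [1.0, 1.0, 0.0]])

    # A must satisfy A B = M, so A = M B^{-1}.
    A = M @ np.linalg.inv(B)
    print(np.round(A).astype(int))   # matches the 4 x 3 matrix computed above
    ```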

    This idea of defining a linear transformation by what it does on a basis works for linear maps which are not necessarily isomorphisms.

    Example \(\PageIndex{6}\): Finding the Matrix of a Linear Transformation

    Let \(V=\mathbb{R}^{3}\) and let \(W\) denote \[\mathrm{span}\left\{ \left [ \begin{array}{c} 1 \\ 0 \\ 1 \\ 1 \end{array} \right ] ,\left [ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right ] ,\left [ \begin{array}{c} 1 \\ 1 \\ 1 \\ 2 \end{array} \right ] \right\}\nonumber \] Let \(T: V \rightarrow W\) be defined as follows. \[T\left [ \begin{array}{c} 1 \\ 1 \\ 0 \end{array} \right ] = \left [ \begin{array}{c} 1 \\ 0 \\ 1 \\ 1 \end{array} \right ] ,T\left [ \begin{array}{c} 0 \\ 1 \\ 1 \end{array} \right ] =\left [ \begin{array}{c} 0 \\ 1 \\ 0 \\ 1 \end{array} \right ] ,T\left [ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right ] =\left [ \begin{array}{c} 1 \\ 1 \\ 1 \\ 2 \end{array} \right ]\nonumber \] Find the matrix of this linear transformation.

    Solution

    Note that in this case, the three vectors which span \(W\) are not linearly independent. Nevertheless the above procedure will still work. The reasoning is the same as before. If \(A\) is this matrix, then \[A\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right ] =\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 2 \end{array} \right ]\nonumber \] and so \[A=\left [ \begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 2 \end{array} \right ] \left [ \begin{array}{rrr} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{array} \right ] ^{-1}=\left [ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \\ 1 & 0 & 1 \end{array} \right ]\nonumber \]

    The columns of this last matrix are obviously not linearly independent: the second column is the zero vector. This reflects the fact that this \(T\) is not one to one, and hence not an isomorphism.
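
    The same computation as in the previous example confirms the rank deficiency (code ours):

    ```python
    import numpy as np

    B = np.array([[1.0, 0.0, 1.0],      # basis of R^3, as columns
                  [1.0, 1.0, 1.0],
                  [0.0, 1.0, 1.0]])
    M = np.array([[1.0, 0.0, 1.0],      # prescribed (dependent) images, as columns
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0],
                  [1.0, 1.0, 2.0]])

    A = M @ np.linalg.inv(B)
    print(np.round(A).astype(int))      # second column is zero
    print(np.linalg.matrix_rank(A))     # 2 < 3, so this T is not one to one
    ```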

