
2.9: More on Matrix Inverses


In this section, we will prove three theorems which clarify the concept of matrix inverses. In order to do this, we first recall some important properties of elementary matrices.

    Recall that an elementary matrix is a square matrix obtained by performing an elementary operation on an identity matrix. Each elementary matrix is invertible, and its inverse is also an elementary matrix. If \(E\) is an \(m \times m\) elementary matrix and \(A\) is an \(m \times n\) matrix, then the product \(EA\) is the result of applying to \(A\) the same elementary row operation that was applied to the \(m \times m\) identity matrix in order to obtain \(E\).

Let \(R\) be the reduced row-echelon form of an \(m \times n\) matrix \(A\). Then \(R\) is obtained by applying a sequence of elementary row operations to \(A\). Denote by \(E_1, E_2, \cdots, E_k\) the elementary matrices associated with the elementary row operations which were applied, in order, to \(A\) to obtain \(R\). We then have that \(R = \left( E_k \cdots \left( E_2 \left( E_1A \right) \right)\right) = E_k \cdots E_2E_1A\). Let \(E\) denote the product matrix \(E_k \cdots E_2E_1\), so that we can write \(R=EA\), where \(E\) is an invertible matrix whose inverse is the product \((E_1)^{-1}(E_2)^{-1} \cdots (E_k)^{-1}\).
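To make this concrete, here is a minimal computational sketch (using SymPy; the matrix \(A\) and the row operations below are hypothetical choices made for illustration) that builds each elementary matrix by applying a row operation to an identity matrix, and then verifies that \(R = EA\).

```python
# A minimal sketch, assuming SymPy is available; the matrix A and the
# row operations below are hypothetical choices for illustration.
from sympy import Matrix, eye

A = Matrix([[1, 2, 3],
            [2, 4, 7]])

# Build each elementary matrix by applying a row operation to the identity.
E1 = eye(2)
E1[1, :] = E1[1, :] - 2 * E1[0, :]     # R2 <- R2 - 2*R1
E2 = eye(2)
E2[0, :] = E2[0, :] - 3 * E2[1, :]     # R1 <- R1 - 3*R2

E = E2 * E1                            # E = E_k ... E_2 E_1 (here k = 2)
R = E * A
assert R == A.rref()[0]                # R = EA is the reduced row-echelon form
assert E.inv() == E1.inv() * E2.inv()  # E^{-1} = (E_1)^{-1}(E_2)^{-1}
```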

    Now, we will consider some preliminary lemmas.

    Lemma \(\PageIndex{1}\): Invertible Matrix and Zeros

    Suppose that \(A\) and \(B\) are matrices such that the product \(AB\) is an identity matrix. Then the reduced row-echelon form of \(A\) does not have a row of zeros.

    Proof

Let \(R\) be the reduced row-echelon form of \(A\). Then \(R=EA\) for some invertible square matrix \(E\) as described above. By hypothesis \(AB=I\) where \(I\) is an identity matrix, so we have a chain of equalities \[R(BE^{-1}) = (EA)(BE^{-1}) = E(AB)E^{-1} = EIE^{-1} = EE^{-1} = I\nonumber \] If \(R\) had a row of zeros, then so would the product \(R(BE^{-1})\). But since the identity matrix \(I\) does not have a row of zeros, neither can \(R\).
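The lemma can be checked on a small example. The following sketch (assuming SymPy; the matrices \(A\) and \(B\) are hypothetical choices for illustration) confirms that \(AB=I\) and that the reduced row-echelon form of \(A\) has no row of zeros.

```python
# A small check of the lemma, assuming SymPy; A and B are hypothetical.
from sympy import Matrix, eye, zeros

A = Matrix([[1, 0, 2],
            [0, 1, 5]])      # 2 x 3
B = Matrix([[1, 0],
            [0, 1],
            [0, 0]])         # 3 x 2, a right inverse of A
assert A * B == eye(2)       # AB is the 2 x 2 identity matrix

R = A.rref()[0]              # here R happens to equal A itself
assert all(R.row(i) != zeros(1, 3) for i in range(R.rows))  # no row of zeros
```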

    We now consider a second important lemma.

    Lemma \(\PageIndex{2}\): Size of Invertible Matrix

    Suppose that \(A\) and \(B\) are matrices such that the product \(AB\) is an identity matrix. Then \(A\) has at least as many columns as it has rows.

    Proof

    Let \(R\) be the reduced row-echelon form of \(A\). By Lemma \(\PageIndex{1}\), we know that \(R\) does not have a row of zeros, and therefore each row of \(R\) has a leading \(1\). Since each column of \(R\) contains at most one of these leading \(1\)s, \(R\) must have at least as many columns as it has rows.

    An important theorem follows from this lemma.

    Theorem \(\PageIndex{1}\): Invertible Matrices are Square

    Only square matrices can be invertible.

    Proof

    Suppose that \(A\) and \(B\) are matrices such that both products \(AB\) and \(BA\) are identity matrices. We will show that \(A\) and \(B\) must be square matrices of the same size. Let the matrix \(A\) have \(m\) rows and \(n\) columns, so that \(A\) is an \(m \times n\) matrix. Since the product \(AB\) exists, \(B\) must have \(n\) rows, and since the product \(BA\) exists, \(B\) must have \(m\) columns so that \(B\) is an \(n \times m\) matrix. To finish the proof, we need only verify that \(m=n\).

    We first apply Lemma \(\PageIndex{2}\) with \(A\) and \(B\), to obtain the inequality \(m \leq n\). We then apply Lemma \(\PageIndex{2}\) again (switching the order of the matrices), to obtain the inequality \(n \leq m\). It follows that \(m=n\), as we wanted.

Of course, not all square matrices are invertible. In particular, zero matrices are not invertible, and many other square matrices fail to be invertible as well.

    The following proposition will be useful in proving the next theorem.

    Proposition \(\PageIndex{1}\): Reduced Row-Echelon Form of a Square Matrix

    If \(R\) is the reduced row-echelon form of a square matrix, then either \(R\) has a row of zeros or \(R\) is an identity matrix.

    The proof of this proposition is left as an exercise to the reader. We now consider the second important theorem of this section.

    Theorem \(\PageIndex{2}\): Unique Inverse of a Matrix

    Suppose \(A\) and \(B\) are square matrices such that \(AB=I\) where \(I\) is an identity matrix. Then it follows that \(BA=I\). Further, both \(A\) and \(B\) are invertible and \(B=A^{-1}\) and \(A=B^{-1}\).

    Proof

Let \(R\) be the reduced row-echelon form of the square matrix \(A\). Then, \(R=EA\) where \(E\) is an invertible matrix. Since \(AB=I\), Lemma \(\PageIndex{1}\) gives us that \(R\) does not have a row of zeros. By noting that \(R\) is a square matrix and applying Proposition \(\PageIndex{1}\), we see that \(R=I\). Hence, \(EA=I\).

    Using both that \(EA=I\) and \(AB=I\), we can finish the proof with a chain of equalities as given by \[\begin{aligned} BA = IBIA &= (EA)B(E^{-1}E)A \\ &= E(AB)E^{-1}(EA) \\ &= EIE^{-1}I \\ &= EE^{-1} = I\end{aligned}\]

    It follows from the definition of the inverse of a matrix that \(B=A^{-1}\) and \(A=B^{-1}\).

This theorem is very useful, since with it we need only test one of the products \(AB\) or \(BA\) in order to check that \(B\) is the inverse of \(A\). The hypothesis that \(A\) and \(B\) are square matrices is essential; without it, the theorem does not hold.
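As a quick illustration of this convenience, the following sketch (using NumPy; the matrices are hypothetical examples) tests only the product \(AB\); the theorem then guarantees that \(BA\) is also the identity.

```python
# A sketch of the one-sided check, assuming NumPy; A and B are hypothetical.
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
B = np.array([[ 3.0, -1.0],
              [-5.0,  2.0]])            # candidate inverse of A

assert np.allclose(A @ B, np.eye(2))    # the one product we need to test
assert np.allclose(B @ A, np.eye(2))    # guaranteed by the theorem above
```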

    We will now consider an example.

    Example \(\PageIndex{1}\): Non Square Matrices

Let \[A = \left[ \begin{array}{rr} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{array} \right].\nonumber \] Show that \(A^{T}A = I\) but \(AA^{T} \neq I\).

    Solution

    Consider the product \(A^{T}A\) given by \[\left[ \begin{array}{rrr} 1 & 0 & 0\\ 0 & 1 & 0 \end{array} \right] \left[ \begin{array}{rr} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{array} \right] = \left[ \begin{array}{rr} 1 & 0 \\ 0 & 1 \end{array} \right]\] Therefore, \(A^{T}A = I_2\), where \(I_2\) is the \(2 \times 2\) identity matrix. However, the product \(AA^{T}\) is \[\left[ \begin{array}{rr} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{array} \right] \left[ \begin{array}{rrr} 1 & 0 & 0\\ 0 & 1 & 0 \end{array} \right] = \left[ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{array} \right]\] Hence \(AA^{T}\) is not the \(3 \times 3\) identity matrix. This shows that for Theorem \(\PageIndex{2}\), it is essential that both matrices be square and of the same size.
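The computation above can also be confirmed numerically; a minimal sketch, assuming NumPy is available:

```python
# A quick numerical confirmation of the example above, assuming NumPy.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

print(A.T @ A)    # the 2 x 2 identity matrix
print(A @ A.T)    # a 3 x 3 matrix with a row of zeros: not the identity
```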

Is it possible to have matrices \(A\) and \(B\) such that \(AB=I\) while \(BA=0\)? This question is left for you to answer; take a moment to consider why or why not.

    We conclude this section with an important theorem.

Theorem \(\PageIndex{3}\): The Reduced Row-Echelon Form of an Invertible Matrix

    For any matrix \(A\) the following conditions are equivalent:

    • \(A\) is invertible
    • The reduced row-echelon form of \(A\) is an identity matrix
    Proof

In order to prove this, we show that for any given matrix \(A\), each condition implies the other. We first show that if \(A\) is invertible, then its reduced row-echelon form is an identity matrix; we then show that if the reduced row-echelon form of \(A\) is an identity matrix, then \(A\) is invertible.

If \(A\) is invertible, there is some matrix \(B\) such that \(AB = I\). By Lemma \(\PageIndex{1}\), we get that the reduced row-echelon form of \(A\) does not have a row of zeros. Then by Theorem \(\PageIndex{1}\), it follows that \(A\) and the reduced row-echelon form of \(A\) are square matrices. Finally, by Proposition \(\PageIndex{1}\), this reduced row-echelon form of \(A\) must be an identity matrix. This proves the first implication.

    Now suppose the reduced row-echelon form of \(A\) is an identity matrix \(I\). Then \(I=EA\) for some product \(E\) of elementary matrices. By Theorem \(\PageIndex{2}\), we can conclude that \(A\) is invertible.

    Theorem \(\PageIndex{3}\) corresponds to Algorithm 2.7.1, which claims that \(A^{-1}\) is found by row reducing the augmented matrix \(\left[ A|I\right]\) to the form \(\left[ I|A^{-1}\right]\). This will be a matrix product \(E\left[ A|I\right]\) where \(E\) is a product of elementary matrices. By the rules of matrix multiplication, we have that \(E\left[ A|I\right] = \left[ EA|EI\right] = \left[ EA|E\right]\).

    It follows that the reduced row-echelon form of \(\left[ A|I\right]\) is \(\left[ EA|E\right]\), where \(EA\) gives the reduced row-echelon form of \(A\). By Theorem \(\PageIndex{3}\), if \(EA \neq I\), then \(A\) is not invertible, and if \(EA=I\), \(A\) is invertible. If \(EA=I\), then by Theorem \(\PageIndex{2}\), \(E=A^{-1}\). This proves that Algorithm 2.7.1 does in fact find \(A^{-1}\).
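The following sketch expresses this argument as a short procedure (using SymPy; the helper name invert_by_rref and the example matrix are hypothetical): row reduce \(\left[ A|I\right]\) and read off either \(A^{-1}\) or the conclusion that \(A\) is not invertible.

```python
# A minimal sketch of the row-reduction algorithm, assuming SymPy;
# invert_by_rref is a hypothetical helper name, not a library function.
from sympy import Matrix, eye

def invert_by_rref(A):
    n = A.rows
    R = A.row_join(eye(n)).rref()[0]   # row reduce [A | I] to [EA | E]
    EA, E = R[:, :n], R[:, n:]
    if EA != eye(n):                   # EA != I, so A is not invertible
        return None
    return E                           # EA = I, so E = A^{-1}

A = Matrix([[1, 2],
            [3, 7]])                   # a hypothetical invertible example
assert invert_by_rref(A) * A == eye(2)
```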


    This page titled 2.9: More on Matrix Inverses is shared under a CC BY 4.0 license and was authored, remixed, and/or curated by Ken Kuttler (Lyryx) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.
