
# A.5 Special operations on matrices


In this section, we define three important operations on matrices called the transpose, conjugate transpose, and the trace. These will then be seen to interact with matrix multiplication and invertibility in order to form special classes of matrices that are extremely important to applications of Linear Algebra.

### A.5.1 Transpose and conjugate transpose

Given positive integers $$m, n \in \mathbb{Z}_+$$ and any matrix $$A = (a_{ij} ) \in \mathbb{F}^{m \times n} ,$$ we define the transpose $$A^T = ((a^T )_{ij} ) \in \mathbb{F}^{n \times m}$$ and the conjugate transpose $$A^{\ast} = ((a^{\ast} )_{ij} ) \in \mathbb{F}^{n \times m}$$ by

$(a^T )_{ij} = a_{ji} \rm{~and~} (a^{\ast} )_{ij} = \overline{a_{ji}} ,$

where $$\overline{a_{ji}}$$ denotes the complex conjugate of the scalar $$a_{ji} \in \mathbb{F}.$$ In particular, if $$A \in \mathbb{R}^{m \times n}$$, then note that $$A^T = A^{\ast} .$$
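As a computational sketch (the text itself is language-agnostic; NumPy and the sample matrix below are our own assumptions), the transpose swaps the roles of rows and columns, while the conjugate transpose additionally conjugates each entry:

```python
import numpy as np

# A hypothetical 2x3 complex matrix, not taken from the text's examples.
A = np.array([[1 + 2j, 3, 0],
              [4, -1j, 2]])

A_T = A.T             # transpose: (A^T)_{ij} = a_{ji}
A_star = A.conj().T   # conjugate transpose: (A^*)_{ij} = conjugate of a_{ji}

print(A_T.shape)      # (3, 2) -- an m x n matrix becomes n x m
print(A_star[0, 0])   # (1-2j) -- the entry a_{11} = 1+2j, conjugated

# For a real matrix, the transpose and conjugate transpose coincide.
B = np.array([[1.0, 2.0], [3.0, 4.0]])
assert np.array_equal(B.T, B.conj().T)
```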

Example $$\PageIndex{1}$$:

With notation as in Example A.1.3,

$A^T= \left[ \begin{array}{ccc} 3 & -1 & 1 \end{array} \right],B^T=\left[ \begin{array}{cc} 4 & 0 \\ -1 & 2 \end{array} \right], C^T = \left[ \begin{array}{c} 1 \\ 4 \\ 2 \end{array} \right],\\ D^T= \left[ \begin{array}{ccc} 1 & -1 & 3 \\ 5 & 0 & 2 \\ 2 & 1 & 4 \end{array} \right], E^T = \left[ \begin{array}{ccc} 6 & -1 & 4 \\ 1 & 1 & 1 \\3 & 2 & 3 \end{array} \right].$

One of the motivations for defining the operations of transpose and conjugate transpose is that they interact with the usual arithmetic operations on matrices in a natural manner. We summarize the most fundamental of these interactions in the following theorem.

Theorem A.5.2. Given positive integers $$m, n \in \mathbb{Z}_+$$ and any matrices $$A, B \in \mathbb{F}^{m \times n} ,$$

1. $$(A^T )^T = A {\it{~and~}} (A^{\ast} )^{\ast} = A.$$
2. $$(A + B)^T = A^T + B^T {\it{~and~}} (A + B)^{\ast} = A^{\ast} + B^{\ast} .$$
3. $$(\alpha A)^T = \alpha A^T {\it{~and~}} (\alpha A)^{\ast} = \overline{\alpha} A^{\ast} ,$$ where $$\alpha \in \mathbb{F}$$ is any scalar and $$\overline{\alpha}$$ denotes its complex conjugate.
4. $$(AB)^T = B^T A^T {\it{~and~}} (AB)^{\ast} = B^{\ast} A^{\ast} ,$$ whenever the product $$AB$$ is defined.
5. If $$m = n$$ and $$A \in GL(n, \mathbb{F})$$, then $$A^T , A^* \in GL(n, \mathbb{F})$$ with respective inverses given by

$(A^T )^{-1} = (A^{-1} )^T {\it{~and~}} (A^* )^{-1} = (A^{-1} )^* .$
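These identities can be spot-checked numerically. The sketch below (NumPy, with randomly generated matrices; both are our assumptions, not the text's) verifies the double-transpose, scalar, product, and inverse properties from the theorem:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random complex 3x3 matrices; such matrices are invertible with probability 1.
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
alpha = 2 - 1j

star = lambda M: M.conj().T  # conjugate transpose

# 1. Applying the (conjugate) transpose twice returns the original matrix.
assert np.allclose(star(star(A)), A)
assert np.allclose(A.T.T, A)
# 3. The conjugate transpose conjugates scalars: (alpha A)^* = conj(alpha) A^*.
assert np.allclose(star(alpha * A), np.conj(alpha) * star(A))
# 4. The transpose reverses products: (AB)^T = B^T A^T.
assert np.allclose((A @ B).T, B.T @ A.T)
# 5. The inverse commutes with the transpose: (A^T)^{-1} = (A^{-1})^T.
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
```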

Another motivation for defining the transpose and conjugate transpose operations is that they allow us to define several very special classes of matrices.

Definition A.5.3. Given a positive integer $$n \in \mathbb{Z}_+$$ , we say that the square matrix $$A \in \mathbb{F}^{n \times n}$$

1. is symmetric if $$A = A^T .$$
2. is Hermitian if $$A = A^* .$$
3. is orthogonal if $$A \in GL(n, \mathbb{R})$$ and $$A^{-1} = A^T .$$ Moreover, we define the (real) orthogonal group to be the set $$O(n) = \{A \in GL(n, \mathbb{R})~ |~ A^{-1} = A^T \}.$$
4. is unitary if $$A \in GL(n, \mathbb{C})$$ and $$A^{-1} = A^*$$. Moreover, we define the (complex) unitary group to be the set $$U(n) = \{A \in GL(n, \mathbb{C}) ~|~ A^{-1} = A^* \}.$$

A lot can be said about these classes of matrices. Both $$O(n)$$ and $$U(n)$$, for example, form a group under matrix multiplication. Additionally, real symmetric and complex Hermitian matrices always have real eigenvalues. Moreover, given any matrix $$A \in \mathbb{R}^{m \times n}$$, $$AA^T$$ is a symmetric matrix with real, non-negative eigenvalues. Similarly, for $$A \in \mathbb{C}^{m \times n}$$, $$AA^*$$ is Hermitian with real, non-negative eigenvalues.
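A minimal numerical sketch of these classes (using NumPy; the specific rotation matrix, scaled Hadamard matrix, and sample matrix $$A$$ below are our own illustrative choices, not from the text):

```python
import numpy as np

# A 2x2 rotation matrix is a standard example of a real orthogonal matrix:
# Q^T Q = I, so Q^{-1} = Q^T and Q belongs to O(2).
theta = np.pi / 5
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))

# Scaling the 2x2 Hadamard matrix by 1/sqrt(2) and a global complex phase
# gives a unitary example: U^* U = I, so U belongs to U(2).
U = (np.array([[1, 1], [1, -1]]) / np.sqrt(2)) * np.exp(1j * np.pi / 7)
assert np.allclose(U.conj().T @ U, np.eye(2))

# For any real A, the product A A^T is symmetric with non-negative eigenvalues.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])
S = A @ A.T
assert np.allclose(S, S.T)                        # symmetric
assert np.all(np.linalg.eigvalsh(S) >= -1e-12)    # eigenvalues >= 0
```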

### A.5.2 The trace of a square matrix

Given a positive integer $$n \in \mathbb{Z}_+$$ and any square matrix $$A = (a_{ij} ) \in \mathbb{F}^{n \times n} ,$$ we define the trace of $$A$$ to be the scalar

$trace(A) = \sum_{k=1}^n a_{kk} \in \mathbb{F}.$

Example $$\PageIndex{2}$$

With notation as in Example A.1.3 above,

$trace(B) = 4 + 2 = 6, trace(D) = 1 + 0 + 4 = 5, {\it{~and~}} trace(E) = 6 + 1 + 3 = 10.$

Note, in particular, that the traces of $$A$$ and $$C$$ are not defined since these are not square matrices.
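The trace computations above can be checked directly; in the NumPy sketch below (the library choice is our assumption), $$B$$, $$D$$, and $$E$$ are recovered by transposing the matrices $$B^T$$, $$D^T$$, and $$E^T$$ listed in the earlier example:

```python
import numpy as np

# Recover B, D, E from the transposes given in the transpose example above.
B = np.array([[4, 0], [-1, 2]]).T
D = np.array([[1, -1, 3], [5, 0, 2], [2, 1, 4]]).T
E = np.array([[6, -1, 4], [1, 1, 1], [3, 2, 3]]).T

# The trace sums the diagonal entries.
print(np.trace(B))  # 6
print(np.trace(D))  # 5
print(np.trace(E))  # 10
```

(Since the trace only involves diagonal entries, which are unchanged by transposition, $$trace(B^T) = trace(B)$$, and similarly for $$D$$ and $$E$$.)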

We summarize some of the most basic properties of the trace operation in the following theorem, including its connection to the transpose operations defined in the previous section.

Theorem A.5.5. Given positive integers $$m, n \in \mathbb{Z}_+$$ and square matrices $$A, B \in \mathbb{F}^{n \times n} ,$$

1. $$trace(\alpha A) = \alpha trace(A),$$ for any scalar $$\alpha \in \mathbb{F}.$$
2. $$trace(A + B) = trace(A) + trace(B).$$
3. $$trace(A^T ) = trace(A) {\it{~and~}} trace(A^* ) = \overline{trace(A)}.$$
4. $$trace(AA^* ) = \sum_{k=1}^{n} \sum_{l=1}^{n} |a_{kl} |^2$$. In particular, $$trace(AA^* ) = 0$$ if and only if $$A = 0^{n \times n}$$ .
5. $$trace(AB) = trace(BA).$$ More generally, given matrices $$A_1 , \ldots , A_m \in \mathbb{F}^{n \times n} ,$$ the trace operation has the so-called cyclic property, meaning that $trace(A_1 \cdots A_m ) = trace(A_2 \cdots A_m A_1 ) = \cdots = trace(A_m A_1 \cdots A_{m-1} ).$

Moreover, if we define a linear map $$T : \mathbb{F}^n \rightarrow \mathbb{F}^n$$ by setting $$T (v) = Av$$ for each $$v \in \mathbb{F}^n$$ and if $$T$$ has distinct eigenvalues $$\lambda_1 , \ldots , \lambda_n$$, then $$trace(A) = \sum_{k=1}^n \lambda_{k} .$$
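The trace properties above, including the cyclic property and the eigenvalue sum, can be verified numerically. The sketch below uses NumPy with randomly generated complex matrices (both our own assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
C = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))

# 4. trace(A A^*) equals the sum of squared absolute values of the entries.
assert np.isclose(np.trace(A @ A.conj().T), np.sum(np.abs(A) ** 2))

# 5. Cyclic property: trace(ABC) = trace(BCA) = trace(CAB).
t = np.trace(A @ B @ C)
assert np.isclose(t, np.trace(B @ C @ A))
assert np.isclose(t, np.trace(C @ A @ B))

# The trace equals the sum of the eigenvalues.
assert np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A)))
```

Note that the cyclic property permits only cyclic rotations of the factors, not arbitrary permutations: in general $$trace(ABC) \neq trace(BAC).$$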

### Contributors

Both hardbound and softbound versions of this textbook are available online at WorldScientific.com.