3: Constrained Extrema of Quadratic Forms
In this section it is convenient to write \mathbf{X}= \left[\begin{array}{c} x_{1}\\x_{2}\\\vdots\\x_{n} \end{array}\right]. \nonumber
An eigenvalue of a square matrix \mathbf{A} = [a_{ij}]_{i,j=1}^{n} is a number \lambda such that the system
\mathbf{A}\mathbf{X} = \lambda \mathbf{X}, \nonumber or, equivalently, (\mathbf{A}-\lambda \mathbf{I})\mathbf{X}=\mathbf{0}, \nonumber
has a solution \mathbf{X} \ne \mathbf{0}. Such a solution is called an eigenvector of \mathbf{A}. You probably know from linear algebra that \lambda is an eigenvalue of \mathbf{A} if and only if
\det(\mathbf{A} -\lambda \mathbf{I}) = {0}. \nonumber
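As a concrete illustration of this characterization (a minimal sketch, not part of the text's derivation), for a hypothetical 2×2 matrix the equation \det(\mathbf{A}-\lambda\mathbf{I})=0 is the quadratic \lambda^{2}-\operatorname{tr}(\mathbf{A})\lambda+\det(\mathbf{A})=0, which can be solved directly:

```python
import math

# Illustrative example matrix (our choice): A = [[2, 1], [1, 2]].
# For 2x2 matrices, det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A).
a, b, c, d = 2.0, 1.0, 1.0, 2.0
tr, det = a + d, a * d - b * c

# Quadratic formula; for symmetric A the discriminant is nonnegative.
disc = math.sqrt(tr * tr - 4 * det)
eigenvalues = ((tr + disc) / 2, (tr - disc) / 2)
print(eigenvalues)  # (3.0, 1.0)
```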
Henceforth we assume that \mathbf{A} is symmetric (a_{ij} = a_{ji}, 1 \le i, j \le n). In this case,
\det(\mathbf{A}-\lambda \mathbf{I}) = (-1)^{n}(\lambda-\lambda_{1})(\lambda-\lambda_2) \cdots (\lambda-\lambda_{n}), \nonumber
where \lambda_{1},\lambda_2,\dots,\lambda_{n} are real numbers.
The function Q(\mathbf{X}) = \sum^{n}_{i,j=1} a_{ij} x_{i}x_{j} \nonumber is a quadratic form. To find its maximum or minimum subject to \displaystyle{\sum^{n}_{i=1} x^{2}_{i}=1}, we form the Lagrangian
L=Q(\mathbf{X}) - \lambda \sum^{n}_{i=1}x^{2}_{i}. \nonumber
Then
L_{x_{i}}= 2 \sum^{n}_{j=1} a_{ij}x_{j} - 2\lambda x_{i}=0, \quad 1 \le i \le n, \nonumber
so, at a constrained critical point \mathbf{X}_{0} with components x_{10}, x_{20}, \dots, x_{n0},
\sum_{j=1}^{n}a_{ij}x_{j0}=\lambda x_{i0},\quad 1\le i\le n. \nonumber
Therefore, \mathbf{X}_{0} is a constrained critical point of Q subject to \displaystyle{\sum^{n}_{i=1} x^{2}_{i}=1} if and only if \mathbf{A}\mathbf{X}_{0}=\lambda\mathbf{X}_{0} for some \lambda; that is, if and only if \lambda is an eigenvalue and \mathbf{X}_{0} is an associated unit eigenvector of \mathbf{A}. If \mathbf{A}\mathbf{X}_{0}=\lambda\mathbf{X}_{0} and \displaystyle{\sum_{i=1}^{n}x_{i0}^{2}}=1, then
\begin{aligned} Q(\mathbf{X}_{0}) & =& \sum^{n}_{i=1} \left(\sum^{n}_{j=1} a_{ij}x_{j0} \right) x_{i0} = \sum^{n}_{i=1} (\lambda x_{i0})x_{i0} \\ & =& \lambda \sum^{n}_{i=1} x^{2}_{i0} = \lambda;\end{aligned} \nonumber
therefore, the largest and smallest eigenvalues of {\bf A} are the maximum and minimum values of Q subject to \displaystyle{\sum_{i=1}^{n}x_{i}^{2}}=1.
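This conclusion can be checked numerically (a sketch with an example matrix of our choosing, not the text's): sample Q on the unit circle for a 2×2 symmetric matrix with known eigenvalues 3 and 1, and confirm that the sampled maximum and minimum approach those eigenvalues.

```python
import math

# Example symmetric matrix (our choice): eigenvalues are 3 and 1,
# with eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2).
A = [[2.0, 1.0], [1.0, 2.0]]

def Q(x):
    """Quadratic form Q(X) = sum_{i,j} a_ij * x_i * x_j."""
    n = len(x)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Scan the unit circle x = (cos t, sin t), which satisfies x1^2 + x2^2 = 1.
values = [Q((math.cos(t), math.sin(t)))
          for t in (k * 2 * math.pi / 10000 for k in range(10000))]

print(max(values), min(values))  # approximately 3.0 and 1.0
```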
Find the maximum and minimum values of
Q(\mathbf{X}) = x^{2}+y^{2}+2z^{2}-2xy + 4xz + 4yz \nonumber subject to the constraint x^{2}+y^{2}+z^{2}=1. \label{eq:18}
Solution
The matrix of Q is
\mathbf{A} = \left[\begin{array}{rrr} 1 & -1 & 2 \\ -1 & 1 & 2 \\ 2 & 2 & 2 \end{array}\right] \nonumber
and
\begin{aligned} \det(\mathbf{A} - \lambda \mathbf{I}) & =& \left|\begin{array}{ccc} 1-\lambda & -1 & 2 \\ -1 & 1-\lambda & 2 \\ 2 & 2 & 2-\lambda \end{array}\right| \\ & =& -(\lambda+2)(\lambda-2)(\lambda-4),\end{aligned} \nonumber
so
\lambda_{1}=4, \quad \lambda_2=2, \quad \lambda_3=-2 \nonumber
are the eigenvalues of \mathbf{A}. Hence, \lambda_{1}=4 and \lambda_3 = -2 are the maximum and minimum values of Q subject to Equation \ref{eq:18}.
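These roots can be checked directly (a minimal sketch; the helper names `det3` and `char_poly` are ours): substitute each proposed eigenvalue into \det(\mathbf{A}-\lambda\mathbf{I}) and confirm the result is zero.

```python
# The matrix A of the example.
A = [[1, -1, 2],
     [-1, 1, 2],
     [2, 2, 2]]

def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def char_poly(lam):
    """Evaluate det(A - lambda*I)."""
    return det3([[A[i][j] - (lam if i == j else 0) for j in range(3)]
                 for i in range(3)])

print([char_poly(lam) for lam in (4, 2, -2)])  # [0, 0, 0]
```

Since the arithmetic here is over the integers, the zeros are exact, confirming that 4, 2, and -2 are precisely the roots of the characteristic polynomial.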
To find the points (x_{1},y_{1},z_{1}) where Q attains its constrained maximum, we first find an eigenvector of {\bf A} corresponding to \lambda_{1}=4. To do this, we find a nontrivial solution of the system
(\mathbf{A}-4\mathbf{I}) \left[\begin{array}{c} x_{1}\\ y_{1}\\ z_{1} \end{array}\right]= \left[\begin{array}{rrr} -3 & -1 & 2 \\ -1 & -3 & 2 \\ 2 & 2 & -2 \end{array}\right] \left[\begin{array}{c} x_{1}\\y_{1}\\z_{1} \end{array}\right]= \left[\begin{array}{c} 0\\0\\0 \end{array}\right]. \nonumber
All such solutions are multiples of \left[\begin{array}{c} 1\\1\\2 \end{array}\right]. Normalizing this to satisfy Equation \ref{eq:18} yields
{\bf X}_{1}= \left[\begin{array}{c} x_{1}\\y_{1}\\z_{1} \end{array}\right]=\pm\frac{1}{\sqrt{6}} \left[\begin{array}{c} 1\\ 1\\2 \end{array}\right]. \nonumber
To find the points (x_{3},y_{3},z_{3}) where Q attains its constrained minimum, we first find an eigenvector of {\bf A} corresponding to \lambda_{3}=-2. To do this, we find a nontrivial solution of the system
(\mathbf{A}+2\mathbf{I}) \left[\begin{array}{c} x_{3}\\ y_{3}\\ z_{3} \end{array}\right]= \left[\begin{array}{rrr} 3 & -1 & 2 \\ -1 & 3 & 2 \\ 2 & 2 & 4 \end{array}\right] \left[\begin{array}{c} x_{3}\\y_{3}\\z_{3} \end{array}\right]= \left[\begin{array}{c} 0\\0\\0 \end{array}\right]. \nonumber
All such solutions are multiples of \left[\begin{array}{r} 1\\1\\-1 \end{array}\right]. Normalizing this to satisfy Equation \ref{eq:18} yields
{\bf X}_{3}= \left[\begin{array}{c} x_{3}\\y_{3}\\z_{3} \end{array}\right]=\pm \frac{1}{\sqrt{3}} \left[\begin{array}{r} 1\\ 1\\-1 \end{array}\right]. \nonumber
As for the eigenvalue \lambda_{2}=2, we leave it to you to verify that the only unit vectors that satisfy {\bf A}{\bf X}_{2}=2{\bf X}_{2} are
{\bf X}_{2}=\pm \frac{1}{\sqrt{2}} \left[\begin{array}{r} 1\\ -1\\0 \end{array}\right]. \nonumber
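All three eigenpairs of the example can be verified at once (a minimal sketch; the helper names `matvec` and `Q` are ours): normalize each eigenvector found by hand, check that \mathbf{A}\mathbf{X}=\lambda\mathbf{X}, and confirm that Q at each unit eigenvector equals the corresponding eigenvalue, as the general argument predicts. Note that \mathbf{X}_{2} gives a saddle value of Q on the sphere, not an extremum.

```python
import math

# The matrix A of the example.
A = [[1, -1, 2],
     [-1, 1, 2],
     [2, 2, 2]]

def matvec(M, x):
    """Matrix-vector product for a 3x3 matrix."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

def Q(x):
    """Quadratic form Q(X) = sum_{i,j} a_ij * x_i * x_j."""
    return sum(A[i][j] * x[i] * x[j] for i in range(3) for j in range(3))

# (eigenvalue, eigenvector before normalization) pairs found by hand.
pairs = [(4, (1, 1, 2)), (2, (1, -1, 0)), (-2, (1, 1, -1))]

results = []
for lam, v in pairs:
    norm = math.sqrt(sum(comp * comp for comp in v))
    u = [comp / norm for comp in v]              # unit eigenvector
    # A*u should equal lambda*u up to floating-point roundoff.
    assert all(abs(w - lam * c) < 1e-12 for w, c in zip(matvec(A, u), u))
    results.append(round(Q(u), 12))

print(results)  # the constrained critical values: [4.0, 2.0, -2.0]
```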
For more on this subject, see Theorem [theorem:4].
[example:6]