10.1: Showing Linear Dependence

    In the above example we were given the linear combination \(3v_{1}+2v_{2}-v_{3}+v_{4}\) seemingly by magic. The next example shows how to find such a linear combination, if it exists.

    Example \(\PageIndex{1}\):

    Consider the following vectors in \(\Re^{3}\):
    \[
    v_{1}=\begin{pmatrix}0\\0\\1\end{pmatrix},
    \qquad v_{2}=\begin{pmatrix}1\\2\\1\end{pmatrix},
    \qquad v_{3}=\begin{pmatrix}1\\2\\3\end{pmatrix}.
    \]
    Are they linearly independent?

    We need to see whether the system
    \[
    c^{1}v_{1} + c^{2}v_{2}+ c^{3}v_{3}=0
    \]
    has any nontrivial solutions for \(c^{1}, c^{2}, c^{3}\) (the trivial solution \(c^{1}=c^{2}=c^{3}=0\) always exists). We can rewrite this as a homogeneous system by building a matrix whose columns are the vectors \(v_{1}\), \(v_{2}\) and \(v_{3}\):
    \[
    \begin{pmatrix}v_{1}&v_{2}&v_{3}\end{pmatrix}\begin{pmatrix}c^{1}\\c^{2}\\c^{3}\end{pmatrix}=0.
    \]
    This system has nontrivial solutions if and only if the matrix \(M=\begin{pmatrix}v_{1}&v_{2}&v_{3}\end{pmatrix}\) is singular, so we should find the determinant of \(M\) (expanding along the first column):
    \[
    \det M = \det \begin{pmatrix}
    0 & 1 & 1 \\
    0 & 2 & 2 \\
    1 & 1 & 3 \\
    \end{pmatrix}
    = \det \begin{pmatrix}
    1 & 1 \\
    2 & 2 \\
    \end{pmatrix}
    =0.
    \]
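
    For readers who want to check this computation numerically, here is a minimal sketch using NumPy: it stacks the vectors as columns to form \(M\) and evaluates the determinant. (The entries are exactly those above; the library calls are standard NumPy.)

        import numpy as np

        # The vectors v1, v2, v3 from the example.
        v1 = np.array([0, 0, 1], dtype=float)
        v2 = np.array([1, 2, 1], dtype=float)
        v3 = np.array([1, 2, 3], dtype=float)

        # Stack the vectors as columns to form M, then test singularity
        # via the determinant.
        M = np.column_stack([v1, v2, v3])
        print(np.linalg.det(M))   # prints 0.0 (up to rounding), so M is singular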

    Therefore nontrivial solutions exist. At this point we know that the vectors are linearly dependent. If we need to, we can find coefficients that demonstrate linear dependence by solving the system of equations:
    \[
    \left(\begin{array}{rrrr}
    0 & 1 & 1 & 0\\
    0 & 2 & 2 & 0\\
    1 & 1 & 3 & 0\\
    \end{array}\right) \sim
    \left(\begin{array}{rrrr}
    1 & 1 & 3 & 0\\
    0 & 1 & 1 & 0\\
    0 & 0 & 0 & 0\\
    \end{array}\right) \sim
    \left(\begin{array}{rrrr}
    1 & 0 & 2 & 0\\
    0 & 1 & 1 & 0\\
    0 & 0 & 0 & 0\\
    \end{array}\right).
    \]
    Then \(c^{3}\) is a free variable; set \(c^{3}=:\mu\). Back-substituting gives \(c^{2}=-\mu\) and \(c^{1}=-2\mu\). Now any choice of \(\mu\) will produce coefficients \(c^{1},c^{2},c^{3}\) that satisfy the system. So we can set \(\mu=1\) and obtain:
    \[
    c^{1}v_{1} + c^{2}v_{2}+ c^{3}v_{3}=0
    \Rightarrow -2v_{1} - v_{2} + v_{3}=0.
    \]
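
    The same row reduction can be reproduced symbolically. The following SymPy sketch computes the reduced row echelon form of \(M\) and a basis for the solutions of \(Mc=0\), recovering \((c^{1},c^{2},c^{3})=\mu(-2,-1,1)\):

        import sympy as sp

        # M has the vectors v1, v2, v3 as its columns.
        M = sp.Matrix([[0, 1, 1],
                       [0, 2, 2],
                       [1, 1, 3]])

        # Reduced row echelon form of M (the zero right-hand side never changes).
        rref, pivots = M.rref()
        print(rref)           # Matrix([[1, 0, 2], [0, 1, 1], [0, 0, 0]])

        # A basis for the solutions of M c = 0; each basis vector is a
        # choice of (c1, c2, c3), here the single vector (-2, -1, 1).
        print(M.nullspace())  # [Matrix([[-2], [-1], [1]])]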

    Theorem (Linear Dependence)

    An ordered set of non-zero vectors \(( v_{1}, \ldots, v_{n} )\) is linearly dependent if and only if one of the vectors \(v_{k}\) is expressible as a linear combination of the preceding vectors.

    Proof
    The theorem is an if and only if statement, so there are two things to show.

    \((i.)\) First, we show that if \(v_{k}=c^{1}v_{1}+\cdots + c^{k-1}v_{k-1}\) then the set is linearly dependent.

    This is easy. We just rewrite the assumption:
    \[
    c^{1}v_{1}+\cdots+c^{k-1}v_{k-1}-v_{k} + 0v_{k+1}+\cdots + 0v_{n}=0.
    \]
    This is a vanishing linear combination of the vectors \(\{ v_{1}, \ldots, v_{n} \}\) with not all coefficients equal to zero, so \(\{ v_{1}, \ldots, v_{n} \}\) is a linearly dependent set.

    \((ii.)\) Now, we show that linear dependence implies that there exists \(k\) for which \(v_{k}\) is a linear combination of the vectors \(\{ v_{1}, \ldots, v_{k-1} \}\).

    The assumption of linear dependence says that, for some coefficients \(c^{1},\ldots,c^{n}\) not all equal to zero,
    \[
    c^{1}v_{1} + c^{2}v_{2}+ \cdots +c^{n}v_{n}=0.
    \]
    Take \(k\) to be the largest index for which \(c^{k}\) is not equal to zero. So:
    \[
    c^{1}v_{1} + c^{2}v_{2}+ \cdots +c^{k-1}v_{k-1}+c^{k}v_{k}=0.
    \]

    (Note that \(k>1\), since otherwise we would have \(c^{1}v_{1}=0\Rightarrow v_{1}=0\), contradicting the assumption that none of the \(v_{i}\) are the zero vector.)

    As such, we can rearrange the equation:
    \begin{eqnarray*}
    c^{1}v_{1} + c^{2}v_{2} + \cdots +c^{k-1}v_{k-1}&=&-c^{k}v_{k}\\ \Rightarrow\
    -\frac{c^{1}}{c^{k}}v_{1} - \frac{c^{2}}{c^{k}}v_{2} - \cdots -\frac{c^{k-1}}{c^{k}}v_{k-1}&=&v_{k}.
    \end{eqnarray*}

    Therefore we have expressed \(v_{k}\) as a linear combination of the previous vectors, and we are done.
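
    For instance, the vanishing linear combination found in Example \(\PageIndex{1}\) was \(-2v_{1}-v_{2}+v_{3}=0\); the largest index with a nonzero coefficient is \(k=3\), and rearranging as above gives
    \[
    v_{3}=2v_{1}+v_{2},
    \]
    so \(v_{3}\) is a linear combination of the preceding vectors.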

    Example \(\PageIndex{2}\): 

    Consider the vector space \(P_{2}(t)\) of polynomials of degree less than or equal to \(2\). Set:
    \begin{eqnarray*}
    v_{1} &=& 1+t \\
    v_{2} &=& 1+t^{2} \\
    v_{3} &=& t+t^{2} \\
    v_{4} &=& 2+t+t^{2} \\
    v_{5} &=& 1+t+t^{2}.
    \end{eqnarray*}
    The set \(\{ v_{1}, \ldots, v_{5} \}\) is linearly dependent, because \(v_{4} = v_{1}+v_{2}\).
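
    One systematic way to find such relations is to record each polynomial by its coefficient vector with respect to the basis \(\{1,t,t^{2}\}\) and compute the nullspace of the matrix having these vectors as columns. A brief SymPy sketch:

        import sympy as sp

        # Columns: coefficient vectors of v1, ..., v5 in the basis {1, t, t^2}.
        M = sp.Matrix([[1, 1, 0, 2, 1],   # constant terms
                       [1, 0, 1, 1, 1],   # coefficients of t
                       [0, 1, 1, 1, 1]])  # coefficients of t^2

        # Each nullspace vector gives a vanishing linear combination of v1, ..., v5.
        for c in M.nullspace():
            print(c.T)

    Since the matrix has \(5\) columns but only \(3\) rows, its nullspace is at least two-dimensional, so vanishing linear combinations are guaranteed to exist; one of the vectors returned corresponds to \(-v_{1}-v_{2}+v_{4}=0\), which is the relation \(v_{4}=v_{1}+v_{2}\) noted above.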
