2.5: Linear Independence


    \( \def\Span#1{\text{Span}\left\lbrace #1\right\rbrace} \def\vect#1{\mathbf{#1}} \def\ip{\boldsymbol{\cdot}} \def\iff{\Longleftrightarrow} \def\cp{\times} \def\Col#1{\text{Col}\left(#1\right)} \def\Nul#1{\text{Nul}\left(#1\right)} \def\Row#1{\text{Row}\left(#1\right)} \)

    Linear independence

    As we have seen earlier, the multiples of a non-zero vector form a line. We have also seen that, if we consider vectors of the form \(c_{1}\vect{v}_{1}+c_{2}\vect{v}_{2}\) for some vectors \(\vect{v}_{1},\vect{v}_{2}\) and constants \(c_{1},c_{2}\), we usually get a plane. But sometimes we don't! For example, if \(d\vect{v}_{1}=\vect{v}_{2}\) for some constant \(d\), then every vector of the given form can be rewritten as \((c_{1}+c_{2}d)\vect{v}_{1}\), so all such vectors are contained in the line through the origin in the direction of \(\vect{v}_{1}\). Every vector we can make as a linear combination of \(\vect{v}_{1}\) and \(\vect{v}_{2}\) can also be made with \(\vect{v}_{1}\) alone. The vector \(\vect{v}_{2}\) is superfluous. This situation is shown in Figure \(\PageIndex{1}\).
     
    Figure \(\PageIndex{1}\): The set \(\left\{\vect{v}_{1},\vect{v}_{2}\right\}\) contains two vectors, but one of them is superfluous. Every vector one can make as a linear combination of \(\vect{v}_{1}\) and \(\vect{v}_{2}\) can also be made with just \(\vect{v}_{1}\).

    We will now formalise this concept of superfluous vectors.
    Definition \(\PageIndex{1}\)
    We will call a set \(S\) of vectors linearly dependent if there is some \(\vect{v}\) in \(S\) such that \(\Span{S}=\Span{S\setminus\left\{\vect{v}\right\}}\). In this case, we say that \(\vect{v}\) is linearly dependent on \(S\setminus\left\{\vect{v}\right\}\). If \(S\) is not linearly dependent, we say \(S\) is linearly independent.
    In other words, a set \(S\) is linearly dependent if it contains at least one superfluous vector. It may very well contain infinitely many. Let us briefly investigate linear dependence for very small sets before we get to more substantial examples.
    Proposition \(\PageIndex{2}\)
    Let \(S\) be a subset of \(\mathbb{R}^{n}\) containing
    1. precisely one vector, say \(\vect{v}\). Then \(S\) is linearly dependent precisely when \(\vect{v}=\vect{0}\).
    2. precisely two vectors, say \(\vect{u}\) and \(\vect{v}\). Then \(S\) is linearly independent unless one of these vectors is a multiple of the other.
    Proof
    1. Assume \(S=\left\{\vect{v}\right\}\). The span of \(S\setminus\left\{\vect{v}\right\}\) is the span of the empty set, which is precisely \(\left\{\vect{0}\right\}\). This is equal to \(\Span{S}\) if and only if \(\vect{v}=\vect{0}\).
    2. If \(S=\left\{\vect{u},\vect{v}\right\}\) is linearly dependent, then \(\Span{S}=\Span{\vect{v}}\) or \(\Span{S}=\Span{\vect{u}}\). In the first case \(\vect{u}\) is in \(\Span{\vect{v}}\), so it is a multiple of \(\vect{v}\); in the second case \(\vect{v}\) is a multiple of \(\vect{u}\). Conversely, if for instance \(\vect{u}=c\vect{v}\) for some scalar \(c\), then every linear combination \(a\vect{u}+b\vect{v}=(ac+b)\vect{v}\) lies in \(\Span{\vect{v}}\), so \(\Span{S}=\Span{\vect{v}}\) and \(S\) is linearly dependent.
    As you can see from the proof of Proposition \(\PageIndex{2}\), our definition of linear dependence, while intuitive, is a bit hard to work with. In Corollary \(\PageIndex{6}\), we will see a more convenient way to determine whether a given set of vectors is linearly dependent or not. But let us first consider some examples.
    Example \(\PageIndex{3}\)
    1. Consider the vectors \[ \vect{v}_{1}= \begin{bmatrix}1\\0\end{bmatrix}\quad\vect{v}_{2}= \begin{bmatrix}0\\1\end{bmatrix} \quad\vect{v}_{3}= \begin{bmatrix}1\\1\end{bmatrix},\nonumber\] which are shown on the left in Figure \(\PageIndex{2}\). The set \(S=\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{3}\right\}\) is linearly dependent in view of the following equalities: \[ \vect{v}_{1}=-\vect{v}_{2}+\vect{v}_{3}, \label{Eq:LinInd:LinIndEx1} \] \[ \vect{v}_{2}=-\vect{v}_{1}+\vect{v}_{3}, \label{Eq:LinInd:LinIndEx2} \] \[ \vect{v}_{3}=\vect{v}_{1}+\vect{v}_{2}. \label{Eq:LinInd:LinIndEx3} \] Indeed, if we take an arbitrary vector \(\vect{v}\) in \(\Span{S}\), we can write it as \begin{aligned} \vect{v}&=c_{1}\vect{v}_{1}+c_{2}\vect{v}_{2}+c_{3}\vect{v}_{3}\\ &=(c_{2}-c_{1})\vect{v}_{2}+(c_{3}+c_{1})\vect{v}_{3} \end{aligned} in view of equation \eqref{Eq:LinInd:LinIndEx1}. This means that \(\vect{v}\) is also in \(\Span{S\setminus\left\{\vect{v}_{1}\right\}}\) and consequently that \(\vect{v}_{1}\) is linearly dependent on \(\vect{v}_{2}\) and \(\vect{v}_{3}\). Similarly, equation \eqref{Eq:LinInd:LinIndEx2} shows that \(\vect{v}_{2}\) is linearly dependent on \(\vect{v}_{1}\) and \(\vect{v}_{3}\), and equation \eqref{Eq:LinInd:LinIndEx3} shows that \(\vect{v}_{3}\) is linearly dependent on \(\vect{v}_{1}\) and \(\vect{v}_{2}\). However, every subset of \(S\) containing precisely two vectors will be linearly independent, as \(S\) contains no two vectors that are multiples of each other.
    2. Consider now the vectors \[ \vect{v}_{1}= \begin{bmatrix}1\\0\end{bmatrix}\quad\vect{v}_{2}= \begin{bmatrix}2\\0\end{bmatrix}\quad \vect{v}_{3}= \begin{bmatrix}0\\1\end{bmatrix},\nonumber\] which are shown on the right in Figure \(\PageIndex{2}\). The set \(S=\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{3}\right\}\) is again linearly dependent since \[ \vect{v}_{2}=2\vect{v}_{1}+0\vect{v}_{3},\nonumber\] but now the subset \(\left\{\vect{v}_{1},\vect{v}_{2}\right\}\) is a linearly dependent subset of \(S\). On the other hand, the subsets \(\left\{\vect{v}_{1},\vect{v}_{3}\right\}\) and \(\left\{\vect{v}_{2},\vect{v}_{3}\right\}\) are linearly independent.
       
      Figure \(\PageIndex{2}\): The vectors from (1) on the left and from (2) on the right. On the left, no vector is a multiple of another vector, so every set of two vectors is linearly independent. On the right this is not the case: the vectors \(\vect{v}_{1}\) and \(\vect{v}_{2}\) are multiples of each other, and therefore \(\left\{\vect{v}_{1},\vect{v}_{2}\right\}\) is linearly dependent.

    3. Put \[ \vect{v}_{1}= \begin{bmatrix}1\\0\\0\end{bmatrix},\quad\vect{v}_{2}= \begin{bmatrix}0\\1\\0\end{bmatrix},\quad\vect{v}_{3}= \begin{bmatrix}1\\2\\0\end{bmatrix},\quad \text{and}\quad\vect{v}_{4}= \begin{bmatrix}1\\2\\1\end{bmatrix}.\nonumber\] The set \(\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{3}\right\}\) is linearly dependent. The set \(\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{4}\right\}\), however, is not. This is illustrated in Figure \(\PageIndex{3}\).
       
      Figure \(\PageIndex{3}\): The four vectors from (3). Note that \(\vect{v}_{3}\) lies in the plane spanned by \(\vect{v}_{1}\) and \(\vect{v}_{2}\) but \(\vect{v}_{4}\) does not. This means that \(\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{3}\right\}\) is linearly dependent but \(\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{4}\right\}\) is linearly independent.
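    The dependency relations in this example are easy to verify componentwise. The sketch below does so in plain Python; the helper `lincomb` is our own illustrative name, not anything from the text.

```python
# Vectors from part (1) of the example, as plain tuples.
v1, v2, v3 = (1, 0), (0, 1), (1, 1)

def lincomb(coeffs, vecs):
    """Componentwise linear combination sum_i coeffs[i] * vecs[i]."""
    return tuple(sum(c * v[j] for c, v in zip(coeffs, vecs))
                 for j in range(len(vecs[0])))

# The three relations v1 = -v2 + v3, v2 = -v1 + v3, v3 = v1 + v2.
assert lincomb((-1, 1), (v2, v3)) == v1
assert lincomb((-1, 1), (v1, v3)) == v2
assert lincomb((1, 1), (v1, v2)) == v3

# Part (3): v3 = v1 + 2*v2, so v3 lies in the plane spanned by v1 and v2.
w1, w2, w3 = (1, 0, 0), (0, 1, 0), (1, 2, 0)
assert lincomb((1, 2), (w1, w2)) == w3
print("all dependency relations check out")
```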

    The following elementary properties of linear dependence and linear independence will be used throughout the text, often tacitly.
    Proposition \(\PageIndex{4}\)
    Let \(S\) be a subset of \(\mathbb{R}^{n}\).
    1. If \(S\) contains \(\vect{0}\), then it is a linearly dependent set.
    2. If \(S\) is linearly dependent and \(S\subseteq T\), then \(T\) is linearly dependent.
    3. If \(T\) is linearly independent and \(S\subseteq T\), then \(S\) is linearly independent.
    Proof
    Exercise.
    But how do you determine whether a set of vectors is linearly independent or not? Like so many problems in linear algebra, it comes down to solving a system of linear equations, as Proposition \(\PageIndex{5}\) shows.
    Proposition \(\PageIndex{5}\)
    A set \(\left\{\vect{v}_{1},...,\vect{v}_{k}\right\}\) of vectors in \(\mathbb{R}^{n}\) is linearly dependent if and only if the vector equation \[ c_{1}\vect{v}_{1}+\cdots +c_{k}\vect{v}_{k}=\vect{0} \label{Eq:LinInd:VecEqisZero} \] has a non-trivial solution.
    Proof
    If \(\left\{\vect{v}_{1},...,\vect{v}_{k}\right\}\) is linearly dependent, one of these vectors, say \(\vect{v}_{i}\), is linearly dependent on the others, i.e. it is in \(\Span{\vect{v}_{1},...,\vect{v}_{i-1},\vect{v}_{i+1},...,\vect{v}_{k}}\). Therefore, there exist scalars \(c_{1},...,c_{i-1},c_{i+1},...,c_{k}\) such that \[ \vect{v}_{i}=c_{1}\vect{v}_{1}+\cdots +c_{i-1}\vect{v}_{i-1}+c_{i+1}\vect{v}_{i+1}+\cdots +c_{k}\vect{v}_{k}\nonumber\] or equivalently \[ \vect{0}=c_{1}\vect{v}_{1}+\cdots +c_{i-1}\vect{v}_{i-1}-\vect{v}_{i}+c_{i+1}\vect{v}_{i+1}+\cdots +c_{k}\vect{v}_{k}.\nonumber\] This means that \((c_{1},...,c_{i-1},-1,c_{i+1},...,c_{k})\) is a solution of equation \eqref{Eq:LinInd:VecEqisZero}. It is a non-trivial one, since the \(i\)-th coefficient is \(-1\), which is non-zero. Conversely, if \eqref{Eq:LinInd:VecEqisZero} has a non-trivial solution, then there are \(c_{1},...,c_{k}\), not all \(0\), such that \(c_{1}\vect{v}_{1}+\cdots +c_{k}\vect{v}_{k}=\vect{0}\). Take any \(i\) such that \(c_{i}\neq0\). Then \[ \vect{v}_{i}=-\frac{c_{1}}{c_{i}}\vect{v}_{1}-\cdots -\frac{c_{i-1}}{c_{i}}\vect{v}_{i-1}-\frac{c_{i+1}}{c_{i}}\vect{v}_{i+1}-\cdots -\frac{c_{k}}{c_{i}}\vect{v}_{k}.\nonumber\] This implies that \(\vect{v}_{i}\) is in \(\Span{\vect{v}_{1},...,\vect{v}_{i-1},\vect{v}_{i+1},...,\vect{v}_{k}}\), so \(\left\{\vect{v}_{1},...,\vect{v}_{k}\right\}\) is linearly dependent.
    Corollary \(\PageIndex{6}\)
    A set \(\left\{\vect{v}_{1},...,\vect{v}_{k}\right\}\) of vectors in \(\mathbb{R}^{n}\) is linearly dependent if and only if the matrix equation \(A\vect{x}=\vect{0}\) with \(A=\left[\vect{v}_{1} \cdots \vect{v}_{k}\right]\) has a non-trivial solution, i.e. if \(A\) has a column without a pivot.
    Proof
    Exercise.
    Example \(\PageIndex{7}\)
    Consider the following three vectors in \(\mathbb{R}^{4}\): \[ \vect{v}_{1}= \begin{bmatrix} 1\\1\\0\\-2 \end{bmatrix}\quad\vect{v}_{2}= \begin{bmatrix}-1\\2\\3\\-2\end{bmatrix}\quad\vect{v}_{3}= \begin{bmatrix} 4\\1\\-3\\-4\end{bmatrix}.\nonumber\] Do these vectors form a linearly dependent set? How do we find out? Well, we use the vectors as the columns of a matrix \(A\) and compute an echelon form using standard techniques \[ A= \begin{bmatrix}1&-1&4\\1&2&1\\0&3&-3\\-2&-2&-4 \end{bmatrix}\sim\cdots\sim \begin{bmatrix} 1&-1&4\\0&3&-3\\0&0&0\\0&0&0\end{bmatrix}.\nonumber\] The third column has no pivot, so the system \(A\vect{x}=\vect{0}\) has infinitely many solutions. In particular, it therefore has a non-trivial one. Consequently, the set \(\left\{\vect{v}_{1},\vect{v}_{2},\vect{v}_{3}\right\}\) is linearly dependent. From the reduced echelon form, we can easily find a way to write \(\vect{v}_{3}\) as a linear combination of \(\vect{v}_{1}\) and \(\vect{v}_{2}\). In this case, the reduced echelon form is \[ A= \begin{bmatrix}1&-1&4\\1&2&1\\0&3&-3\\-2&-2&-4 \end{bmatrix}\sim\cdots\sim \begin{bmatrix} 1&0&3\\0&1&-1\\0&0&0\\0&0&0\end{bmatrix}.\nonumber\] If we put the free variable \(x_{3}\) equal to 1, we find \(x_{1}=-3\) and \(x_{2}=1\), which gives: \[ -3\vect{v}_{1}+\vect{v}_{2}+\vect{v}_{3}=\vect{0}\quad\text{hence}\quad \vect{v}_{3}=3\vect{v}_{1}-\vect{v}_{2}.\nonumber\]
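    The pivot test of Corollary \(\PageIndex{6}\) is easy to mechanise. The sketch below is a minimal illustration in plain Python, using exact `Fraction` arithmetic to avoid rounding; the function name `pivot_columns` is our own, not part of the text.

```python
from fractions import Fraction

def pivot_columns(rows):
    """Return the indices of the pivot columns, found by Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        # Find a row at or below position r with a non-zero entry in column c.
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        # Eliminate column c from every other row.
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return pivots

# Columns are v1, v2, v3 from the example above.
A = [[1, -1, 4], [1, 2, 1], [0, 3, -3], [-2, -2, -4]]
print(pivot_columns(A))  # pivots in columns 0 and 1; column 2 has none, so dependent
```

Since the third column carries no pivot, \(A\vect{x}=\vect{0}\) has a non-trivial solution, matching the conclusion of the example.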
    We now present a couple of statements that can be helpful in determining whether a given set of vectors is linearly dependent. The first one tells us that an ordered set is linearly dependent precisely when some vector in it depends on the preceding vectors.
    Theorem \(\PageIndex{8}\)
    An ordered set \(S=(\vect{v}_{1},...,\vect{v}_{n})\) is linearly dependent if and only if there is a \(k\) such that \(\vect{v}_{k}\) is a linear combination of \(\vect{v}_{1},...,\vect{v}_{k-1}\).
    Proof
    Let us first assume \(\vect{v}_{k}=c_{1}\vect{v}_{1}+\cdots+c_{k-1}\vect{v}_{k-1}\) for some scalars \(c_{1},...,c_{k-1}\). An arbitrary element \(\vect{v}\) of \(\Span{S}\) is a linear combination of \(\vect{v}_{1},...,\vect{v}_{n}\), so \[ \vect{v}=d_{1}\vect{v}_{1}+\cdots+d_{k-1}\vect{v}_{k-1}+d_{k}\vect{v}_{k}+d_{k+1}\vect{v}_{k+1}+\cdots+ d_{n}\vect{v}_{n}\nonumber\] for certain scalars \(d_{1},...,d_{n}\). We can now rewrite \(\vect{v}\) as \[ \vect{v}=d_{1}\vect{v}_{1}+\cdots+d_{k-1}\vect{v}_{k-1}+d_{k}(c_{1}\vect{v}_{1}+\cdots+c_{k-1}\vect{v}_{k-1})+d_{k+1}\vect{v}_{k+1}+\cdots+ d_{n}\vect{v}_{n},\nonumber\] so \(\vect{v}\) is in \(\Span{S\setminus\left\{\vect{v}_{k}\right\}}\). Hence \(\Span{S}=\Span{S\setminus\left\{\vect{v}_{k}\right\}}\) and \(S\) is linearly dependent. Suppose now that \(S\) is linearly dependent. Let \(k\) be maximal such that \(\Span{S}=\Span{S\setminus\left\{\vect{v}_{k}\right\}}\). Since \(\vect{v}_{k}\) is in \(\Span{S}\), it is in \(\Span{S\setminus\left\{\vect{v}_{k}\right\}}\). So there exist scalars \(c_{1},...,c_{k-1},c_{k+1},...,c_{n}\) such that \[ \vect{v}_{k}=c_{1}\vect{v}_{1}+\cdots+c_{k-1}\vect{v}_{k-1}+c_{k+1}\vect{v}_{k+1}+\cdots+ c_{n}\vect{v}_{n}. \label{Eq:LinInd:vkLinCombofOthers} \] If we can show that \(c_{k+1}=\cdots=c_{n}=0\), we are done, because then we have written \(\vect{v}_{k}\) as a linear combination of \(\vect{v}_{1},...,\vect{v}_{k-1}\). Assume, for a contradiction, that \(c_{j}\neq0\) for some \(j\) greater than \(k\). Then Equation \eqref{Eq:LinInd:vkLinCombofOthers} yields \[ \vect{v}_{j}=\frac{1}{c_{j}}(-c_{1}\vect{v}_{1}-\cdots -c_{k-1}\vect{v}_{k-1}+\vect{v}_{k}-c_{k+1}\vect{v}_{k+1}-\cdots-c_{j-1}\vect{v}_{j-1}-c_{j+1}\vect{v}_{j+1}-\cdots -c_{n}\vect{v}_{n}).\nonumber\] Consequently, any linear combination of vectors in \(S\) can be rewritten as a linear combination of \(\vect{v}_{1},...,\vect{v}_{j-1},\vect{v}_{j+1},...,\vect{v}_{n}\), i.e. \(\Span{S}=\Span{S\setminus\left\{\vect{v}_{j}\right\}}\). But \(j\) is larger than \(k\), and we assumed \(k\) to be maximal with this property. This is a contradiction, so \(c_{j}=0\) for all \(j\) greater than \(k\).
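    The theorem suggests a procedure: walk through the vectors in order and flag the first one whose arrival does not enlarge the span. Below is a sketch of that procedure in plain Python with exact rational arithmetic; `rank` and `first_dependent` are names we introduce here for illustration.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix over the rationals, via forward Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def first_dependent(vectors):
    """0-based index of the first vector in the span of its predecessors, or None."""
    for k in range(len(vectors)):
        # Transpose so the first k+1 vectors become the columns of a matrix.
        mat = [list(row) for row in zip(*vectors[:k + 1])]
        if rank(mat) < k + 1:  # adding vector k did not enlarge the span
            return k
    return None

# Vectors from Example 3 (3): v3 = v1 + 2*v2 is the first dependent one.
v1, v2, v3 = (1, 0, 0), (0, 1, 0), (1, 2, 0)
print(first_dependent([v1, v2, v3]))  # -> 2
```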
    Theorem \(\PageIndex{8}\) together with Corollary \(\PageIndex{6}\) implies that the columns of a matrix are linearly dependent if and only if there is some column which is linearly dependent on the preceding ones. This observation will allow us to simplify certain arguments. The following result essentially claims that if \(k\) vectors suffice to span a set \(S\) and you have more than \(k\) vectors in \(S\), then at least one of them must be superfluous.
    Theorem \(\PageIndex{9}\)
    Suppose \(\vect{u}_{1},...,\vect{u}_{k}\) and \(\vect{v}_{1},...,\vect{v}_{l}\) are all vectors in \(\mathbb{R}^{n}\). If \(k < l\) and \(\Span{\vect{u}_{1},...,\vect{u}_{k}}\) contains \(\Span{\vect{v}_{1},...,\vect{v}_{l}}\) then the set \(\left\{\vect{v}_{1},...,\vect{v}_{l}\right\}\) is linearly dependent.
    Proof
    Consider the matrices \[ A=\left[\vect{u}_{1}\ \cdots\ \vect{u}_{k}\right],\quad B=\left[\vect{v}_{1}\ \cdots\ \vect{v}_{l}\right]\quad \text{and}\quad C=[A\ B].\nonumber\] Bringing \(C\) into echelon form gives \[ C\sim D=[E\ F]\nonumber\] where \(D\) is the echelon form of \(C\), \(E\) is an echelon form of \(A\), and \(F\) is row equivalent to \(B\). We claim that all of the pivot positions of \(D\) are in \(E\). Indeed, suppose that the \(i\)-th column of \(F\), let us call it \(\vect{f}_{i}\), contains a pivot. Then \(E\vect{x}=\vect{f}_{i}\) is inconsistent, and therefore \(A\vect{x}=\vect{v}_{i}\) is also inconsistent, since elementary row operations preserve the solution set of a linear system. But this implies that \(\vect{v}_{i}\) is not a linear combination of \(\vect{u}_{1},...,\vect{u}_{k}\), hence it is not in \(\Span{\vect{u}_{1},...,\vect{u}_{k}}\). This contradicts the assumption that \(\Span{\vect{u}_{1},...,\vect{u}_{k}}\) contains \(\Span{\vect{v}_{1},...,\vect{v}_{l}}\). Since \(F\) contains no pivot positions of \(D\), it has at least as many zero rows as \(E\). This implies that an echelon form \(G\) of \(B\), which is necessarily also an echelon form of \(F\), must also have at least as many zero rows as \(E\). Therefore, \(G\) has no more pivots than \(E\). Since \(E\) has at most \(k\) pivots and \(k < l\), \(G\) must have a column without a pivot. So \(B\vect{x}=\vect{0}\) has a non-trivial solution, and by Corollary \(\PageIndex{6}\) the set \(\left\{\vect{v}_{1},...,\vect{v}_{l}\right\}\) must be linearly dependent.
    Corollary \(\PageIndex{10}\)
    Let \(S\) be a subset of \(\mathbb{R}^{n}\). If there are more than \(n\) vectors in \(S\), then \(S\) is linearly dependent.
    Proof
    Take distinct vectors \(\vect{v}_{1},...,\vect{v}_{n+1}\) in \(S\). The span \(\Span{\vect{v}_{1},...,\vect{v}_{n+1}}\) is contained in \(\Span{\vect{e}_{1},...,\vect{e}_{n}}=\mathbb{R}^{n}\) and \(n+1 > n\), so \(\left\{\vect{v}_{1},...,\vect{v}_{n+1}\right\}\) is linearly dependent by Theorem \(\PageIndex{9}\). Since this set is contained in \(S\), \(S\) must be linearly dependent too, by Proposition \(\PageIndex{4}\).
    Example \(\PageIndex{11}\)
    To illustrate the strength of Corollary \(\PageIndex{10}\), consider the following set of vectors in \(\mathbb{R}^{5}\): \[ \left\{ \begin{bmatrix}5\\-2\\3\\1\\0\end{bmatrix}, \begin{bmatrix}-47\\8\\12\\-3\\4\end{bmatrix}, \begin{bmatrix}12\\-3\\-2\\-1\\11\end{bmatrix}, \begin{bmatrix}42\\-7\\-52\\2\\16\end{bmatrix}, \begin{bmatrix}87\\56\\-32\\1\\0\end{bmatrix}, \begin{bmatrix}-48\\2\\35\\156\\8\end{bmatrix}\right\}.\nonumber\] If we had to bring the matrix with these six vectors as columns to echelon form, we would have our work cut out for us! Fortunately, we can just remark that there are six vectors with five entries each. Since \(6 > 5\), Corollary \(\PageIndex{10}\) guarantees that this set is linearly dependent.
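    In code, the corollary's counting argument needs nothing more than comparing two lengths, with no row reduction at all:

```python
# The six vectors from the example, each with five entries (vectors in R^5).
vectors = [
    (5, -2, 3, 1, 0),
    (-47, 8, 12, -3, 4),
    (12, -3, -2, -1, 11),
    (42, -7, -52, 2, 16),
    (87, 56, -32, 1, 0),
    (-48, 2, 35, 156, 8),
]
n = len(vectors[0])  # ambient dimension: 5
# Six vectors with five entries each: 6 > 5, so the corollary applies immediately.
assert len(vectors) > n
print(f"{len(vectors)} vectors in R^{n}: linearly dependent, no row reduction needed")
```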
    Application \(\PageIndex{12}\)
    Often, on news sites or in newspapers, you might see the standings of a football tournament displayed in a large table, as in Table \(\PageIndex{4}\). Quite a lot of the information in such a table is redundant because some of the columns are linearly dependent.
    Table \(\PageIndex{4}\). The final standings of the first season of the Eredivisie, the Dutch football league, played in 1956-57, as shown on Wikipedia on Wednesday, March 23rd 2022.
    Pos Team Pld W D L GF GA GD Pts
    1 AFC Ajax 34 22 5 7 64 40 24 49
    2 Fortuna '54 34 20 5 9 76 48 28 45
    3 SC Enschede 34 15 11 8 81 47 34 41
    4 MVV Maastricht 34 15 10 9 53 42 11 40
    5 PSV Eindhoven 34 18 3 13 93 71 22 39
    6 Feijenoord 34 15 9 10 79 58 21 39
    7 VVV '03 34 16 6 12 50 53 -3 38
    8 Sparta Rotterdam 34 12 12 10 66 59 7 36
    9 NAC 34 14 8 12 59 61 -2 36
    10 DOS 34 17 1 16 79 75 4 35
    11 Rapid JC 34 13 7 14 64 63 1 33
    12 NOAD 34 12 7 15 54 64 -10 31
    13 BVC Amsterdam 34 11 8 15 49 67 -18 30
    14 GVAV 34 9 10 15 52 66 -14 28
    15 BVV 34 11 4 19 70 76 -6 26
    16 Elinkwijk 34 10 4 20 52 87 -35 24
    17 Willem II 34 8 6 20 59 79 -20 22
    18 FC Eindhoven 34 8 4 22 39 83 -44 20
    Each of the eight columns on the right can be interpreted as a vector with one entry per team, so a vector in \(\mathbb{R}^{18}\). Using the column headers from Table \(\PageIndex{4}\) as the names of these vectors, we find: \[ \vect{Pts}=2\vect{W}+1\vect{D}+0\vect{L}\nonumber\] since a win yielded 2 points, a draw 1 point, and a loss 0 points. This means that \(\left\{\vect{W},\vect{D},\vect{L},\vect{Pts}\right\}\) is a linearly dependent subset of \(\mathbb{R}^{18}\). In fact, the smaller set \(\left\{\vect{W},\vect{D},\vect{Pts}\right\}\) is already linearly dependent. Similarly, the column \(\vect{GD}\), which gives the goal difference for each team, can be obtained by subtracting the column \(\vect{GA}\), which gives the goals conceded, from \(\vect{GF}\), which gives the goals scored.
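    Both relations can be confirmed directly from the data in Table \(\PageIndex{4}\). The short script below (plain Python) copies the relevant columns from the table, one entry per team, and checks the two linear dependencies entrywise:

```python
# Columns of Table 4, one entry per team, top to bottom.
W   = [22, 20, 15, 15, 18, 15, 16, 12, 14, 17, 13, 12, 11, 9, 11, 10, 8, 8]
D   = [5, 5, 11, 10, 3, 9, 6, 12, 8, 1, 7, 7, 8, 10, 4, 4, 6, 4]
GF  = [64, 76, 81, 53, 93, 79, 50, 66, 59, 79, 64, 54, 49, 52, 70, 52, 59, 39]
GA  = [40, 48, 47, 42, 71, 58, 53, 59, 61, 75, 63, 64, 67, 66, 76, 87, 79, 83]
GD  = [24, 28, 34, 11, 22, 21, -3, 7, -2, 4, 1, -10, -18, -14, -6, -35, -20, -44]
Pts = [49, 45, 41, 40, 39, 39, 38, 36, 36, 35, 33, 31, 30, 28, 26, 24, 22, 20]

# Pts = 2*W + 1*D (a loss contributes 0 points), and GD = GF - GA, entrywise.
assert all(p == 2 * w + d for p, w, d in zip(Pts, W, D))
assert all(gd == gf - ga for gd, gf, ga in zip(GD, GF, GA))
print("Pts = 2W + D and GD = GF - GA hold for all 18 teams")
```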

    2.5: Linear Independence is shared under a CC BY license and was authored, remixed, and/or curated by LibreTexts.
