2.4: Linear independence

    In the previous section, we studied our question concerning the existence of solutions to a linear system by inquiring about the span of a set of vectors. In particular, the span of a set of vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is the set of vectors \(\mathbf b\) for which a solution to the linear system \(\left[\begin{array}{rrrr} \mathbf v_1& \mathbf v_2& \ldots& \mathbf v_n \end{array}\right] \mathbf x = \mathbf b\) exists.

    In this section, our focus turns to the uniqueness of solutions of a linear system, the second of our two fundamental questions asked in Question 1.4.2. This will lead us to the concept of linear independence.

    Linear dependence

    In the previous section, we looked at some examples of the span of sets of vectors in \(\mathbb R^3\text{.}\) We saw one example in which the span of three vectors formed a plane. In another, the span of three vectors formed \(\mathbb R^3\text{.}\) We would like to understand the difference in these two examples.

    Preview Activity 2.4.1.

    Let's start this activity by studying the span of two different sets of vectors.

    1. Consider the following vectors in \(\mathbb R^3\text{:}\)
      \begin{equation*} \mathbf v_1=\threevec{0}{-1}{2}, \mathbf v_2=\threevec{3}{1}{-1}, \mathbf v_3=\threevec{2}{0}{1}\text{.} \end{equation*}

      Describe the span of these vectors, \(\laspan{\mathbf v_1,\mathbf v_2,\mathbf v_3}\text{.}\)

    2. We will now consider a set of vectors that looks very much like the first set:
      \begin{equation*} \mathbf w_1=\threevec{0}{-1}{2}, \mathbf w_2=\threevec{3}{1}{-1}, \mathbf w_3=\threevec{3}{0}{1}\text{.} \end{equation*}

      Describe the span of these vectors, \(\laspan{\mathbf w_1,\mathbf w_2,\mathbf w_3}\text{.}\)

    3. Show that the vector \(\mathbf w_3\) is a linear combination of \(\mathbf w_1\) and \(\mathbf w_2\) by finding weights such that
      \begin{equation*} \mathbf w_3 = a\mathbf w_1 + b\mathbf w_2\text{.} \end{equation*}
    4. Explain why any linear combination of \(\mathbf w_1\text{,}\) \(\mathbf w_2\text{,}\) and \(\mathbf w_3\text{,}\)
      \begin{equation*} c_1\mathbf w_1 + c_2\mathbf w_2 + c_3\mathbf w_3 \end{equation*}

      can be written as a linear combination of \(\mathbf w_1\) and \(\mathbf w_2\text{.}\)

    5. Explain why
      \begin{equation*} \laspan{\mathbf w_1,\mathbf w_2,\mathbf w_3} = \laspan{\mathbf w_1,\mathbf w_2}\text{.} \end{equation*}

    The preview activity presents us with two similar examples that demonstrate quite different behaviors. In the first example, the vectors \(\mathbf v_1\text{,}\) \(\mathbf v_2\text{,}\) and \(\mathbf v_3\) span \(\mathbb R^3\text{,}\) which we recognize because the matrix \(\left[\begin{array}{rrr}\mathbf v_1& \mathbf v_2& \mathbf v_3 \end{array}\right]\) has a pivot position in every row, so Proposition 2.3.5 applies.

    However, the second example is very different. In this case, the matrix \(\left[\begin{array}{rrr} \mathbf w_1& \mathbf w_2& \mathbf w_3 \end{array}\right]\) has only two pivot positions:

    \begin{equation*} \left[\begin{array}{rrr} \mathbf w_1 & \mathbf w_2 & \mathbf w_3 \end{array}\right] = \left[\begin{array}{rrr} 0 & 3 & 3 \\ -1 & 1 & 0 \\ 2 & -1 & 1 \end{array}\right] \sim \left[\begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{array}\right]\text{.} \end{equation*}

    Let's look at this matrix and change our perspective slightly by considering it to be an augmented matrix.

    \begin{equation*} \left[\begin{array}{rr|r} \mathbf w_1 & \mathbf w_2 & \mathbf w_3 \end{array}\right] = \left[\begin{array}{rr|r} 0 & 3 & 3 \\ -1 & 1 & 0 \\ 2 & -1 & 1 \end{array}\right] \sim \left[\begin{array}{rr|r} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{array}\right] \end{equation*}

    By doing so, we seek to express \(\mathbf w_3\) as a linear combination of \(\mathbf w_1\) and \(\mathbf w_2\text{.}\) In fact, the reduced row echelon form shows us that

    \begin{equation*} \mathbf w_3 = \mathbf w_1 + \mathbf w_2\text{.} \end{equation*}

    Consequently, we can rewrite any linear combination of \(\mathbf w_1\text{,}\) \(\mathbf w_2\text{,}\) and \(\mathbf w_3\) so that

    \begin{equation*} \begin{aligned} c_1\mathbf w_1 + c_2\mathbf w_2 + c_3\mathbf w_3 & {}={} c_1\mathbf w_1 + c_2\mathbf w_2 + c_3(\mathbf w_1+\mathbf w_2) \\ & {}={} (c_1+c_3)\mathbf w_1 + (c_2+c_3)\mathbf w_2 \\ \end{aligned}\text{.} \end{equation*}

    In other words, any linear combination of \(\mathbf w_1\text{,}\) \(\mathbf w_2\text{,}\) and \(\mathbf w_3\) may be written as a linear combination using only the vectors \(\mathbf w_1\) and \(\mathbf w_2\text{.}\) Since the span of a set of vectors is simply the set of their linear combinations, this shows that

    \begin{equation*} \laspan{\mathbf w_1,\mathbf w_2,\mathbf w_3} = \laspan{\mathbf w_1,\mathbf w_2}\text{.} \end{equation*}

    In other words, adding the vector \(\mathbf w_3\) to the set of vectors \(\mathbf w_1\) and \(\mathbf w_2\) does not change the span.
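
    This chain of reasoning is straightforward to verify computationally. Here is a minimal sketch using SymPy (our choice of tool; the text itself does not prescribe one) that row reduces the matrix \(\left[\begin{array}{rrr} \mathbf w_1 & \mathbf w_2 & \mathbf w_3 \end{array}\right]\text{,}\) locates its pivot columns, and reads the weights for \(\mathbf w_3\) from the last column of the reduced row echelon form.

    ```python
    from sympy import Matrix

    # The three vectors from the second example of the preview activity
    w1 = Matrix([0, -1, 2])
    w2 = Matrix([3, 1, -1])
    w3 = Matrix([3, 0, 1])

    M = Matrix.hstack(w1, w2, w3)
    R, pivots = M.rref()
    print(pivots)              # (0, 1): only the first two columns contain pivots

    # The last column of the RREF holds the weights expressing w3
    a, b = R[0, 2], R[1, 2]
    print(w3 == a*w1 + b*w2)   # True: w3 = 1*w1 + 1*w2
    ```

    Running the same check on \(\mathbf v_1\text{,}\) \(\mathbf v_2\text{,}\) and \(\mathbf v_3\) from the first example produces a pivot in every row, consistent with the fact that those vectors span \(\mathbb R^3\text{.}\)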

    Before exploring this type of behavior more generally, let's think about this from a geometric point of view. In the first example, we begin with two vectors \(\mathbf v_1\) and \(\mathbf v_2\) and add a third vector \(\mathbf v_3\text{.}\)

    Because the vector \(\mathbf v_3\) is not a linear combination of \(\mathbf v_1\) and \(\mathbf v_2\text{,}\) it provides a direction to move that, when creating linear combinations, is independent of \(\mathbf v_1\) and \(\mathbf v_2\text{.}\) The three vectors therefore span \(\mathbb R^3\text{.}\)

    In the second example, however, the third vector \(\mathbf w_3\) is a linear combination of \(\mathbf w_1\) and \(\mathbf w_2\) so it already lies in the plane formed by these two vectors.

    Since we can already move in this direction with just the first two vectors \(\mathbf w_1\) and \(\mathbf w_2\text{,}\) adding \(\mathbf w_3\) to the set does not enlarge the span. It remains a plane.

    With these examples in mind, we will make the following definition.

    Definition 2.4.1

    A set of vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is called linearly dependent if one of the vectors is a linear combination of the others. Otherwise, the set of vectors is called linearly independent.

    For the sake of completeness, we say that a set of vectors containing only one vector is linearly independent if that vector is not the zero vector, \(\zerovec\text{.}\)

    How to recognize linear dependence

    Activity 2.4.2.

    We would like to develop a means of detecting when a set of vectors is linearly dependent. These questions will point the way.

    1. Suppose we have five vectors in \(\mathbb R^4\) that form the columns of a matrix having reduced row echelon form
      \begin{equation*} \left[\begin{array}{rrrrr} \mathbf v_1 & \mathbf v_2 & \mathbf v_3 & \mathbf v_4 & \mathbf v_5 \end{array}\right] \sim \left[\begin{array}{rrrrr} 1 & 0 & -1 & 0 & 2 \\ 0 & 1 & 2 & 0 & 3 \\ 0 & 0 & 0 & 1 & -1 \\ 0 & 0 & 0 & 0 & 0 \\ \end{array}\right]\text{.} \end{equation*}

      Is it possible to write one of the vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_5\) as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?

    2. Suppose we have another set of three vectors in \(\mathbb R^4\) that form the columns of a matrix having reduced row echelon form
      \begin{equation*} \left[\begin{array}{rrr} \mathbf w_1 & \mathbf w_2 & \mathbf w_3 \\ \end{array}\right] \sim \left[\begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ \end{array}\right]\text{.} \end{equation*}

      Is it possible to write one of these vectors \(\mathbf w_1\text{,}\) \(\mathbf w_2\text{,}\) \(\mathbf w_3\) as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?

    3. By looking at the pivot positions, how can you determine whether the columns of a matrix are linearly dependent or independent?
    4. If one vector in a set is the zero vector \(\zerovec\text{,}\) can the set of vectors be linearly independent?
    5. Suppose a set of vectors in \(\mathbb R^{10}\) has twelve vectors. Is it possible for this set to be linearly independent?

    By now, it shouldn't be too surprising that the pivot positions play an important role in determining whether the columns of a matrix are linearly dependent. Let's discuss the previous activity to make this clear.

    • Let's consider the first example from the activity in which we have vectors in \(\mathbb R^4\) such that
      \begin{equation*} \left[\begin{array}{rrrrr} \mathbf v_1 & \mathbf v_2 & \mathbf v_3 & \mathbf v_4 & \mathbf v_5 \end{array}\right] \sim \left[\begin{array}{rrrrr} 1 & 0 & -1 & 0 & 2 \\ 0 & 1 & 2 & 0 & 3 \\ 0 & 0 & 0 & 1 & -1 \\ 0 & 0 & 0 & 0 & 0 \\ \end{array}\right]\text{.} \end{equation*}

      Notice that the third column does not contain a pivot position. Let's focus on the first three columns and consider them as an augmented matrix:

      \begin{equation*} \left[\begin{array}{rr|r} \mathbf v_1 & \mathbf v_2 & \mathbf v_3 \end{array}\right] \sim \left[\begin{array}{rr|r} 1 & 0 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ \end{array}\right]\text{.} \end{equation*}

      There is not a pivot in the rightmost column, so we know that \(\mathbf v_3\) can be written as a linear combination of \(\mathbf v_1\) and \(\mathbf v_2\text{.}\) In fact, we can read the weights from the augmented matrix:

      \begin{equation*} \mathbf v_3 = -\mathbf v_1 + 2\mathbf v_2\text{.} \end{equation*}

      This says that the set of vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_5\) is linearly dependent.

      This points to the general observation that a set of vectors is linearly dependent if the matrix they form has a column without a pivot.

      In addition, the fifth column of this matrix does not contain a pivot, meaning that \(\mathbf v_5\) can be written as a linear combination:

      \begin{equation*} \mathbf v_5 = 2\mathbf v_1 + 3\mathbf v_2 -\mathbf v_4\text{.} \end{equation*}

      This example shows that vectors in columns that do not contain a pivot may be expressed as a linear combination of the vectors in columns that do contain pivots. In this case, we have

      \begin{equation*} \laspan{\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4,\mathbf v_5} =\laspan{\mathbf v_1,\mathbf v_2,\mathbf v_4}\text{.} \end{equation*}
    • Conversely, the second set of vectors we studied produces a matrix with a pivot in every column.
      \begin{equation*} \left[\begin{array}{rrr} \mathbf w_1 & \mathbf w_2 & \mathbf w_3 \\ \end{array}\right] \sim \left[\begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ \end{array}\right]\text{.} \end{equation*}

      If we interpret this as an augmented matrix again, we see that the linear system is inconsistent since there is a pivot in the rightmost column. This means that \(\mathbf w_3\) cannot be expressed as a linear combination of \(\mathbf w_1\) and \(\mathbf w_2\text{.}\)

      Similarly, \(\mathbf w_2\) cannot be expressed as a linear combination of \(\mathbf w_1\text{.}\) In addition, if \(\mathbf w_2\) could be expressed as a linear combination of \(\mathbf w_1\) and \(\mathbf w_3\text{,}\) we could rearrange that expression to write \(\mathbf w_3\) as a linear combination of \(\mathbf w_1\) and \(\mathbf w_2\text{,}\) which we know is impossible.

      We can therefore conclude that \(\mathbf w_1\text{,}\) \(\mathbf w_2\text{,}\) and \(\mathbf w_3\) form a linearly independent set of vectors; both conclusions are verified computationally below.
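
    Since row operations preserve the linear relationships among the columns of a matrix, we may check both conclusions by working directly with the given reduced row echelon forms. Here is a short sketch, again using SymPy (our own tool choice), that finds the pivot columns and reads the weights for a non-pivot column.

    ```python
    from sympy import Matrix

    # RREF from the first part of the activity
    R = Matrix([[1, 0, -1, 0,  2],
                [0, 1,  2, 0,  3],
                [0, 0,  0, 1, -1],
                [0, 0,  0, 0,  0]])
    print(R.rref()[1])   # (0, 1, 3): columns 3 and 5 lack pivots, so dependent

    # The entries of a non-pivot column are the weights on the pivot columns
    print(R.col(2) == -R.col(0) + 2*R.col(1))   # True, mirroring v3 = -v1 + 2v2

    # RREF from the second part: a pivot in every column, so independent
    S = Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 1], [0, 0, 0]])
    print(S.rref()[1])   # (0, 1, 2)
    ```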

    This leads to the following proposition.

    Proposition 2.4.2.

    The columns of a matrix are linearly independent if and only if every column contains a pivot position.

    This condition imposes a constraint on how many vectors we can have in a linearly independent set. Here is an example of the reduced row echelon form of a matrix having linearly independent columns. Notice that there are three vectors in \(\mathbb R^5\) so there are at least as many rows as columns.

    \begin{equation*} \left[\begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ \end{array}\right]\text{.} \end{equation*}

    More generally, suppose that \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is a linearly independent set of vectors in \(\mathbb R^m\text{.}\) When these vectors form the columns of a matrix, there must be a pivot position in every column of the matrix. Since every row contains at most one pivot position, the number of columns can be no greater than the number of rows. This means that the number of vectors in a linearly independent set can be no greater than the number of dimensions.

    Proposition 2.4.3.

    A linearly independent set of vectors in \(\mathbb R^m\) can contain no more than \(m\) vectors.

    This says, for instance, that any linearly independent set of vectors in \(\mathbb R^3\) can contain no more than three vectors. Once again, this makes intuitive sense. We usually imagine three independent directions, such as up/down, front/back, and left/right, in our three-dimensional world. This proposition tells us that there can be no more independent directions.
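
    The proposition is easy to test empirically. In the sketch below (an illustration of ours, with randomly chosen vectors), the reduced row echelon form of any four vectors in \(\mathbb R^3\) has at most three pivots, one per row, so at least one column must lack a pivot and the set is linearly dependent.

    ```python
    from sympy import Matrix
    import random

    # Four random vectors in R^3: at most one pivot per row means at most three pivots
    cols = [Matrix([random.randint(-5, 5) for _ in range(3)]) for _ in range(4)]
    M = Matrix.hstack(*cols)
    _, pivots = M.rref()
    print(len(pivots) <= 3)   # always True, so the four columns are dependent
    ```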

    The homogeneous equation

    Given an \(m\times n\) matrix \(A\text{,}\) we call the equation \(A\mathbf x = \zerovec\) a homogeneous equation. The solutions to this equation reflect whether the columns of \(A\) are linearly independent or not.

    Activity 2.4.3. Linear independence and homogeneous equations.

    1. Explain why the homogeneous equation \(A\mathbf x = \zerovec\) is consistent no matter the matrix \(A\text{.}\)
    2. Consider the matrix
      \begin{equation*} A = \left[\begin{array}{rrr} 3 & 2 & 0 \\ -1 & 0 & -2 \\ 2 & 1 & 1 \end{array}\right] \end{equation*}

      whose columns we denote by \(\mathbf v_1\text{,}\) \(\mathbf v_2\text{,}\) and \(\mathbf v_3\text{.}\) Are the vectors \(\mathbf v_1\text{,}\) \(\mathbf v_2\text{,}\) and \(\mathbf v_3\) linearly dependent or independent?

    3. Give a description of the solution space of the homogeneous equation \(A\mathbf x = \zerovec\text{.}\)
    4. We know that \(\zerovec\) is a solution to the homogeneous equation. Find another solution that is different from \(\zerovec\text{.}\) Use your solution to find weights \(c_1\text{,}\) \(c_2\text{,}\) and \(c_3\) such that
      \begin{equation*} c_1\mathbf v_1 + c_2\mathbf v_2 + c_3\mathbf v_3 = \zerovec\text{.} \end{equation*}
    5. Use the expression you found in the previous part to write one of the vectors as a linear combination of the others.

    For any matrix \(A\text{,}\) we know that the equation \(A\mathbf x = \zerovec\) has at least one solution, namely, the vector \(\mathbf x = \zerovec\text{.}\) We call this the trivial solution to the homogeneous equation, so that any other solution that exists is a nontrivial solution.

    If there is no nontrivial solution, then \(A\mathbf x = \zerovec\) has exactly one solution. There can be no free variables in a description of the solution space, so \(A\) must have a pivot position in every column. In this case, the columns of \(A\) must be linearly independent.

    If, however, there is a nontrivial solution, then there are infinitely many solutions, so \(A\) must have a column without a pivot position. Hence, the columns of \(A\) are linearly dependent.

    Example 2.4.4

    We will make the connection between solutions to the homogeneous equation and the linear independence of the columns more explicit by looking at an example. In particular, we will demonstrate how a nontrivial solution to the homogeneous equation shows that one column of \(A\) is a linear combination of the others. With the matrix \(A\) in the previous activity, the homogeneous equation has the reduced row echelon form

    \begin{equation*} \left[\begin{array}{rrr|r} 3 & 2 & 0 & 0 \\ -1 & 0 & -2 & 0 \\ 2 & 1 & 1 & 0 \\ \end{array}\right] \sim \left[\begin{array}{rrr|r} 1 & 0 & 2 & 0 \\ 0 & 1 & -3 & 0 \\ 0 & 0 & 0 & 0 \\ \end{array}\right]\text{,} \end{equation*}

    which implies that

    \begin{equation*} \begin{alignedat}{4} x_1 & & & {}+{} & 2x_3 & {}={} & 0 \\ & & x_2 & {}-{} & 3x_3 & {}={} & 0 \\ \end{alignedat}\text{.} \end{equation*}

    In terms of the free variable \(x_3\text{,}\) we have

    \begin{equation*} \begin{aligned} x_1 & {}={} -2x_3 \\ x_2 & {}={} 3x_3 \\ \end{aligned}\text{.} \end{equation*}

    Any choice for a value of the free variable \(x_3\) produces a solution, so let's choose, for convenience, \(x_3=1\text{.}\) We then have \((x_1,x_2,x_3) = (-2,3,1)\text{.}\)

    Since \((-2,3,1)\) is a solution to the homogeneous equation \(A\mathbf x=\zerovec\text{,}\) this solution gives weights for a linear combination of the columns of \(A\) that creates \(\zerovec\text{.}\) That is,

    \begin{equation*} -2\mathbf v_1 + 3\mathbf v_2 + \mathbf v_3 = \zerovec\text{,} \end{equation*}

    which we rewrite as

    \begin{equation*} \mathbf v_3 = 2\mathbf v_1 - 3\mathbf v_2\text{.} \end{equation*}
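
    If we wish to check this example computationally, SymPy's nullspace routine (again, our tool choice rather than the text's) returns a basis for the solution space of \(A\mathbf x = \zerovec\) with each free variable set to 1, and so reproduces the nontrivial solution found above.

    ```python
    from sympy import Matrix

    A = Matrix([[ 3, 2,  0],
                [-1, 0, -2],
                [ 2, 1,  1]])

    # A basis for the solutions of A x = 0; the free variable x3 is set to 1
    print(A.nullspace())       # [Matrix([[-2], [3], [1]])], the solution (-2, 3, 1)

    # The nontrivial solution packages the dependence -2*v1 + 3*v2 + v3 = 0
    v1, v2, v3 = A.col(0), A.col(1), A.col(2)
    print(v3 == 2*v1 - 3*v2)   # True
    ```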

    As this example demonstrates, there are many ways we can view the question of linear independence. We will record some of these ways in the following proposition.

    Proposition 2.4.5.

    For a matrix \(A = \left[\begin{array}{rrrr} \mathbf v_1& \mathbf v_2& \ldots& \mathbf v_n \end{array}\right] \text{,}\) the following statements are equivalent:

    • The columns of \(A\) are linearly dependent.
    • One of the vectors in the set \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is a linear combination of the others.
    • The matrix \(A\) has a column without a pivot position.
    • The homogeneous equation \(A\mathbf x = \zerovec\) has a nontrivial solution.
    • There are weights \(c_1,c_2,\ldots,c_n\text{,}\) not all of which are zero, such that
      \begin{equation*} c_1\mathbf v_1 + c_2\mathbf v_2 + \ldots + c_n\mathbf v_n = \zerovec\text{.} \end{equation*}

    Summary

    In this section, we developed the concept of linear dependence of a set of vectors. At the beginning of the section, we said that this concept addressed the second of our fundamental questions, expressed in Question 1.4.2, concerning the uniqueness of solutions to a linear system. It is worth comparing the results of this section with those of the previous one so that the parallels between them become clear.

    As is usual, we will write a matrix as a collection of vectors,

    \begin{equation*} A=\left[\begin{array}{rrrr} \mathbf v_1& \mathbf v_2 & \ldots & \mathbf v_n \end{array}\right]. \end{equation*}
    Existence

    In the previous section, we asked if we could write a vector \(\mathbf b\) as a linear combination of the columns of \(A\text{,}\) which happens precisely when a solution to the equation \(A\mathbf x = \mathbf b\) exists. We saw that every vector \(\mathbf b\) could be expressed as a linear combination of the columns of \(A\) when \(A\) has a pivot position in every row. In this case, we said that the span of the vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is \(\mathbb R^m\text{.}\) We saw that at least \(m\) vectors are needed to span \(\mathbb R^m\text{.}\)

    Uniqueness

    In this section, we studied the uniqueness of solutions to the equation \(A\mathbf x = \zerovec\text{,}\) which is always consistent. When a nontrivial solution exists, we saw that one vector of the set \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is a linear combination of the others, in which case we said that the set of vectors is linearly dependent. This happens when the matrix \(A\) has a column without a pivot position. We saw that there can be no more than \(m\) vectors in a set of linearly independent vectors in \(\mathbb R^m\text{.}\)

    To summarize the specific results of this section, we saw that:

    • A set of vectors is linearly dependent if one of the vectors is a linear combination of the others.
    • A set of vectors is linearly independent if and only if the vectors form a matrix that has a pivot position in every column.
    • A set of linearly independent vectors in \(\mathbb R^m\) contains no more than \(m\) vectors.
    • The columns of the matrix \(A\) are linearly dependent if the homogeneous equation \(A\mathbf x = \zerovec\) has a nontrivial solution.
    • A set of vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is linearly dependent if there are weights \(c_1,c_2,\ldots,c_n\text{,}\) not all of which are zero, such that
      \begin{equation*} c_1\mathbf v_1 + c_2\mathbf v_2 + \ldots + c_n\mathbf v_n = \zerovec\text{.} \end{equation*}

    Exercises

    1

    Consider the set of vectors

    \begin{equation*} \mathbf v_1 = \threevec{1}{2}{1}, \mathbf v_2 = \threevec{0}{1}{3}, \mathbf v_3 = \threevec{2}{3}{-1}, \mathbf v_4 = \threevec{-2}{4}{-1}\text{.} \end{equation*}
    1. Explain why this set of vectors is linearly dependent.
    2. Write one of the vectors as a linear combination of the others.
    3. Find weights \(c_1\text{,}\) \(c_2\text{,}\) \(c_3\text{,}\) and \(c_4\text{,}\) not all of which are zero, such that
      \begin{equation*} c_1\mathbf v_1 + c_2 \mathbf v_2 + c_3 \mathbf v_3 + c_4 \mathbf v_4 = \zerovec\text{.} \end{equation*}
    4. Find a nontrivial solution to the homogeneous equation \(A\mathbf x = \zerovec\) where \(A=\left[\begin{array}{rrrr} \mathbf v_1& \mathbf v_2& \mathbf v_3& \mathbf v_4 \end{array}\right]\text{.}\)
    2

    Consider the vectors

    \begin{equation*} \mathbf v_1 = \threevec{2}{-1}{0}, \mathbf v_2 = \threevec{1}{2}{1}, \mathbf v_3 = \threevec{2}{-2}{3}\text{.} \end{equation*}
    1. Are these vectors linearly independent or linearly dependent?
    2. Describe \(\laspan{\mathbf v_1,\mathbf v_2,\mathbf v_3}\text{.}\)
    3. Suppose that \(\mathbf b\) is a vector in \(\mathbb R^3\text{.}\) Explain why we can guarantee that \(\mathbf b\) may be written as a linear combination of \(\mathbf v_1\text{,}\) \(\mathbf v_2\text{,}\) and \(\mathbf v_3\text{.}\)
    4. Suppose that \(\mathbf b\) is a vector in \(\mathbb R^3\text{.}\) In how many ways can \(\mathbf b\) be written as a linear combination of \(\mathbf v_1\text{,}\) \(\mathbf v_2\text{,}\) and \(\mathbf v_3\text{?}\)
    3

    Answer the following questions and provide a justification for your responses.

    1. If the vectors \(\mathbf v_1\) and \(\mathbf v_2\) form a linearly dependent set, must one vector be a scalar multiple of the other?
    2. Suppose that \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is a linearly independent set of vectors. What can you say about the linear independence or dependence of a subset of these vectors?
    3. Suppose \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is a linearly independent set of vectors that form the columns of a matrix \(A\text{.}\) If the equation \(A\mathbf x = \mathbf b\) is inconsistent, what can you say about the linear independence or dependence of the set of vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n,\mathbf b\text{?}\)
    4

    Determine if the following statements are true or false and provide a justification for your response.

    1. If \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) are linearly dependent, then one vector is a scalar multiple of one of the others.
    2. If \(\mathbf v_1, \mathbf v_2, \ldots, \mathbf v_{10}\) are vectors in \(\mathbb R^5\text{,}\) then the set of vectors is linearly dependent.
    3. If \(\mathbf v_1, \mathbf v_2, \ldots, \mathbf v_{5}\) are vectors in \(\mathbb R^{10}\text{,}\) then the set of vectors is linearly independent.
    4. Suppose we have a set of vectors \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) and that \(\mathbf v_2\) is a scalar multiple of \(\mathbf v_1\text{.}\) Then the set is linearly dependent.
    5. Suppose that \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) are linearly independent and form the columns of a matrix \(A\text{.}\) If \(A\mathbf x = \mathbf b\) is consistent, then there is exactly one solution.
    5

    Suppose we have a set of vectors \(\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4\) in \(\mathbb R^5\) that satisfy the relationship:

    \begin{equation*} 2\mathbf v_1 - \mathbf v_2 + 3\mathbf v_3 + \mathbf v_4 = \zerovec \end{equation*}

    and suppose that \(A\) is the matrix \(A=\left[\begin{array}{rrrr} \mathbf v_1& \mathbf v_2& \mathbf v_3& \mathbf v_4 \end{array}\right] \text{.}\)

    1. Find a nontrivial solution to the equation \(A\mathbf x = \zerovec\text{.}\)
    2. Explain why the matrix \(A\) has a column without a pivot position.
    3. Write one of the vectors as a linear combination of the others.
    4. Explain why the set of vectors is linearly dependent.
    6

    Suppose that \(\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_n\) is a set of vectors in \(\mathbb R^{27}\) that form the columns of a matrix \(A\text{.}\)

    1. Suppose that the vectors span \(\mathbb R^{27}\text{.}\) What can you say about the number of vectors \(n\) in this set?
    2. Suppose instead that the vectors are linearly independent. What can you say about the number of vectors \(n\) in this set?
    3. Suppose that the vectors are both linearly independent and span \(\mathbb R^{27}\text{.}\) What can you say about the number of vectors in the set?
    4. Assume that the vectors are both linearly independent and span \(\mathbb R^{27}\text{.}\) Given a vector \(\mathbf b\) in \(\mathbb R^{27}\text{,}\) what can you say about the solution space to the equation \(A\mathbf x = \mathbf b\text{?}\)
    7

    Given below are some descriptions of sets of vectors that form the columns of a matrix \(A\text{.}\) For each description, give a possible reduced row echelon form for \(A\) or indicate why there is no set of vectors satisfying the description by stating why the required reduced row echelon matrix cannot exist.

    1. A set of 4 linearly independent vectors in \(\mathbb R^5\text{.}\)
    2. A set of 4 linearly independent vectors in \(\mathbb R^4\text{.}\)
    3. A set of 3 vectors that span \(\mathbb R^4\text{.}\)
    4. A set of 5 linearly independent vectors in \(\mathbb R^3\text{.}\)
    5. A set of 5 vectors that span \(\mathbb R^4\text{.}\)
    8

    When we explored matrix multiplication in Section 2.2, we saw that some properties that are true for real numbers are not true for matrices. This exercise investigates that behavior in more depth.

    1. Suppose that \(A\) and \(B\) are two matrices and that \(AB = 0\text{.}\) If \(B \neq 0\text{,}\) what can you say about the linear independence of the columns of \(A\text{?}\)
    2. Suppose that we have matrices \(A\text{,}\) \(B\) and \(C\) such that \(AB = AC\text{.}\) We have seen that we cannot generally conclude that \(B=C\text{.}\) If we assume additionally that \(A\) is a matrix whose columns are linearly independent, explain why \(B = C\text{.}\) You may wish to begin by rewriting the equation \(AB = AC\) as \(AB-AC = A(B-C) = 0\text{.}\)
    9

    Suppose that \(k\) is an unknown parameter and consider the set of vectors

    \begin{equation*} \mathbf v_1 = \threevec{2}{0}{1}, \mathbf v_2 = \threevec{4}{-2}{-1}, \mathbf v_3 = \threevec{0}{2}{k}\text{.} \end{equation*}
    1. For what values of \(k\) is the set of vectors linearly dependent?
    2. For what values of \(k\) does the set of vectors span \(\mathbb R^3\text{?}\)
    10

    Given a set of linearly dependent vectors, we can eliminate some of the vectors to create a smaller, linearly independent set of vectors.

    1. Suppose that \(\mathbf w\) is a linear combination of the vectors \(\mathbf v_1\) and \(\mathbf v_2\text{.}\) Explain why \(\laspan{\mathbf v_1,\mathbf v_2, \mathbf w} = \laspan{\mathbf v_1,\mathbf v_2}\text{.}\)
    2. Consider the vectors
      \begin{equation*} \mathbf v_1 = \threevec{2}{-1}{0}, \mathbf v_2 = \threevec{1}{2}{1}, \mathbf v_3 = \threevec{-2}{6}{2}, \mathbf v_4 = \threevec{7}{-1}{1}\text{.} \end{equation*}

      Write one of the vectors as a linear combination of the others. Find a set of three vectors whose span is the same as \(\laspan{\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4}\text{.}\)

    3. Are the three vectors you are left with linearly independent? If not, express one of the vectors as a linear combination of the others and find a set of two vectors whose span is the same as \(\laspan{\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4}\text{.}\)
    4. Give a geometric description of \(\laspan{\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4}\) in \(\mathbb R^3\) as we did in Section 2.3.

    This page titled 2.4: Linear independence is shared under a CC BY 4.0 license and was authored, remixed, and/or curated by David Austin via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.
