9.3: Linear Independence
- Determine if a set is linearly independent.
In this section, we will again explore concepts introduced earlier in terms of \(\mathbb{R}^n\) and extend them to apply to abstract vector spaces.
Let \(V\) be a vector space. If \(\{ \vec{v}_1, \cdots, \vec{v}_n \} \subseteq V\), then it is linearly independent if
\[
\sum_{i=1}^{n} a_i \vec{v}_i = \vec{0} \quad \text{implies} \quad a_1 = \cdots = a_n = 0
\]
where the \(a_i\) are real numbers.
The set of vectors is called linearly dependent if it is not linearly independent.
Let \(S \subseteq \mathbb{P}_2\) be a set of polynomials given by
\[
S = \left\{ x^2 + 2x - 1, \; 2x^2 - x + 3 \right\}
\]
Determine if \(S\) is linearly independent.
Solution
To determine if this set \(S\) is linearly independent, we write
\[
a(x^2 + 2x - 1) + b(2x^2 - x + 3) = 0x^2 + 0x + 0
\]
If it is linearly independent, then \(a = b = 0\) will be the only solution. We proceed as follows.
\[
\begin{aligned}
a(x^2 + 2x - 1) + b(2x^2 - x + 3) &= 0x^2 + 0x + 0 \\
ax^2 + 2ax - a + 2bx^2 - bx + 3b &= 0x^2 + 0x + 0 \\
(a + 2b)x^2 + (2a - b)x + (-a + 3b) &= 0x^2 + 0x + 0
\end{aligned}
\]
Since a polynomial equals the zero polynomial exactly when all of its coefficients are zero, it follows that
\[
\begin{aligned}
a + 2b &= 0 \\
2a - b &= 0 \\
-a + 3b &= 0
\end{aligned}
\]
The augmented matrix and resulting reduced row-echelon form are given by
\[
\left[ \begin{array}{rr|r} 1 & 2 & 0 \\ 2 & -1 & 0 \\ -1 & 3 & 0 \end{array} \right]
\rightarrow \cdots \rightarrow
\left[ \begin{array}{rr|r} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{array} \right]
\]
Hence the solution is \(a = b = 0\) and the set is linearly independent.
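For readers who want to verify this computation, the following is a minimal sketch using Python's SymPy library (an illustration only, not part of the original development); it row-reduces the coefficient matrix of the system above.

```python
from sympy import Matrix

# Coefficient matrix of a + 2b = 0, 2a - b = 0, -a + 3b = 0
# (one row per power of x after matching coefficients).
A = Matrix([[ 1,  2],
            [ 2, -1],
            [-1,  3]])

rref, pivots = A.rref()
print(rref)    # Matrix([[1, 0], [0, 1], [0, 0]])
print(pivots)  # (0, 1): every column is a pivot column, so a = b = 0 is the only solution
```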
The next example shows us what it means for a set to be dependent.
Determine if the set \(S\) given below is independent.
\[
S = \left\{ \left[ \begin{array}{r} -1 \\ 0 \\ 1 \end{array} \right], \left[ \begin{array}{r} 1 \\ 1 \\ 1 \end{array} \right], \left[ \begin{array}{r} 1 \\ 3 \\ 5 \end{array} \right] \right\}
\]
Solution
To determine if \(S\) is linearly independent, we look for solutions to
\[
a \left[ \begin{array}{r} -1 \\ 0 \\ 1 \end{array} \right] + b \left[ \begin{array}{r} 1 \\ 1 \\ 1 \end{array} \right] + c \left[ \begin{array}{r} 1 \\ 3 \\ 5 \end{array} \right] = \left[ \begin{array}{r} 0 \\ 0 \\ 0 \end{array} \right]
\]
Notice that this equation has nontrivial solutions, for example \(a = 2\), \(b = 3\) and \(c = -1\). Therefore \(S\) is dependent.
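One way such a nontrivial solution can be found is by computing the null space of the matrix whose columns are the vectors of \(S\). The sketch below, assuming SymPy, does exactly that.

```python
from sympy import Matrix

# Columns are the three vectors of S.
A = Matrix([[-1, 1, 1],
            [ 0, 1, 3],
            [ 1, 1, 5]])

print(A.nullspace())
# [Matrix([[-2], [-3], [1]])]: a nonzero solution of the homogeneous system;
# scaling by -1 gives a = 2, b = 3, c = -1, so S is dependent.
```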
The following is an important result regarding dependent sets.
Let \(V\) be a vector space and suppose \(W = \{ \vec{v}_1, \vec{v}_2, \cdots, \vec{v}_k \}\) is a subset of \(V\). Then \(W\) is dependent if and only if \(\vec{v}_i\) can be written as a linear combination of \(\{ \vec{v}_1, \vec{v}_2, \cdots, \vec{v}_{i-1}, \vec{v}_{i+1}, \cdots, \vec{v}_k \}\) for some \(i \leq k\).
Revisit Example 9.3.2 with this in mind. Notice that we can write one of the three vectors as a combination of the others.
\[
\left[ \begin{array}{r} 1 \\ 3 \\ 5 \end{array} \right] = 2 \left[ \begin{array}{r} -1 \\ 0 \\ 1 \end{array} \right] + 3 \left[ \begin{array}{r} 1 \\ 1 \\ 1 \end{array} \right]
\]
By Lemma 9.3.1 this set is dependent.
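A quick check of this combination, again assuming SymPy (plain arithmetic works just as well):

```python
from sympy import Matrix

v1 = Matrix([-1, 0, 1])
v2 = Matrix([ 1, 1, 1])
v3 = Matrix([ 1, 3, 5])

print(2*v1 + 3*v2 == v3)  # True: v3 is a linear combination of v1 and v2
```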
If we know that one particular set is linearly independent, we can use this information to determine if a related set is linearly independent. Consider the following example.
Let \(V\) be a vector space and suppose \(S \subseteq V\) is a set of linearly independent vectors given by \(S = \{ \vec{u}, \vec{v}, \vec{w} \}\). Let \(R \subseteq V\) be given by
\[
R = \left\{ 2\vec{u} - \vec{w}, \; \vec{w} + \vec{v}, \; 3\vec{v} + \frac{1}{2}\vec{u} \right\}
\]
Show that \(R\) is also linearly independent.
Solution
To determine if \(R\) is linearly independent, we write
\[
a(2\vec{u} - \vec{w}) + b(\vec{w} + \vec{v}) + c\left(3\vec{v} + \frac{1}{2}\vec{u}\right) = \vec{0}
\]
If the set is linearly independent, the only solution will be \(a = b = c = 0\). We proceed as follows.
\[
\begin{aligned}
a(2\vec{u} - \vec{w}) + b(\vec{w} + \vec{v}) + c\left(3\vec{v} + \frac{1}{2}\vec{u}\right) &= \vec{0} \\
2a\vec{u} - a\vec{w} + b\vec{w} + b\vec{v} + 3c\vec{v} + \frac{1}{2}c\vec{u} &= \vec{0} \\
\left(2a + \frac{1}{2}c\right)\vec{u} + (b + 3c)\vec{v} + (-a + b)\vec{w} &= \vec{0}
\end{aligned}
\]
We know that the set \(S = \{ \vec{u}, \vec{v}, \vec{w} \}\) is linearly independent, which implies that the coefficients in the last line of this equation must all equal \(0\). In other words:
\[
\begin{aligned}
2a + \tfrac{1}{2}c &= 0 \\
b + 3c &= 0 \\
-a + b &= 0
\end{aligned}
\]
The augmented matrix and resulting reduced row-echelon form are given by:
\[
\left[ \begin{array}{rrr|r} 2 & 0 & \frac{1}{2} & 0 \\ 0 & 1 & 3 & 0 \\ -1 & 1 & 0 & 0 \end{array} \right]
\rightarrow \cdots \rightarrow
\left[ \begin{array}{rrr|r} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{array} \right]
\]
Hence the solution is \(a = b = c = 0\) and the set is linearly independent.
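As before, a short SymPy sketch (illustrative only) confirms the row reduction; it also computes the determinant of the coefficient matrix, which is nonzero exactly when the homogeneous system has only the trivial solution.

```python
from sympy import Matrix, Rational

# Coefficients of a, b, c in 2a + (1/2)c = 0, b + 3c = 0, -a + b = 0.
A = Matrix([[ 2, 0, Rational(1, 2)],
            [ 0, 1, 3],
            [-1, 1, 0]])

rref, pivots = A.rref()
print(rref)     # the 3x3 identity: the only solution is a = b = c = 0
print(A.det())  # -11/2, nonzero, which also confirms that R is independent
```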
The following theorem was discussed earlier in terms of \(\mathbb{R}^n\). We consider it here in the general case.
Let \(V\) be a vector space and let \(U = \{ \vec{v}_1, \cdots, \vec{v}_k \} \subseteq V\) be an independent set. If \(\vec{v} \in \mathrm{span}\, U\), then \(\vec{v}\) can be written uniquely as a linear combination of the vectors in \(U\).
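In brief, uniqueness holds because two representations of \(\vec{v}\) would differ by a dependence relation on \(U\):
\[
\vec{v} = \sum_{i=1}^{k} a_i \vec{v}_i = \sum_{i=1}^{k} b_i \vec{v}_i
\;\implies\;
\sum_{i=1}^{k} (a_i - b_i) \vec{v}_i = \vec{0}
\;\implies\;
a_i = b_i \text{ for each } i,
\]
since \(U\) is independent.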
Consider the span of a linearly independent set of vectors. Suppose we take a vector which is not in this span and add it to the set. The following lemma claims that the resulting set is still linearly independent.
Suppose \(\vec{v} \notin \mathrm{span} \{ \vec{u}_1, \cdots, \vec{u}_k \}\) and \(\{ \vec{u}_1, \cdots, \vec{u}_k \}\) is linearly independent. Then the set \(\{ \vec{u}_1, \cdots, \vec{u}_k, \vec{v} \}\) is also linearly independent.
Proof
Suppose \(\sum_{i=1}^{k} c_i \vec{u}_i + d\vec{v} = \vec{0}\). It is required to verify that each \(c_i = 0\) and that \(d = 0\). But if \(d \neq 0\), then you can solve for \(\vec{v}\) as a linear combination of the vectors \(\{ \vec{u}_1, \cdots, \vec{u}_k \}\),
\[
\vec{v} = -\sum_{i=1}^{k} \left( \frac{c_i}{d} \right) \vec{u}_i
\]
contrary to the assumption that \(\vec{v}\) is not in the span of the \(\vec{u}_i\). Therefore, \(d = 0\). But then \(\sum_{i=1}^{k} c_i \vec{u}_i = \vec{0}\), and the linear independence of \(\{ \vec{u}_1, \cdots, \vec{u}_k \}\) implies each \(c_i = 0\) also.
Consider the following example.
Let \(S \subseteq M_{22}\) be a linearly independent set given by
\[
S = \left\{ \left[ \begin{array}{rr} 1 & 0 \\ 0 & 0 \end{array} \right], \left[ \begin{array}{rr} 0 & 1 \\ 0 & 0 \end{array} \right] \right\}
\]
Show that the set \(R \subseteq M_{22}\) given by
\[
R = \left\{ \left[ \begin{array}{rr} 1 & 0 \\ 0 & 0 \end{array} \right], \left[ \begin{array}{rr} 0 & 1 \\ 0 & 0 \end{array} \right], \left[ \begin{array}{rr} 0 & 0 \\ 1 & 0 \end{array} \right] \right\}
\]
is also linearly independent.
Solution
Instead of writing a linear combination of the matrices which equals \(0\) and showing that the coefficients must equal \(0\), we can use Lemma 9.3.2.
To do so, we show that
\[
\left[ \begin{array}{rr} 0 & 0 \\ 1 & 0 \end{array} \right] \notin \mathrm{span} \left\{ \left[ \begin{array}{rr} 1 & 0 \\ 0 & 0 \end{array} \right], \left[ \begin{array}{rr} 0 & 1 \\ 0 & 0 \end{array} \right] \right\}
\]
Write
\[
\left[ \begin{array}{rr} 0 & 0 \\ 1 & 0 \end{array} \right] = a \left[ \begin{array}{rr} 1 & 0 \\ 0 & 0 \end{array} \right] + b \left[ \begin{array}{rr} 0 & 1 \\ 0 & 0 \end{array} \right] = \left[ \begin{array}{rr} a & 0 \\ 0 & 0 \end{array} \right] + \left[ \begin{array}{rr} 0 & b \\ 0 & 0 \end{array} \right] = \left[ \begin{array}{rr} a & b \\ 0 & 0 \end{array} \right]
\]
Clearly there are no possible \(a, b\) to make this equation true: the \((2,1)\) entry of the right-hand side is always \(0\), while the \((2,1)\) entry of the left-hand side is \(1\). Hence the new matrix does not lie in the span of the matrices in \(S\). By Lemma 9.3.2, \(R\) is also linearly independent.
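The same conclusion can be checked computationally. The sketch below (assuming SymPy, with illustrative variable names) shows that the system for \(a\) and \(b\) has no solution.

```python
from sympy import Matrix, solve, symbols

a, b = symbols('a b')

target = Matrix([[0, 0], [1, 0]])
combo  = a*Matrix([[1, 0], [0, 0]]) + b*Matrix([[0, 1], [0, 0]])

# Entry-by-entry equations; the (2,1) entry gives the impossible equation -1 = 0.
print(solve(list(combo - target), [a, b]))  # []: no solution, so the target is outside span(S)
```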