# 9.1: Linear Independence

One of the core concepts in linear algebra is *linear independence*, and this concept translates to general vector spaces with no difficulty.

Definition 9.1.0

Let \(V\) be a vector space over a field \(k\), and let \(S\) be a set of vectors in \(V\). A *linear combination* of elements of \(S\) is any **finite** sum \(\sum_{s\in S}c_s s\) with coefficients \(c_s\in k\). (If \(S\) is an infinite set, then all but finitely many of the \(c_s\) must be equal to \(0\), so that the sum is finite.)
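As a concrete sketch, here is a linear combination computed in NumPy, taking \(V=\mathbb{R}^3\) and \(k=\mathbb{R}\) (an illustrative choice; the definition works over any field). The vectors `s1`, `s2` and coefficients `c1`, `c2` are made up for the example.

```python
import numpy as np

# Two vectors in V = R^3 (arbitrary choices for illustration).
s1 = np.array([1.0, 0.0, 2.0])
s2 = np.array([0.0, 1.0, -1.0])

# Coefficients c_s in the field k = R.
c1, c2 = 3.0, -2.0

# The finite sum  sum_{s in S} c_s * s.
combo = c1 * s1 + c2 * s2
print(combo)  # -> [ 3. -2.  8.]
```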

Definition 9.1.1

Let \(S\) be a set of vectors in a vector space \(V\). Then we say that \(S\) is *linearly dependent* if there exists a linear combination of elements of \(S\), with not all coefficients equal to \(0\), that is equal to the \(0\)-vector. Otherwise, we say that \(S\) is *linearly independent*. (The "not all coefficients \(0\)" condition is essential: the trivial combination with every coefficient \(0\) always sums to \(0\), for any set \(S\).)
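For finite sets of vectors in \(\mathbb{R}^n\), dependence can be tested numerically: the set is linearly dependent exactly when the matrix whose rows are the vectors has rank smaller than the number of vectors. A minimal sketch (the helper name `is_linearly_dependent` is our own, and `matrix_rank` is a floating-point check, not a proof):

```python
import numpy as np

def is_linearly_dependent(vectors):
    """Return True if the given vectors in R^n are linearly dependent.

    A finite set is dependent iff the rank of the matrix with the
    vectors as rows is less than the number of vectors.
    """
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) < len(vectors)

print(is_linearly_dependent([[1, 0], [0, 1]]))  # False: independent
print(is_linearly_dependent([[1, 2], [2, 4]]))  # True: second row = 2 * first
```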

Example 9.1.1

Let \(\mathbb{R}^\infty\) be the vector space of sequences of elements of \(\mathbb{R}\) (i.e., the space of sequences \(r=(r_1, r_2, r_3, \ldots)\), with coordinate-wise addition and the usual scalar multiplication). Let \(e_i\in \mathbb{R}^\infty\) be the sequence with \((e_i)_i=1\) and \((e_i)_j=0\) for all \(j\neq i\). Let \(n\) be the element \((-1, -1, -1, \ldots)\). Now, let \(S\) be the set consisting of all the \(e_i\) together with \(n\). This is actually a linearly independent set. You might note that the sum of all of the elements of \(S\) (with all coefficients in the sum equal to \(1\)) seems to be the \(0\)-vector. But this is an infinite sum, and is thus not considered a linear combination of elements of \(S\).
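Since a linear combination uses only finitely many elements of \(S\), independence of \(S\) comes down to checking finite subsets. As an illustrative sketch, we take the subset \(\{e_1, e_2, e_3, n\}\) and truncate each sequence to its first four coordinates; for this particular subset the truncation loses nothing, because \(n\) has a nonzero fourth coordinate while \(e_1, e_2, e_3\) vanish there. (The choice of subset and truncation length is ours, for demonstration only.)

```python
import numpy as np

# Truncations of e_1, e_2, e_3 and n = (-1, -1, -1, ...) to 4 coordinates.
e1 = np.array([1.0, 0.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0, 0.0])
n = np.array([-1.0, -1.0, -1.0, -1.0])

# Rank 4 means the four truncated vectors are linearly independent.
A = np.vstack([e1, e2, e3, n])
rank = np.linalg.matrix_rank(A)
print(rank)  # -> 4
```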

### Contributors

- Tom Denton (Fields Institute/York University in Toronto)