
We've now seen numerous examples of algebraic structures, which we can think of as sets with some operations which satisfy some axioms. Here's a partial list:

1. Groups,
2. Commutative groups,
3. Group actions,
4. Rings,
5. Commutative rings,
6. Integral domains,
7. Fields,
8. and others...

In this chapter, we'll examine vector spaces as algebraic structures. Vector spaces are massively important because these are algebraic structures where the tools of linear algebra are available. Linear algebra is, in some ways, the branch of mathematics which is best developed: when a problem in science is converted into a linear algebra problem, we have a pretty good chance of being able to solve it. This is why, for example, the technique of linearization which comes up in differential equations and modeling is so important.

In fact, viewing vector spaces as algebraic structures does two things for us.

1. This viewpoint helps us identify more situations as linear algebra situations, allowing us to use our linear algebra tools in a broader set of circumstances, and
2. Abstracting allows us to better identify precisely what tools we are using when we prove statements in linear algebra, so we can identify exactly which situations those tools apply in. As with rings, there is more than one kind of vector space, and some vector spaces are more 'friendly' than others.

So let's see the definition.

Definition 9.0.0: Vector space

A vector space is a set $$V$$ and a field $$k$$ with two operations, addition $$+:V\times V \rightarrow V$$ and scalar multiplication $$\cdot: k\times V\rightarrow V$$, satisfying the following axioms.

1. $$V$$ under addition is a commutative group.
2. (Distributivity I) For any $$c\in k$$ and $$v, w\in V$$, we have $$c(v+w)=cv+cw$$.
3. (Distributivity II) For any $$c, d\in k$$ and $$v\in V$$, we have $$(c+d)v=cv+dv$$.
4. (Associativity) For any $$c, d\in k$$ and $$v\in V$$, we have $$(cd)v=c(dv)$$.
5. (Identity) For any $$v\in V$$, we have $$1v=v$$, where $$1$$ is the multiplicative identity of $$k$$.

The elements of the set $$V$$ are called vectors, and the elements of $$k$$ are called scalars.
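Axioms like these can be spot-checked numerically. Here is a minimal sketch in Python (the names `vadd` and `smul` are illustrative, not standard), using exact rational arithmetic from the standard library so the checks are over the field $$\mathbb{Q}$$, with vectors in $$\mathbb{Q}^2$$:

```python
from fractions import Fraction as F

# Vectors in Q^2 as tuples; both operations are defined coordinate-wise.
def vadd(v, w):
    """Vector addition in Q^2."""
    return tuple(a + b for a, b in zip(v, w))

def smul(c, v):
    """Scalar multiplication of a vector by c in Q."""
    return tuple(c * a for a in v)

c, d = F(2, 3), F(-5, 7)
v, w = (F(1, 2), F(3)), (F(-1), F(4, 9))

# Distributivity I: c(v + w) = cv + cw
assert smul(c, vadd(v, w)) == vadd(smul(c, v), smul(c, w))
# Distributivity II: (c + d)v = cv + dv
assert smul(c + d, v) == vadd(smul(c, v), smul(d, v))
# Associativity: (cd)v = c(dv)
assert smul(c * d, v) == smul(c, smul(d, v))
```

Of course, passing these checks for a few sample vectors is not a proof; a proof must argue from the field axioms of $$\mathbb{Q}$$ for arbitrary vectors and scalars.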

(As an aside: There's another way to think of vector spaces as well. For any ring $$R$$, there is a concept of an $$R$$-module, which is similar to a group action: a module is a commutative group with a ring action. That is to say, the ring pushes around elements of the group in a way that is compatible with both of the ring operations. From this viewpoint, a vector space is just a $$k$$-module where $$k$$ happens to be a field. As a result, $$R$$-modules are a generalization of vector spaces.)

As is traditional, we list some examples. Note that a vector space involves both a set and a field: usually, the choice of field is clear from context, but we'll be specific whenever it isn't. Often, we say that '$$V$$ is a vector space over $$k$$' to mean that $$V$$ is the commutative group and $$k$$ is the field.

1. $$k^n$$ is the vector space whose underlying set is lists of $$n$$ elements of $$k$$, with coordinate-wise addition and $$k$$ acting by coordinate-wise scalar multiplication. This gives rise to the familiar spaces $$\mathbb{R}^n$$ and $$\mathbb{C}^n$$. But we also know about finite fields now: $$\mathbb{Z}_p^n$$, where $$p$$ is prime, is also a vector space.
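As a quick illustration of the finite-field case, arithmetic in $$\mathbb{Z}_5^3$$ can be sketched in Python, with every coordinate operation reduced mod $$p$$ (the function names are illustrative):

```python
p = 5  # a prime, so Z_p is a field

def vadd(v, w):
    """Coordinate-wise addition in Z_p^n."""
    return tuple((a + b) % p for a, b in zip(v, w))

def smul(c, v):
    """A scalar c in Z_p acting on a vector in Z_p^n."""
    return tuple((c * a) % p for a in v)

v, w = (1, 2, 3), (4, 4, 0)
print(vadd(v, w))   # (0, 1, 3)
print(smul(3, v))   # (3, 1, 4)
```

Note that primality of $$p$$ matters only for $$\mathbb{Z}_p$$ being a field; the code itself would run for any modulus.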

2. The set of polynomials $$k[x]$$ in a single variable is a vector space over $$k$$.
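The operations on $$k[x]$$ are coefficient-wise; a sketch in Python, representing a polynomial $$a_0 + a_1x + \cdots$$ as the coefficient list `[a0, a1, ...]` (an illustrative encoding, not a standard library):

```python
def padd(f, g):
    """Add polynomials given as coefficient lists, padding the shorter one."""
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return [a + b for a, b in zip(f, g)]

def psmul(c, f):
    """Multiply each coefficient of the polynomial by the scalar c."""
    return [c * a for a in f]

# (1 + 2x) + 3x^2 = 1 + 2x + 3x^2
print(padd([1, 2], [0, 0, 3]))   # [1, 2, 3]
print(psmul(2, [1, 2, 3]))       # [2, 4, 6]
```

Notice that polynomial multiplication plays no role here: as a vector space, $$k[x]$$ only remembers addition and scaling.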

3. Let $$M_{n,m}(k)$$ denote the set of $$n\times m$$ matrices with entries in $$k$$. Then $$M_{n,m}(k)$$ is a vector space over $$k$$.
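Here too the operations are entry-wise; a minimal sketch in Python, treating a matrix as a list of rows (illustrative names, no external libraries assumed):

```python
def madd(A, B):
    """Entry-wise addition of two n x m matrices given as lists of rows."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def msmul(c, A):
    """Multiply every entry of the matrix A by the scalar c."""
    return [[c * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(madd(A, B))    # [[1, 3], [4, 4]]
print(msmul(3, A))   # [[3, 6], [9, 12]]
```

As with polynomials, matrix multiplication is not part of the vector space structure of $$M_{n,m}(k)$$.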

4. Let $$V$$ be a vector space over $$k$$. Set $$V^*$$ to be the set of linear functions from $$V$$ to $$k$$. (This is called the dual of $$V$$.) Addition of functions is given by $$(f+g)(x)=f(x)+g(x)$$, and scalar multiplication is given by $$(cf)(x)=c\cdot f(x)$$.
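The pointwise operations in this last example can be sketched in Python. Here, purely for illustration, $$V=k=\mathbb{R}$$ (modeled by floats/ints) and the sample functions are scalar multiples of the identity; `fadd` and `fsmul` are illustrative names:

```python
# Functions from V to k form a vector space under pointwise operations.
def fadd(f, g):
    """(f + g)(x) = f(x) + g(x)"""
    return lambda x: f(x) + g(x)

def fsmul(c, f):
    """(cf)(x) = c * f(x)"""
    return lambda x: c * f(x)

f = lambda x: 2 * x
g = lambda x: 3 * x
h = fadd(f, fsmul(3, g))   # h(x) = 2x + 3*(3x) = 11x
print(h(2))                # 22
```

The point is that the sum and scalar multiple are new functions, built from the old ones one input at a time.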

Exercise 9.0.1

1. For each of the above examples of vector spaces, write some example elements and give examples of addition and scalar multiplication in that vector space.
2. Prove that each of these examples is a vector space.

Some kinds of vector spaces only make sense with certain fields. Here's an example in the form of an exercise.

Exercise 9.0.2

Show that the set of continuous functions from $$\mathbb{R}$$ to $$\mathbb{R}$$ is a vector space over $$\mathbb{R}$$. (Be sure to explicitly identify what the operations of addition and scalar multiplication are.)

What extra condition would we need for a vector space $$V$$ over $$k$$ in order for the notion of continuous functions $$V\rightarrow k$$ to make sense?

## Contributors

• Tom Denton (Fields Institute/York University in Toronto)