
# 5.4: Theory of Systems of Differential Equations


It turns out that the theory of systems of linear differential equations resembles the theory of higher order differential equations. This discussion will adopt the following notation. Consider the system of differential equations

\begin{align*} x_1' &= p_{11}(t)x_1+\dots+ p_{1n}(t)x_n+ g_1(t) \\ &\;\,\vdots \\ x_n' &= p_{n1}(t)x_1+\dots+ p_{nn}(t)x_n+ g_n(t). \end{align*}

We write this system as

$\textbf{x}' = \textbf{P}(t)\textbf{x} + \textbf{g}(t).$

A vector $$\textbf{x} = \textbf{f}(t)$$ is a solution of the system of differential equations if

$\textbf{f}'=\textbf{P}(t)\textbf{f}+\textbf{g}(t).$

If $$\textbf{g}(t) = 0$$ the system of differential equations is called homogeneous. Otherwise, it is called nonhomogeneous.

Theorem: The Solution Space is a Vector Space

Suppose that $$\textbf{x}^{(1)}$$, $$\textbf{x}^{(2)}$$, ... , $$\textbf{x}^{(k)}$$ are solutions to the homogeneous system of differential equations

$\textbf{x}' = \textbf{P}(t)\textbf{x}$

then

$c_1\textbf{x}^{(1)} + c_2\textbf{x}^{(2)} + \, ...\, + c_k\textbf{x}^{(k)}$

is also a solution for any constants $$c_1$$, $$c_2$$, ... , $$c_k$$.
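As a quick sketch of this superposition principle, the check below uses sympy with an illustrative $$2 \times 2$$ system (the matrix `P` and the two solutions are hypothetical choices, picked so the algebra works out cleanly): the derivative of any linear combination minus $$\textbf{P}\textbf{x}$$ simplifies to the zero vector.

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# Illustrative homogeneous system x' = P x (hypothetical matrix chosen
# so that the two vectors below really are solutions).
P = sp.Matrix([[1, 0], [0, -1]])

x1 = sp.Matrix([sp.exp(t), sp.exp(-t)])       # one solution
x2 = sp.Matrix([2*sp.exp(t), 3*sp.exp(-t)])   # another solution

# Superposition: any linear combination is again a solution.
y = c1*x1 + c2*x2
residual = sp.simplify(sp.diff(y, t) - P*y)
print(residual)  # the zero vector
```

Because the constants $$c_1, c_2$$ stay symbolic, the zero residual confirms the claim for every choice of constants at once.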

Just as we had the Wronskian for higher order linear differential equations, we can define a similar beast for systems of linear differential equations.

If

$\textbf{x}^{(1)}, \textbf{x}^{(2)},\dots, \textbf{x}^{(n)}$

are $$n$$ solutions of an $$n \times n$$ system, then the Wronskian of this set is the determinant of the matrix whose $$i^{th}$$ column is $$\textbf{x}^{(i)}$$.

Example $$\PageIndex{1}$$

Let

$\textbf{x}^{(1)}=\begin{pmatrix} e^t \\ e^{-t} \end{pmatrix}, \;\;\; \textbf{x}^{(2)}=\begin{pmatrix} 2e^{t} \\ 3e^{-t} \end{pmatrix}. \nonumber$

Then

$W(t)=\begin{vmatrix} e^t &2e^t \\ e^{-t} &3e^{-t} \end{vmatrix} = 3e^te^{-t}-2e^te^{-t} = 3-2 =1 . \nonumber$
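This determinant is easy to confirm with sympy: build the matrix whose columns are the two solutions and take its determinant.

```python
import sympy as sp

t = sp.Symbol('t')

# Columns are the two solutions from the example.
X = sp.Matrix([[sp.exp(t), 2*sp.exp(t)],
               [sp.exp(-t), 3*sp.exp(-t)]])

W = sp.simplify(X.det())
print(W)  # 1
```

Since $$W(t) = 1 \neq 0$$ for all $$t$$, the two solutions are linearly independent.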

It is a direct consequence of linear algebra that the solutions are linearly independent if and only if the Wronskian is nonzero. In fact, more is true. There is a generalization of Abel's Theorem for systems of linear differential equations.

$\dfrac{dW}{dt} = (p_{11} + p_{22} + \dots + p_{nn})W$
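Abel's formula can be checked symbolically on the example above. The coefficient matrix `P` used here is an assumed reconstruction consistent with the two given solutions (one can verify $$\textbf{x}' = \textbf{P}\textbf{x}$$ holds for both columns); since its trace is zero, the formula predicts a constant Wronskian, matching $$W(t) = 1$$.

```python
import sympy as sp

t = sp.Symbol('t')

# Assumed coefficient matrix consistent with the example's solutions.
P = sp.Matrix([[1, 0], [0, -1]])

X = sp.Matrix([[sp.exp(t), 2*sp.exp(t)],
               [sp.exp(-t), 3*sp.exp(-t)]])
W = sp.simplify(X.det())

# Abel's theorem: dW/dt = trace(P) * W.  Here trace(P) = 0, so W is constant.
lhs = sp.diff(W, t)
rhs = P.trace() * W
print(sp.simplify(lhs - rhs))  # 0
```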

The main theorem on uniqueness and existence of solutions of systems of differential equations also holds true. We state it below.

Theorem: Existence and Uniqueness for Systems

Let

$\textbf{x}' = \textbf{P}(t)\textbf{x}$

be a system of differential equations with $$p_{ij}$$ continuous for all $$i$$ and $$j$$ on the interval $$a < t < b$$. Then there exist $$n$$ linearly independent solutions. If the initial value

$\textbf{x}(t_0) = \textbf{x}_0$

is given at some $$t_0$$ in $$(a,b)$$, then there exists a unique solution satisfying it on the interval $$(a,b)$$.

In particular, if

$\textbf{x}^{(1)}, \textbf{x}^{(2)},\dots, \textbf{x}^{(n)}$

are solutions of the homogeneous system, and if the Wronskian is nonzero, then

$\textbf{y}=c_1\textbf{x}^{(1)} + c_2\textbf{x}^{(2)}+\dots+c_n\textbf{x}^{(n)}$

is the general solution to the system. We call $$\textbf{x}^{(1)}, \textbf{x}^{(2)},\dots, \textbf{x}^{(n)}$$ a fundamental set of solutions to the system of differential equations.
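To see the general solution in action, one can match a given initial condition by solving a plain linear system for the constants. The sketch below reuses the fundamental set from the example; the initial vector $$\textbf{x}_0$$ is an illustrative choice.

```python
import sympy as sp

t = sp.Symbol('t')

# Fundamental matrix built from the two solutions in the example above.
X = sp.Matrix([[sp.exp(t), 2*sp.exp(t)],
               [sp.exp(-t), 3*sp.exp(-t)]])

# To match an initial condition x(0) = x0, solve X(0) c = x0 for the
# constants c1, c2 (x0 here is an illustrative choice).
x0 = sp.Matrix([1, 0])
c = X.subs(t, 0).inv() * x0
print(c)  # Matrix([[3], [-1]])

# The particular solution is then y(t) = X(t) c.
y = X * c
```

Because the Wronskian is nonzero, $$X(0)$$ is invertible and the constants are uniquely determined, which is exactly the uniqueness statement of the theorem.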

In particular, if the matrix of solutions at $$t_0$$ is the identity matrix, then its determinant, the Wronskian $$W(t_0)$$, is one, hence nonzero. This gives us the following theorem.

Theorem

Let

$e^{(1)}=\begin{pmatrix} 1\\0\\0\\ \vdots \\ 0 \end{pmatrix}, e^{(2)}=\begin{pmatrix} 0\\1\\0\\ \vdots \\ 0 \end{pmatrix}, \dots, e^{(n)}=\begin{pmatrix} 0\\0\\0\\ \vdots \\ 1 \end{pmatrix}.$

If $$\textbf{x}^{(1)}, \textbf{x}^{(2)},\dots, \textbf{x}^{(n)}$$ are solutions to the homogeneous system of differential equations satisfying the conditions

$\textbf{x}^{(1)}(t_0)=e^{(1)}, \textbf{x}^{(2)}(t_0)=e^{(2)},\dots,\textbf{x}^{(n)}(t_0)=e^{(n)}$

then they form a fundamental set of solutions.
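For a constant-coefficient system, one way to produce such a fundamental set is the matrix exponential: $$\Phi(t) = e^{tP}$$ satisfies $$\Phi(0) = I$$, so its columns meet the conditions above with $$t_0 = 0$$. A short sympy sketch, again with an illustrative matrix:

```python
import sympy as sp

t = sp.Symbol('t')

# Illustrative constant-coefficient system x' = P x.  Its matrix
# exponential Phi(t) = exp(t*P) has Phi(0) = I, so the columns of Phi
# form a fundamental set satisfying x^(i)(0) = e^(i).
P = sp.Matrix([[1, 0], [0, -1]])
Phi = (t * P).exp()            # matrix exponential

print(Phi.subs(t, 0))          # identity matrix

# Each column really solves the system:
for i in range(2):
    col = Phi.col(i)
    assert sp.simplify(sp.diff(col, t) - P*col) == sp.zeros(2, 1)
```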