# 4.1: Higher Order Differential Equations


Recall that the order of a differential equation is the order of the highest derivative that appears in the equation. So far we have studied first and second order differential equations. Now we will embark on the analysis of higher order differential equations. We will restrict our attention to linear differential equations.

### Introduction

The general linear differential equation can be written as

\[ L(y)=\dfrac{d^ny }{dt^n} + p_1(t)\dfrac{d^{n-1}y }{dt^{n-1}} + ... + p_{n-1}(t)\dfrac{dy }{dt} + p_{n}(t)y = g(t). \]

The good news is that all of the results for second order linear differential equations can be extended to higher order linear differential equations. We list the results without proof.

If \(p_1\), ..., \(p_n\) and \(g\) are continuous on an interval \([a,b]\), then there is a unique solution to the initial value problem, where instead of the two initial conditions \(y(0) = y_0\) and \(y'(0) = y'_0\), we need the \(n\) initial conditions

\[y(0) = y_0, y'(0)=y'_0, y''(0)=y_0'' , ... , y^{(n-1)}(0) = y^{(n-1)}_0 .\]
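The count of initial conditions can be checked with a computer algebra system. The following sketch (using sympy; the equation \(y''' = 0\) is an illustrative choice, not from the text) shows that a third order equation with three initial conditions pins down a unique solution:

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')

# A third-order equation needs three initial conditions
# y(0), y'(0), y''(0) for a unique solution.
sol = sp.dsolve(sp.Eq(y(t).diff(t, 3), 0), y(t),
                ics={y(0): 1,
                     y(t).diff(t).subs(t, 0): 2,
                     y(t).diff(t, 2).subs(t, 0): 3})
# The unique solution is 1 + 2t + (3/2)t^2.
print(sol.rhs)
```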

\(L(y)\) is a linear transformation, that is

\[ L(c_1y_1+ c_2y_2 + ... + c_ny_n) = c_1L(y_1) + c_2L(y_2) + ... + c_nL(y_n).\]

If \(y_1\), \(y_2\), ..., \(y_n\) are solutions to \(L(y) = 0\), then \(c_1y_1+ c_2y_2 + \, ...\, + c_ny_n\) is also a solution.
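Linearity is easy to verify symbolically. Here is a sketch (using sympy; the particular operator and functions below are illustrative choices, not from the text):

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# An illustrative third-order linear operator:
# L(y) = y''' + t*y'' + sin(t)*y' + y
def L(expr):
    return expr.diff(t, 3) + t*expr.diff(t, 2) + sp.sin(t)*expr.diff(t) + expr

y1, y2 = sp.exp(2*t), sp.cos(t)
# L(c1*y1 + c2*y2) - (c1*L(y1) + c2*L(y2)) simplifies to 0,
# confirming that L is a linear transformation.
print(sp.simplify(L(c1*y1 + c2*y2) - (c1*L(y1) + c2*L(y2))))
```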

Definition: The Wronskian in Higher Order Equations

For differentiable functions \(y_1\), \(y_2\), ..., \(y_n\), we define the *Wronskian* by

\[W(y_1,y_2,...,y_n) = \begin{vmatrix} y_1 & y_2 & \cdots & \cdots & y_n \\ y_1' & y_2' & \cdots & \cdots & y_n' \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ y_1^{(n-1)} & y_2^{(n-1)} &\cdots & \cdots & y_n^{(n-1)} \end{vmatrix} \]

Differentiable functions \(y_1\), \(y_2\), ..., \(y_n\) are *linearly independent* if the Wronskian is nonzero for some \(t\) in \([a,b]\).

\[L(y) = 0 \;\; \text{has $n$ linearly independent solutions.}\]

If \(y_h\) is the general solution to \(L(y) = 0\) and if \(y_p\) is a particular solution to \(L(y) = g(t)\), then \(y_h + y_p\) is the general solution to \(L(y) = g(t)\).

Abel's theorem still holds. That is, if \(y_1, y_2, \cdots, y_n\) are linearly independent solutions of \(L(y) = 0\), then

\[W(y_1,y_2,\cdots, y_n) = ce^{-\int p_1(t)\; dt}. \]

The method of reduction of order still works. If \(y_1\) is a solution to \(L(y)=0\), then substituting \(y = v\,y_1\) reduces \(L(y) = 0\) to a differential equation of order \(n-1\) in \(v'\).
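The reduction can be seen symbolically. Here is a sketch (using sympy; the equation below, whose characteristic polynomial is \((r-1)^3\), is an illustrative choice, not from the text):

```python
import sympy as sp

t = sp.symbols('t')
v = sp.Function('v')

# y1 = e^t solves y''' - 3y'' + 3y' - y = 0.
# Substitute y = v(t)*e^t and simplify.
y = v(t)*sp.exp(t)
expr = sp.simplify(y.diff(t, 3) - 3*y.diff(t, 2) + 3*y.diff(t) - y)

# The result is exp(t)*v''', so setting w = v' leaves
# an equation of order n - 1 = 2 in w.
print(expr)
```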

Example \(\PageIndex{1}\)

Determine whether the following functions are linearly independent.

- \( y_1 =e^{2x}\)
- \( y_2 =\sin{(3x)}\)
- \( y_3 =\cos x\)

**Solution**

First take derivatives

\[y_1' = 2e^{2x} \;\;\;\; y_2'=3\cos(3x) \;\;\;\; y_3' = -\sin(x) \\ y_1'' = 4e^{2x} \;\;\;\; y_2''=-9\sin(3x) \;\;\;\; y_3'' = -\cos(x) . \]

The Wronskian is

\[W(e^{2x}, \sin(3x), \cos x) = \begin{vmatrix} e^{2x} & \sin{(3x)} & \cos x \\ 2e^{2x} &3\cos(3x) &-\sin(x) \\4e^{2x} &-9\sin(3x) &-\cos(x) \end{vmatrix} \]

\[= e^{2x}(-3 \cos(3x) \cos x -9\sin(3x) \sin x) - \sin(3x)(-2e^{2x}\cos x + 4e^{2x}\sin x) + \cos x(-18e^{2x}\sin(3x)-12e^{2x} \cos(3x)). \]

Now plug in \(x = 0\) (or any other value for \(x\)) to get

\[(1)(-3 - 0) - (0)(-2 + 0) + (1)(0 - 12) = -15.\]

In particular, since this is a nonzero number, we can conclude that the three functions are linearly independent.
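The hand computation can be double-checked with a computer algebra system. A sketch using sympy (building the Wronskian matrix directly from the three functions):

```python
import sympy as sp

x = sp.symbols('x')
funcs = [sp.exp(2*x), sp.sin(3*x), sp.cos(x)]

# Row k of the Wronskian matrix holds the k-th derivatives.
W = sp.Matrix([[f.diff(x, k) for f in funcs] for k in range(3)])

# Evaluating the determinant at x = 0 gives -15,
# matching the hand computation above.
print(W.det().subs(x, 0))
```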

Example \(\PageIndex{2}\): Applying Abel's theorem

Use Abel's theorem to find the Wronskian of the differential equation

\[ ty^{(iv)} + 2 y''' - t e^t y'' + (t^3 - 4t)y' + t^2 \sin t \;y = 0.\]

**Solution**

We first divide by \(t\) to get

\[ y^{(iv)} + \frac{2}{t} y''' - e^t y'' + (t^2 - 4)y' + t \sin t \;y = 0 .\]

Now take the integral of \(p_1(t) = \dfrac{2}{t}\) to get

\[\int \frac{2}{t}\, dt = 2\ln t.\]

The Wronskian is thus

\[ c e^{-2\ln t} = \frac{c}{t^2} .\]
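The same computation can be carried out symbolically. A sketch using sympy (taking \(t > 0\) so that \(\ln t\) is defined):

```python
import sympy as sp

t, c = sp.symbols('t c', positive=True)

# Abel's theorem: W = c * exp(-∫ p1 dt), with p1 = 2/t
# after dividing the equation by t.
p1 = 2/t
W = c*sp.exp(-sp.integrate(p1, t))

# Simplifies to c/t**2.
print(sp.simplify(W))
```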

Larry Green (Lake Tahoe Community College)

Integrated by Justin Marshall.