14.5: Higher order linear ODEs
Equations that appear in applications tend to be second order, although higher order equations do appear from time to time. Hence, it is generally assumed that the world is “second order” from a modern physics perspective. The basic results about linear ODEs of higher order are essentially the same as for second order equations, with 2 replaced by \(n\). The important concept of linear independence is somewhat more complicated when more than two functions are involved.
For higher order constant coefficient ODEs, the methods are also somewhat harder to apply, but we will not dwell on these complications. We can always use the methods for systems of linear equations to solve higher order constant coefficient equations. So let us start with a general homogeneous linear equation:
\[ y^{(n)} + p_{n-1}(x)y^{(n-1)} + \cdots + p_1(x)y' + p_0(x)y = 0 \label{2.3.1} \]
Theorem \(\PageIndex{1}\)
Superposition
Suppose \(y_1, y_2, \dots , y_n\) are solutions of the homogeneous equation (Equation \ref{2.3.1}) . Then
\[ y(x) = C_1 y_1(x) + C_2 y_2(x) + \cdots + C_n y_n(x) \nonumber \]
also solves Equation \ref{2.3.1} for arbitrary constants \( C_1, \dots , C_n \).
In other words, a linear combination of solutions to Equation \ref{2.3.1} is also a solution to Equation \ref{2.3.1}. We also have the existence and uniqueness theorem for nonhomogeneous linear equations.
Theorem \(\PageIndex{2}\)
Existence and Uniqueness
Suppose \( p_0\) through \( p_{n-1}\), and \(f\) are continuous functions on some interval \(I\), \(a\) is a number in \(I\), and \( b_0, b_1, \dots , b_{n-1} \) are constants. The equation
\[ y^{(n)} + p_{n-1}(x)y^{(n-1)} + \cdots + p_1(x)y' + p_0(x)y = f(x) \nonumber \]
has exactly one solution \( y(x) \) defined on the same interval \(I\) satisfying the initial conditions
\[ y(a) = b_0, \quad y'(a) = b_1,\quad \dots ,\quad y^{(n -1)} (a) = b_{n - 1} \nonumber \]
Linear Independence
When we had two functions \(y_1\) and \(y_2\), we said they were linearly independent if one was not a constant multiple of the other. The same idea holds for \(n\) functions, in which case it is easier to state as follows. The functions \(y_1, y_2, \dots , y_n\) are linearly independent if the equation
\[ c_1y_1 + c_2y_2 + \dots + c_ny_n = 0 \nonumber \]
has only the trivial solution \( c_1 = c_2 = \dots = c_n = 0 \), where the equation must hold for all \(x\). If we can solve the equation with some constants not all zero, where for example \(c_1 \ne 0 \), then we can solve for \(y_1\) as a linear combination of the others. If the functions are not linearly independent, they are linearly dependent.
Example \(\PageIndex{1}\)
Show that \( e^x\), \(e^{2x}\), and \(e^{3x}\) are linearly independent functions.
Solution
Let us give several ways to show this fact. Many textbooks introduce Wronskians, but that is really not necessary to solve this example. Let us write down
\[ c_1e^x + c_2e^{2x} + c_3e^{3x} = 0 \nonumber \]
We use rules of exponentials and write \( z = e^x \). Then we have
\[ c_1z + c_2z^2 + c_3z^3 = 0 \nonumber \]
The left hand side is a third degree polynomial in \(z\). It is either identically zero, or it has at most 3 zeros. The equation must hold for all \(x\), hence for infinitely many values of \( z = e^x > 0 \). Therefore, the polynomial is identically zero, \(c_1 = c_2 = c_3 = 0 \), and the functions are linearly independent.
Let us try another way. As before we write
\[ c_1e^x + c_2e^{2x} + c_3e^{3x} = 0 \nonumber \]
This equation has to hold for all \(x\). What we could do is divide through by \(e^{3x}\) to get
\[ c_1e^{-2x} + c_2e^{-x} + c_3 = 0 \nonumber \]
As the equation is true for all \(x\), let \( x \rightarrow \infty \). After taking the limit we see that \( c_3 = 0 \). Hence our equation becomes
\[ c_1e^x + c_2e^{2x} = 0 \nonumber \]
Rinse, repeat!
How about yet another way. We again write
\[ c_1e^x + c_2e^{2x} + c_3e^{3x} = 0 \nonumber \]
We can evaluate the equation and its derivatives at different values of \(x\) to obtain equations for \(c_1\), \(c_2\), and \(c_3\). Let us first divide by \(e^x \) for simplicity.
\[ c_1 + c_2e^x + c_3e^{2x} = 0 \nonumber \]
We set \( x = 0 \) to get the equation \(c_1 + c_2 + c_3 = 0 \). Now differentiate both sides
\[ c_2 e^x + 2c_3e^{2x} = 0 \nonumber \]
We set \( x = 0 \) to get \(c_2 + 2c_3 = 0 \). We divide by \(e^x\) again and differentiate to get \( 2c_3e^x = 0 \). It is clear that \(c_3\) is zero. Then \(c_2\) must be zero as \(c_2 = -2c_3 \), and \(c_1\) must be zero because \( c_1 + c_2 + c_3 = 0 \).
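The last method can be mechanized. If we differentiate directly (without the intermediate divisions) and evaluate at \(x = 0\), the \(k^{th}\) derivative of \( c_1e^x + c_2e^{2x} + c_3e^{3x} \) gives the equation \( c_1 + 2^k c_2 + 3^k c_3 = 0 \). A minimal sketch using NumPy (assumed available) checks that the resulting \(3 \times 3\) system has only the trivial solution:

```python
import numpy as np

# Rows come from evaluating the combination and its first two
# derivatives at x = 0: the k-th derivative gives [1**k, 2**k, 3**k].
A = np.array([[1, 1, 1],
              [1, 2, 3],
              [1, 4, 9]], dtype=float)

# A nonzero determinant means A c = 0 forces c = 0, so the three
# exponentials are linearly independent.
print(np.linalg.det(A))  # approximately 2.0 (a Vandermonde determinant)
```

The matrix is a Vandermonde matrix in the exponents 1, 2, 3, which is invertible precisely because the exponents are distinct.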
There is no one best way to do it. All of these methods are perfectly valid. The important thing is to understand why the functions are linearly independent.
Example \(\PageIndex{2}\)
On the other hand, the functions \( e^x\), \(e^{-x}\), and \( \cosh x \) are linearly dependent. Simply apply the definition of the hyperbolic cosine:
\[ \cosh x = \frac {e^x + e^{-x}}{2} \quad\text{or}\quad 2 \cosh x - e^x - e^{-x} = 0 \nonumber \]
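A quick numerical spot-check of this dependence relation, sketched in Python with the standard library:

```python
import math

# The relation 2*cosh(x) - e^x - e^(-x) = 0 holds identically,
# so the residual should vanish (up to rounding) at every sample point.
for x in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    residual = 2 * math.cosh(x) - math.exp(x) - math.exp(-x)
    assert abs(residual) < 1e-12
```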
Constant Coefficient Higher Order ODEs
When we have a higher order constant coefficient homogeneous linear equation, the song and dance is exactly the same as it was for second order. We just need to find more solutions. If the equation is \( n^{th} \) order we need to find \(n\) linearly independent solutions. It is best seen by example.
Example \(\PageIndex{3}\): Third order ODE with Constant Coefficients
Find the general solution to
\[ \label{eq:15}y''' - 3y'' - y' + 3y =0 \]
Solution
Try: \( y = e^{rx} \). We plug in and get
\[\underbrace{r^3 e^{rx}}_{y'''} - 3 \underbrace{r^2 e^{rx}}_{y''} - \underbrace{r e^{rx}}_{y'} + 3 \underbrace{e^{rx}}_{y} = 0 . \nonumber \]
We divide through by \(e^{rx}\). Then
\[ r^3 - 3r^2 - r +3 =0 \nonumber \]
The trick now is to find the roots. There are formulas for the roots of degree 3 and 4 polynomials, but they are very complicated. There is no such formula for polynomials of degree 5 and higher. That does not mean that the roots do not exist. There are always \( n\) roots for an \( n^{th}\) degree polynomial. They may be repeated and they may be complex. Computers are pretty good at finding roots approximately for reasonably sized polynomials.
A good place to start is to plot the polynomial and check where it is zero. We can also simply try plugging in. We just start plugging in numbers \( r = -2, -1, 0, 1, 2, \dots \) and see if we get a hit (we can also try complex numbers). Even if we do not get a hit, we may get an indication of where the root is. For example, we plug \(r = -2\) into our polynomial and get -15; we plug in \( r = 0\) and get 3. That means there is a root between \(r = -2\) and \(r = 0 \), because the sign changed. If we find one root, say \(r_1\), then we know \( (r - r_1) \) is a factor of our polynomial. Polynomial long division can then be used.
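Asking a computer for approximate roots, as mentioned above, can be sketched with NumPy (assumed available):

```python
import numpy as np

# Approximate roots of r^3 - 3r^2 - r + 3,
# with coefficients listed from the highest power down.
roots = np.roots([1, -3, -1, 3])
print(sorted(roots.real))  # approximately [-1, 1, 3]
```

The answers are floating point approximations, so an "exact" root like \(3\) may come back as something like \(3.0000000000000004\); for exact factoring one still falls back on the hand methods above or a computer algebra system.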
A good strategy is to begin with \( r = -1\), \(1\), or \(0\). These are easy to compute. Our polynomial happens to have two such roots, \(r_1 = -1\) and \(r_2 = 1 \). There should be three roots and the last root is reasonably easy to find. The constant term in a monic \(^{1}\) polynomial such as this is the product of the negations of all the roots because \( r^3 - 3r^2 - r + 3 = (r - r_1)(r - r_2)(r - r_3)\). So
\[ 3 = (-r_1)(-r_2)(-r_3) = (1)(-1)(-r_3) = r_3 \nonumber \]
You should check that \(r_3 = 3\) really is a root. Hence we know that \(e^{-x}\), \(e^{x}\), and \(e^{3x} \) are solutions to \(\eqref{eq:15}\). They are linearly independent as can easily be checked, and there are three of them, which happens to be exactly the number we need. Hence the general solution is
\[ y = C_1e^{-x} + C_2e^{x} + C_3e^{3x} \nonumber \]
Suppose we were given some initial conditions \( y(0) = 1, y'(0) = 2\), and \(y''(0) = 3 \). Then
\[\begin{align}\begin{aligned} 1 &= y(0) = C_1 + C_2 + C_3 \\ 2 &= y'(0) = -C_1 + C_2 + 3C_3 \\ 3 &= y''(0) = C_1 + C_2 + 9C_3 \end{aligned}\end{align} \nonumber \]
It is possible to find the solution by high school algebra, but it would be a pain. The sensible way to solve a system of equations such as this is to use matrix algebra, see Section 3.2 or Appendix A. For now we note that the solution is \( C_1 = - \frac {1}{4}\), \(C_2 = 1\), and \(C_3 = \frac {1}{4} \). The specific solution to the ODE is
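For instance, the system of three equations above can be handed to NumPy (assumed available); this is a sketch of the matrix-algebra route rather than the hand computation:

```python
import numpy as np

# Coefficient matrix from the three initial conditions:
A = np.array([[ 1.0, 1.0, 1.0],   # y(0)   =  C1 + C2 +  C3
              [-1.0, 1.0, 3.0],   # y'(0)  = -C1 + C2 + 3*C3
              [ 1.0, 1.0, 9.0]])  # y''(0) =  C1 + C2 + 9*C3
b = np.array([1.0, 2.0, 3.0])

C = np.linalg.solve(A, b)
print(C)  # approximately [-0.25, 1.0, 0.25]
```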
\[ y = - \frac {1}{4} e^{-x} + e^{x} + \frac {1}{4} e^{3x} \nonumber \]
Next, suppose that we have real roots, but they are repeated. Let us say we have a root \(r\) repeated \(k\) times. In the spirit of the second order solution, and for the same reasons, we have the solutions
\[ e^{rx}, xe^{rx}, x^2e^{rx}, \dots , x^{k-1}e^{rx} \nonumber \]
We take a linear combination of these solutions to find the general solution.
Example \(\PageIndex{4}\)
Solve
\[ y^{(4)} - 3y''' + 3y'' - y' = 0 \nonumber \]
Solution
We note that the characteristic equation is
\[ r^4 - 3r^3 + 3r^2 - r = 0 \nonumber \]
By inspection we note that \( r^4 - 3r^3 + 3r^2 - r = r{(r - 1)}^3 \). Hence the roots given with multiplicity are \( r = 0, 1, 1, 1 \). Thus the general solution is
\[ y = \underbrace { (C_1 + C_2x + C_3x^2)e^x}_{\text {terms coming from } r = 1} + \underbrace { C_4}_{ \text {from } r = 0 } \nonumber \]
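The factoring by inspection can be checked with a computer algebra system. A sketch with SymPy (assumed available), whose `roots` function reports each root together with its multiplicity:

```python
import sympy as sp

r = sp.symbols('r')
poly = r**4 - 3*r**3 + 3*r**2 - r

# Factor the characteristic polynomial and read off the roots
# with their multiplicities.
print(sp.factor(poly))  # r*(r - 1)**3
print(sp.roots(poly))   # {0: 1, 1: 3}, i.e. root 0 once and root 1 three times
```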
The case of complex roots is similar to second order equations. Complex roots always come in pairs \( r = \alpha \pm i\beta \). Suppose we have two such complex roots, each repeated \(k\) times. The corresponding solution is
\[ (C_0 + C_1x + \dots + C_{k-1} x^{k-1})e^{\alpha x} \cos ( \beta x) + ( D_0 + D_1x + \dots + D_{k - 1}x^{k - 1} ) e^{\alpha x} \sin ( \beta x) \nonumber \]
where \( C_0, \dots , C_{k-1} , D_0, \dots, D_{k-1} \) are arbitrary constants.
Example \(\PageIndex{5}\)
Solve
\[ y^{(4)} - 4y''' + 8y'' - 8y' + 4y = 0 \nonumber \]
Solution
The characteristic equation is
\[\begin{align}\begin{aligned} r^4 - 4r^3 + 8r^2 - 8r + 4 &= 0 \\ {(r^2 - 2r + 2)}^2 &= 0 \\ {({( r - 1)}^2 + 1 )}^2 &= 0 \end{aligned}\end{align} \nonumber \]
Hence the roots are \( 1 \pm i\), both with multiplicity 2. Hence the general solution to the ODE is
\[ y = ( C_1 + C_2x)e^x \cos x + ( C_3 + C_4 x ) e^x \sin x \nonumber \]
The way we solved the characteristic equation above is really by guessing or by inspection. It is not so easy in general. We could also have asked a computer or an advanced calculator for the roots.
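Asking a computer can again be sketched with SymPy (assumed available); `roots` handles repeated complex roots and reports their multiplicities:

```python
import sympy as sp

r = sp.symbols('r')
poly = r**4 - 4*r**3 + 8*r**2 - 8*r + 4

# Each complex root 1 + i and 1 - i appears with multiplicity 2,
# matching the factorization ((r - 1)^2 + 1)^2.
print(sp.roots(poly))  # {1 - I: 2, 1 + I: 2}
```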
Footnotes
[1] The word monic means that the coefficient of the top degree \(r^{d}\), in our case \(r^{3}\), is \(1\).
Outside Links
- After reading this lecture, it may be good to try Project III from the IODE website: www.math.uiuc.edu/iode/.