
2.1: Second order linear ODEs

Let us consider the general second order linear differential equation

\[ A(x)y'' + B(x)y' + C(x)y = F(x).\]

We usually divide through by \(A(x)\) to get

\[ y'' + p(x)y' + q(x)y = f(x), \]

where \(p(x) = \frac{B(x)}{A(x)}\), \(q(x)=\frac{C(x)}{A(x)}\), and \(f(x) = \frac{F(x)}{A(x)} \). The word linear means that the equation contains no powers of \(y\), \(y'\), or \(y''\) other than the first power, and no other functions of them.
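
For instance, the equation

\[ x^2y'' + 2xy' - 6y = 0 \]

is linear. Dividing through by \(A(x) = x^2\) (on an interval where \(x \neq 0\)) gives

\[ y'' + \frac{2}{x}y' - \frac{6}{x^2}y = 0, \]

so here \(p(x) = \frac{2}{x}\), \(q(x) = -\frac{6}{x^2}\), and \(f(x) = 0\).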

In the special case when \(f(x)=0\) we have a so-called homogeneous equation

\[ y'' + p(x)y' + q(x)y = 0. \tag{2.1.3}\]

We have already seen some second order linear homogeneous equations (a quick check of the first one follows the list):

  1. \(y'' + k^2y = 0\) with two solutions \( y_1 = \cos(kx)\) and \(y_2=\sin(kx)\),
  2. \(y'' - k^2y = 0\) with two solutions \(y_1 = e^{kx}\) and \(y_2=e^{-kx}\).
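
For example, a direct check of the first equation: differentiating \(y_1 = \cos(kx)\) twice gives \(y_1'' = -k^2\cos(kx)\), so

\[ y_1'' + k^2 y_1 = -k^2\cos(kx) + k^2\cos(kx) = 0, \]

and similarly \(y_2'' + k^2 y_2 = -k^2\sin(kx) + k^2\sin(kx) = 0\). The second equation is checked the same way, using \(\left(e^{\pm kx}\right)'' = k^2 e^{\pm kx}\).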

If we know two solutions of a linear homogeneous equation, we know a lot more of them.

Theorem 2.1.1 (Superposition). Suppose \(y_1\) and \(y_2\) are two solutions of the homogeneous equation (2.1.3). Then

\[ y(x) = C_1y_1(x) + C_2y_2(x),\]

also solves (2.1.3) for arbitrary constants \(C_1\) and \(C_2\).

That is, we can add solutions together and multiply them by constants to obtain new and different solutions. We call the expression \( C_1y_1+C_2y_2\) a linear combination of \(y_1\) and \(y_2\). Let us prove this theorem; the proof is very enlightening and illustrates how linear equations work.

Proof.  Let \( y = C_1y_1 + C_2y_2\). Then

\[ y'' + py' + qy = (C_1y_1 + C_2y_2)'' + p(C_1y_1 + C_2y_2)' + q(C_1y_1 + C_2y_2)\]

\[ = C_1y''_1 + C_2y''_2 + C_1py'_1 + C_2py'_2 + C_1qy_1 + C_2qy_2\]

\[ = C_1(y''_1 + py'_1 + qy_1) + C_2(y''_2 + py'_2 + qy_2) \]

\[ = C_1 \cdot 0 + C_2 \cdot 0 = 0. \]

The proof becomes even simpler to state if we use operator notation. An operator is an object that eats functions and spits out functions (kind of like a function, which eats numbers and spits out numbers). Define the operator \(L\) by

\[Ly=y''+py'+qy.\]

The differential equation now becomes \(Ly=0\). The operator (and the equation) \(L\) being linear means that \( L(C_1y_1 + C_2y_2) = C_1Ly_1 + C_2Ly_2\). The proof above becomes

\[Ly = L(C_1y_1 + C_2y_2) = C_1Ly_1 + C_2Ly_2 = C_1 \cdot 0 + C_2 \cdot 0 = 0.\]
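
For instance, take \(Ly = y'' + k^2y\). We checked above that \(L\bigl(\cos(kx)\bigr) = 0\) and \(L\bigl(\sin(kx)\bigr) = 0\), so linearity gives

\[ L\bigl(C_1\cos(kx) + C_2\sin(kx)\bigr) = C_1 \cdot 0 + C_2 \cdot 0 = 0, \]

and every linear combination of \(\cos(kx)\) and \(\sin(kx)\) solves \(y'' + k^2y = 0\).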

Two different solutions to the second equation \(y'' - k^2y = 0 \) are \(y_1 = \cosh(kx)\) and \(y_2 = \sinh (kx) \). Let us remind ourselves of the definitions \( \cosh x = \frac {e^x + e^{-x}}{2}\) and \( \sinh x = \frac {e^x - e^{-x}}{2} \). These are linear combinations of the two exponential solutions, so by superposition they are solutions as well.
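
Explicitly,

\[ \cosh(kx) = \tfrac{1}{2}e^{kx} + \tfrac{1}{2}e^{-kx} \qquad \text{and} \qquad \sinh(kx) = \tfrac{1}{2}e^{kx} - \tfrac{1}{2}e^{-kx}, \]

that is, linear combinations with \(C_1 = \tfrac{1}{2}\) and \(C_2 = \pm\tfrac{1}{2}\), so Theorem 2.1.1 applies with \(y_1 = e^{kx}\) and \(y_2 = e^{-kx}\).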

The functions \(\sinh\) and \(\cosh\) are sometimes more convenient to use than the exponential. Let us review some of their properties.

\[ \cosh 0 = 1, \qquad \sinh 0 = 0, \]

\[ \frac{d}{dx} \cosh x = \sinh x, \qquad \frac{d}{dx} \sinh x = \cosh x, \]

\[ \cosh^2 x - \sinh^2 x = 1. \]

Exercise \(\PageIndex{1}\):

Derive these properties using the definitions of \( \sinh\) and \( \cosh\) in terms of exponentials.

Linear equations have nice and simple answers to the existence and uniqueness question.

Theorem 2.1.2 (Existence and uniqueness). Suppose \(p(x)\), \(q(x)\), and \(f(x)\) are continuous functions on some interval \(I\) containing \(a\), and that \(b_0\) and \(b_1\) are constants. The equation

\[y'' + p(x)y' + q(x)y = f(x)\]

has exactly one solution \(y(x)\) defined on the same interval \(I\) satisfying the initial conditions

\( y(a) = b_0\)  and  \(y'(a) = b_1\).

 For example, the equation \( y'' + k^2y = 0 \) with \( y(0) = b_0 \) and \(y'(0) = b_1\) has the solution

\[ y(x) = b_0 \cos (kx) + \frac {b_1}{k} \sin (kx) \]

The equation \( y'' - k^2y = 0 \) with \( y(0) = b_0 \) and \( y'(0) = b_1\) has the solution

\[ y(x) = b_0 \cosh (kx) + \frac {b_1}{k} \sinh (kx) \]

Using \( \cosh\) and \( \sinh\) in this solution allows us to solve for the initial conditions in a cleaner way than if we had used the exponentials.
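
To see why, impose the initial conditions on both forms of the general solution. With \(y = C_1\cosh(kx) + C_2\sinh(kx)\) we get \(y(0) = C_1 = b_0\) and \(y'(0) = kC_2 = b_1\) immediately, so \(C_1 = b_0\) and \(C_2 = \frac{b_1}{k}\). With \(y = C_1e^{kx} + C_2e^{-kx}\) we would instead have to solve the system

\[ C_1 + C_2 = b_0, \qquad kC_1 - kC_2 = b_1, \]

for \(C_1\) and \(C_2\).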

 The initial conditions for a second order ODE consist of two equations. Common sense tells us that if we have two arbitrary constants and two equations, then we should be able to solve for the constants and find a solution to the differential equation satisfying the initial conditions.

Question: Suppose we find two different solutions \(y_1\) and \(y_2\) to the homogeneous equation (2.1.3). Can every solution be written (using superposition) in the form \( y = C_1y_1 + C_2y_2\)?

The answer is affirmative, provided that \(y_1\) and \(y_2\) are different enough in the following sense. We say that \(y_1\) and \( y_2\) are linearly independent if one is not a constant multiple of the other.

Theorem 2.1.3.  Let \(p(x)\) and \(q(x)\) be continuous functions and let \(y_1\) and \(y_2\) be two linearly independent solutions to the homogeneous equation (2.1.3). Then every other solution is of the form

\[y=C_1y_1 + C_2y_2.\]

Theorem 2.1.3 basically says that the general solution of the ODE is \(y=C_1y_1 + C_2y_2\). For example, we found the solutions \( y_1 = \sin x\) and \( y_2 = \cos x \) for the equation \( y'' + y = 0 \). It is not hard to see that sine and cosine are not constant multiples of each other. If \( \sin x = A \cos x \) for some constant \(A\), then plugging in \(x=0\) would imply \(A=0\). But then \( \sin x = 0 \) for all \( x\), which is preposterous. So \(y_1\) and \(y_2\) are linearly independent and \[ y= C_1\cos x + C_2\sin x\] is the general solution to \(y'' + y =0\).
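
For example, to solve the initial value problem \(y'' + y = 0\), \(y(0) = 2\), \(y'(0) = 3\), we start from the general solution \(y = C_1\cos x + C_2\sin x\) and impose the initial conditions:

\[ y(0) = C_1 = 2, \qquad y'(0) = C_2 = 3, \]

so \(y(x) = 2\cos x + 3\sin x\) is the solution, and by Theorem 2.1.2 it is the only one.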

We will study the solution of nonhomogeneous equations in § 2.5. We will first focus on finding general solutions to homogeneous equations.
