# 2.5: Nonhomogeneous Equations


## 2.5.1 Solving Nonhomogeneous Equations

We have solved linear constant coefficient homogeneous equations. What about nonhomogeneous linear ODEs, such as the equations for forced mechanical vibrations? That is, suppose we have an equation such as

\[ y'' + 5y' + 6y = 2x + 1 \label{2.5.1}\]

We will write \( Ly = 2x + 1\) when the exact form of the operator is not important. We solve (Equation \ref{2.5.1}) in the following manner. First, we find the general solution \(y_c\) to the associated homogeneous equation

\[ y'' + 5y' + 6y = 0 \label{2.5.2}\]

We call \(y_c\) the *complementary solution*. Next, we find a single particular solution \(y_p\) to (2.5.1) in some way. Then

\[ y = y_c + y_p \]

is the general solution to (2.5.1). We have \( Ly_c = 0 \) and \(Ly_p = 2x +1 \). As \(L\) is a linear operator, we verify that \(y\) is a solution: \(Ly = L(y_c + y_p) = Ly_c + Ly_p = 0 + ( 2x + 1) = 2x + 1 \). Let us see why we obtain the general solution.

Let \(y_p\) and \( {\tilde {y}}_p\) be two different particular solutions to (2.5.1). Write the difference as \(w = y_p - {\tilde {y}}_p\). Then plug \(w\) into the left hand side of the equation to get

\[ w'' + 5w' + 6w = ( y''_p + 5y'_p + 6y_p ) - (\tilde {y}_p'' +5 \tilde {y}_p' +6 \tilde {y}_p)= ( 2x + 1) - (2x + 1) = 0 \]

Using the operator notation the calculation becomes simpler. As \(L\) is a linear operator we write

\[ Lw = L (y_p - {\tilde {y}}_p ) = L y_p - L {\tilde {y}}_p = (2x + 1) - ( 2x + 1) = 0 \]

So \(w = y_p - {\tilde {y}}_p \) is a solution to (2.5.2), that is \( Lw = 0 \). Any two solutions of (2.5.1) differ by a solution to the homogeneous equation (2.5.2). The solution \( y = y_c + y_p \) includes all solutions to (2.5.1), since \(y_c\) is the general solution to the associated homogeneous equation.

Theorem 2.5.1

*Let \( Ly = f(x) \) be a linear ODE (not necessarily constant coefficient). Let \(y_c\) be the **complementary solution** (the general solution to the associated homogeneous equation \(Ly = 0 \)) and let \(y_p\) be any particular solution to \( Ly = f(x) \). Then the general solution to \(Ly = f(x) \) is*

\[ y = y_c + y_p . \]

The moral of the story is that we can find the particular solution in any old way. If we find a different particular solution (by a different method, or simply by guessing), then we still get the same general solution. The formula may look different, and the constants we will have to choose to satisfy the initial conditions may be different, but it is the same solution.

### 2.5.2 Undetermined Coefficients

The trick is to somehow, in a smart way, guess one particular solution to (2.5.1). Note that \( 2x + 1 \) is a polynomial, and the left hand side of the equation will be a polynomial if we let \(y\) be a polynomial of the same degree. Let us try

\[ y_p = Ax + B \]

We plug in to obtain

\[ y_p'' + 5y_p' + 6y_p = (Ax + B)'' + 5( Ax + B)' + 6(Ax + B) = 0 + 5A + 6Ax + 6B = 6Ax + ( 5A + 6B) \]

So \( 6Ax + (5A + 6B) = 2x + 1 \). Therefore, \(A = \dfrac {1}{3} \) and \( B = - \dfrac {1}{9} \). That means \( y_p = \dfrac {1}{3} x - \dfrac {1}{9} = \dfrac {3x - 1}{9} \). Solving the complementary problem (exercise!) we get

\[ y_c = C_1e^{-2x} + C_2 e^{-3x} \]

Hence the general solution to (2.5.1) is

\[ y = C_1 e^{-2x} + C_2 e^{-3x} + \dfrac {3x - 1}{9} \]
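Though not part of the derivation, this general solution can be checked symbolically. Here is a minimal sketch using Python's SymPy library (assuming SymPy is installed):

```python
import sympy as sp

x, C1, C2 = sp.symbols('x C1 C2')

# The claimed general solution: complementary part plus particular part.
y = C1*sp.exp(-2*x) + C2*sp.exp(-3*x) + (3*x - 1)/9

# Plug it into the left hand side y'' + 5y' + 6y.
lhs = sp.diff(y, x, 2) + 5*sp.diff(y, x) + 6*y
print(sp.simplify(lhs))  # 2*x + 1, for any C1 and C2
```

The exponential terms cancel regardless of \(C_1\) and \(C_2\), confirming that only the particular part contributes to the right hand side.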

Now suppose we are further given some initial conditions. For example, \(y(0) = 0 \) and \(y' (0) = \dfrac {1}{3} \). First find \( y' = -2C_1e^{-2x} - 3C_2e^{-3x} + \dfrac {1}{3} \). Then

\[ 0 = y(0) = C_1 + C_2 - \dfrac {1}{9}, \qquad \dfrac {1}{3} = y'(0) = -2C_1 - 3C_2 + \dfrac {1}{3} \]

We solve to get \( C_1 = \dfrac {1}{3} \) and \( C_2 = - \dfrac {2}{9} \). The particular solution we want is

\[ y(x) = \dfrac {1}{3} e^{-2x} - \dfrac {2}{9} e^{-3x} + \dfrac {3x - 1}{9} = \dfrac {3e^{-2x} - 2e^{-3x} + 3x - 1}{9} \]
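The constants can also be found symbolically; a SymPy sketch of the same computation:

```python
import sympy as sp

x, C1, C2 = sp.symbols('x C1 C2')
y = C1*sp.exp(-2*x) + C2*sp.exp(-3*x) + (3*x - 1)/9

# Impose y(0) = 0 and y'(0) = 1/3 and solve for the constants.
consts = sp.solve([sp.Eq(y.subs(x, 0), 0),
                   sp.Eq(sp.diff(y, x).subs(x, 0), sp.Rational(1, 3))],
                  [C1, C2])
print(consts)  # {C1: 1/3, C2: -2/9}
```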

Exercise 2.5.1

Check that \(y\) really solves the equation (2.5.1) and the given initial conditions.

Note: A common mistake is to solve for constants using the initial conditions with \(y_c\) and only add the particular solution \(y_p\) after that. That will not work. You need to first compute \(y = y_c + y_p \) and only then solve for the constants using the initial conditions.

A right hand side consisting of exponentials, sines, and cosines can be handled similarly. For example,

\[ y'' + 2y' + 2y = \cos (2x) \]

Let us find some \(y_p\). We start by guessing that the solution includes some multiple of \( \cos (2x) \). We may also have to add a multiple of \( \sin (2x) \) to our guess, since derivatives of cosine are sines. We try

\[ y_p = A \cos (2x) + B \sin (2x) \]

We plug \( y_p\) into the equation and we get

\[-4A \cos (2x) - 4B \sin (2x) - 4A \sin (2x) + 4B \cos (2x) + 2A \cos (2x) + 2B \sin (2x) = \cos (2x) \]

The left hand side must equal the right hand side. We group terms and we get \( -4A + 4B + 2A = 1\) and \(-4B - 4A + 2B = 0 \). So \( -2A + 4B = 1 \) and \(2A + B = 0 \), and hence \( A = - \dfrac {1}{10} \) and \( B = \dfrac {1}{5} \). So

\[ y_p = A \cos (2x) + B \sin (2x) = \dfrac { - \cos (2x) + 2 \sin (2x) }{10} \]
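SymPy can confirm the guess-and-match computation (a verification sketch, not part of the method itself):

```python
import sympy as sp

x = sp.symbols('x')

# The particular solution found by undetermined coefficients.
yp = (-sp.cos(2*x) + 2*sp.sin(2*x))/10

# Plug into y'' + 2y' + 2y; the result should be cos(2x).
lhs = sp.diff(yp, x, 2) + 2*sp.diff(yp, x) + 2*yp
print(sp.simplify(lhs))  # cos(2*x)
```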

Similarly, if the right hand side contains exponentials we try exponentials. For example, for

\[ Ly = e^{3x} \]

we will try \( y = Ae^{3x} \) as our guess and try to solve for \( A\).

When the right hand side is a multiple of sines, cosines, exponentials, and polynomials, we can use the product rule for differentiation to come up with a guess. We need to guess a form for \(y_p\) such that \( Ly_p\) is of the same form, and has all the terms needed to match the right hand side. For example,

\[ Ly = ( 1 + 3x^2) e^{-x} \cos ( \pi x ) \]

For this equation, we will guess

\[ y_p = ( A + Bx + Cx^2)e^{-x} \cos (\pi x) + ( D + Ex + Fx^2 ) e^{-x} \sin ( \pi x ) \]

We will plug in and then hopefully get equations that we can solve for \( A, B, C, D, E \), and \( F \). As you can see, this can quickly turn into a very long and tedious calculation.

There is one hiccup in all this. It could be that our guess actually solves the associated homogeneous equation. That is, suppose we have

\[ y'' - 9y = e^{3x} \]

We would love to guess \(y = Ae^{3x} \), but if we plug this into the left hand side of the equation we get

\[y'' - 9y = 9Ae^{3x} - 9Ae^{3x} = 0 \ne e^{3x} \]

There is no way we can choose \(A\) to make the left hand side be \( e^{3x}\). The trick in this case is to multiply our guess by \(x\) to get rid of duplication with the complementary solution. That is, first we compute \(y_c\) (the solution to \(Ly = 0\))

\[ y_c = C_1e^{-3x} + C_2 e^{3x} \]

and we note that the \( e^{3x} \) term is a duplicate with our desired guess. We modify our guess to \( y = Axe^{3x} \) and notice there is no duplication anymore. Let us try. Note that \(y' = Ae^{3x} + 3Axe^{3x} \) and \( y'' = 6Ae^{3x} + 9Axe^{3x} \). So

\[y'' - 9y = 6Ae^{3x} + 9Axe^{3x} - 9Axe^{3x} = 6Ae^{3x} \]

Thus \( 6Ae^{3x}\) is supposed to equal \(e^{3x}\). Hence, \(6A = 1\) and so \(A = \dfrac {1}{6} \). We can now write the general solution as

\[ y = y_c + y_p = C_1e^{-3x} + C_2e^{3x} + \dfrac {1}{6} xe^{3x} \]
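Checking this general solution with SymPy (assuming SymPy is available):

```python
import sympy as sp

x, C1, C2 = sp.symbols('x C1 C2')

# General solution: complementary part plus the modified guess x*e^{3x}/6.
y = C1*sp.exp(-3*x) + C2*sp.exp(3*x) + x*sp.exp(3*x)/6

# Plug into y'' - 9y; the complementary terms drop out.
lhs = sp.diff(y, x, 2) - 9*y
print(sp.simplify(lhs))  # exp(3*x)
```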

It is possible that multiplying by \(x\) does not get rid of all duplication. For example,

\[ y'' - 6y' + 9y = e^{3x} \]

The complementary solution is \( y_c = C_1e^{3x} + C_2xe^{3x} \). Guessing \(y = Axe^{3x} \) would not get us anywhere. In this case we want to guess \(y_p = Ax^2e^{3x} \). Basically, we want to multiply our guess by \(x\) until all duplication is gone. But no more! Multiplying too many times will not work.
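The text leaves the coefficient undetermined here; carrying the computation through symbolically (a SymPy sketch) shows the twice-modified guess works and pins down \(A\):

```python
import sympy as sp

x, A = sp.symbols('x A')

# Guess multiplied by x twice, since both e^{3x} and x*e^{3x} solve Ly = 0.
yp = A*x**2*sp.exp(3*x)

# Plug into y'' - 6y' + 9y and match against e^{3x}.
lhs = sp.diff(yp, x, 2) - 6*sp.diff(yp, x) + 9*yp
print(sp.simplify(lhs))                      # 2*A*exp(3*x)
print(sp.solve(sp.Eq(lhs, sp.exp(3*x)), A))  # [1/2]
```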

Finally, what if the right hand side has several terms, such as

\[ Ly = e^{2x} + \cos x \]

In this case we find \(u\) that solves \(Lu = e^{2x} \) and \(v\) that solves \(Lv = \cos x\) (that is, do each term separately). Then note that if \(y = u + v\), then \(Ly = e^{2x} + \cos x\). This is because \(L\) is linear; we have \( Ly = L(u + v) = Lu + Lv = e^{2x} + \cos x \).

### 2.5.3 Variation of Parameters

The method of undetermined coefficients will work for many basic problems that crop up. But it does not work all the time. It only works when the right hand side of the equation \(Ly = f(x) \) has only finitely many linearly independent derivatives, so that we can write a guess that consists of them all. Some equations are a bit tougher. Consider

\[ y'' + y = \tan x\]

Note that each new derivative of \( \tan x\) looks completely different and cannot be written as a linear combination of the previous derivatives. We get \( \sec ^2 x, 2 \sec ^2 x \tan x, \text {etc} \dots \)

This equation calls for a different method. We present the method of variation of parameters, which will handle any equation of the form \( Ly = f(x) \), provided we can solve certain integrals. For simplicity, we restrict ourselves to second order constant coefficient equations, but the method works for higher order equations just as well (the computations become more tedious). The method also works for equations with nonconstant coefficients, provided we can solve the associated homogeneous equation.

Perhaps it is best to explain this method by example. Let us try to solve the equation

\[ Ly = y'' + y = \tan x\]

First we find the complementary solution (solution to \( Ly_c = 0\)). We get \(y_c = C_1y_1 + C_2y_2\), where \( y_1 = \cos x \) and \( y_2 = \sin x\). To find a particular solution to the nonhomogeneous equation we try

\[ y_p = y = u_1y_1 + u_2y_2 \]

where \(u_1\) and \( u_2\) are functions and not constants. We are trying to satisfy \(Ly = \tan x\). That gives us one condition on the functions \(u_1\) and \(u_2\). Compute (note the product rule!)

\[ y' = (u'_1y_1 + u'_2y_2) + (u_1y'_1 + u_2y'_2) \]

We can still impose one more condition at our discretion to simplify computations (we have two unknown functions, so we should be allowed two conditions). We require that \( (u'_1y_1 + u'_2y_2) = 0 \). This makes computing the second derivative easier.

\[y' = u_1y'_1 + u_2y'_2\]

\[y'' = (u'_1y'_1 + u'_2y'_2) + (u_1y''_1 + u_2y''_2) \]

Since \(y_1 = \cos x\) and \(y_2 = \sin x\) solve the homogeneous equation \(y'' + y = 0\), we have \(y''_1 = -y_1\) and \(y''_2 = -y_2\). Hence \(u_1y''_1 + u_2y''_2 = -(u_1y_1 + u_2y_2) = -y \), and so

\[ y'' = (u'_1y'_1 + u'_2y'_2) - y \]

and hence

\[ y'' + y = Ly = u'_1y'_1 + u'_2y'_2\]

For \(y\) to satisfy \(Ly = f(x) \) we must have \( f(x) =u'_1y'_1 + u'_2y'_2\).

So what we need to solve are the two equations (conditions) we imposed on \(u_1\) and \(u_2\)

\[ u'_1y_1 + u'_2y_2 = 0 \]

\[ u'_1y'_1 + u'_2y'_2 = f(x) \]

We can now solve for \(u'_1\) and \(u'_2\) in terms of \(f(x)\), \(y_1\), and \(y_2\). We will always get these formulas for any \( Ly = f(x) \), where \( Ly = y'' + p(x)y' + q(x)y \). There is a general formula for the solution we can just plug into, but it is better to simply repeat what we do below. In our case the two equations become

\[u'_1 \cos (x) + u'_2 \sin (x) = 0 \]

\[-u'_1 \sin (x) + u'_2 \cos (x) = \tan (x) \]

Hence

\[ u'_1 \cos (x) \sin (x) + u'_2 {\sin}^2 (x) = 0 \]

\[ -u'_1 \sin (x) \cos (x) + u'_2 {\cos}^2 (x) = \tan (x) \cos (x) = \sin (x) \]

\[ u'_2 ( {\sin}^2 (x) + {\cos}^2 (x)) = \sin (x) \]

\[u'_2 = \sin (x) \]

\[ u'_1 = \dfrac {-{\sin}^2 (x)}{ \cos (x)} = - \tan (x) \sin (x) \]
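The same two-equation linear system can be handed to SymPy directly (a sketch; the names `u1p` and `u2p` stand for \(u'_1\) and \(u'_2\)):

```python
import sympy as sp

x, u1p, u2p = sp.symbols('x u1p u2p')

# The two conditions imposed on u_1' and u_2'.
eqs = [sp.Eq(u1p*sp.cos(x) + u2p*sp.sin(x), 0),
       sp.Eq(-u1p*sp.sin(x) + u2p*sp.cos(x), sp.tan(x))]

sol = sp.solve(eqs, [u1p, u2p])
print(sp.simplify(sol[u2p]))                         # sin(x)
print(sp.simplify(sol[u1p] + sp.sin(x)*sp.tan(x)))   # 0, i.e. u1' = -tan(x)*sin(x)
```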

Now we need to integrate \(u'_1\) and \(u'_2\) to get \(u_1\) and \(u_2\).

\[ u_1 = \int u'_1 \, dx = \int - \tan (x) \sin (x) \, dx = \dfrac {1}{2} \ln \left| \dfrac {\sin (x) - 1}{\sin (x) + 1} \right| + \sin (x) \]

\[ u_2 = \int u'_2 \, dx = \int \sin (x) \, dx = - \cos (x) \]

\[ y_p = u_1y_1 + u_2y_2 = \dfrac {1}{2} \cos (x) \ln \left| \dfrac {\sin (x) - 1}{\sin (x) + 1} \right| + \cos (x) \sin (x) - \cos (x) \sin (x) = \dfrac {1}{2} \cos (x) \ln \left| \dfrac {\sin (x) - 1}{\sin (x) + 1} \right| \]

The general solution to \( y'' + y = \tan x\) is, therefore,

\[ y = C_1 \cos (x) + C_2 \sin (x) + \dfrac {1}{2} \cos (x) \ln \left| \dfrac {\sin (x) - 1}{\sin (x) + 1} \right| \]
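As a closing sanity check, SymPy can confirm that this general solution satisfies \(y'' + y = \tan x\) (a sketch; the logarithm terms cancel in pairs, leaving only \(\tan x\)):

```python
import sympy as sp

x, C1, C2 = sp.symbols('x C1 C2')

# Complementary solution plus the particular solution from variation of parameters.
yp = sp.cos(x)*sp.log((sp.sin(x) - 1)/(sp.sin(x) + 1))/2
y = C1*sp.cos(x) + C2*sp.sin(x) + yp

# The residual y'' + y - tan(x) should simplify to zero.
residual = sp.simplify(sp.diff(y, x, 2) + y - sp.tan(x))
print(residual)  # 0
```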

## Contributors

- Jiří Lebl (Oklahoma State University). These pages were supported by NSF grants DMS-0900885 and DMS-1362337.