1.1: Review of the First Course

In this section we review a few of the solution techniques encountered in a first course in differential equations. We will not review the basic theory, except for occasional reminders of what we are doing.

    We first recall that an \(n\)-th order ordinary differential equation is an equation for an unknown function \(y(x)\) that expresses a relationship between the unknown function and its first \(n\) derivatives. One could write this generally as

    \[F(y^{(n)}(x),y^{(n-1)}(x),...,y'(x),y(x),x) = 0 \label{1.1} \]

    Here \(y^{(n)}(x)\) represents the \(n\)th derivative of \(y(x)\).

An initial value problem consists of the differential equation plus the values of the unknown function and its first \(n-1\) derivatives at a particular value of the independent variable, say \(x_0\):

    \[y^{(n-1)}(x_0) = y_{n-1}, y^{(n-2)}(x_0) = y_{n-2}, ..., y(x_0) = y_0 \label{1.2} \]

    A linear \(n\)th order differential equation takes the form

\[a_n(x)y^{(n)}(x)+a_{n-1}(x)y^{(n-1)}(x)+...+a_1(x)y'(x)+a_0(x)y(x) = f(x) \label{1.3} \]

    If \(f(x) \equiv 0\), then the equation is said to be homogeneous, otherwise it is nonhomogeneous.

    1.1.1 First Order Differential Equations

    Typically, the first differential equations encountered are first order equations. A first order differential equation takes the form

    \[F(y',y,x) = 0 \label{1.4} \]

There are two general forms for which one can formally obtain a solution. The first is the separable case and the second is the linear first order equation. We say that solutions can be obtained formally because one can display the integrations that lead to a solution. However, the resulting integrals are not always reducible to elementary functions, nor does one always obtain explicit solutions even when the integrals are doable.

A first order equation is separable if it can be written in the form

    \[\dfrac{dy}{dx} = f(x)g(y) \label{1.5} \]

    Special cases result when either \(f(x) = 1\) or \(g(y) = 1\). In the first case the equation is said to be autonomous.

    The general solution to equation (1.5) is obtained in terms of two integrals:

    \[\int \dfrac{d y}{g(y)}=\int f(x) d x+C \label{1.6} \]

    where C is an integration constant. This yields a 1-parameter family of solutions to the differential equation corresponding to different values of \(C\). If one can solve (1.6) for \(y(x)\), then one obtains an explicit solution. Otherwise, one has a family of implicit solutions. If an initial condition is given as well, then one might be able to find a member of the family that satisfies this condition, which is often called a particular solution.

    Example 1.1. \(y' = 2xy, y(0) = 2\).

    Applying (1.6), one has

    \[\int \dfrac{dy}{y} = \int 2x dx + C. \nonumber \]

    Integrating yields

    \[\ln |y| = x^2 + C. \nonumber \]

    Exponentiating, one obtains the general solution,

    \[y(x)=\pm e^{x^{2}+C}=A e^{x^{2}} \nonumber \]

    Here we have defined \(A=\pm e^{C}\). Since \(C\) is an arbitrary constant, \(A\) is an arbitrary constant. Several solutions in this 1-parameter family are shown in Figure 1.1.

    Next, one seeks a particular solution satisfying the initial condition. For \(y(0)=2\), one finds that \(A=2\). So, the particular solution satisfying the initial conditions is \(y(x)=2 e^{x^{2}}\).
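One can check this computation with a computer algebra system. The following short SymPy sketch (an outside aid, not part of this text) solves the same initial value problem symbolically.

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# y' = 2*x*y with y(0) = 2, as in Example 1.1
ode = sp.Eq(y(x).diff(x), 2*x*y(x))
sol = sp.dsolve(ode, y(x), ics={y(0): 2})
print(sol)   # y(x) = 2*exp(x**2), matching the particular solution above
```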

    Example 1.2. \(yy' = -x\)

    Following the same procedure as in the last example, one obtains:

    \[\int y d y=-\int x d x+C \Rightarrow y^{2}=-x^{2}+A, \quad \text { where } \quad A=2 C \nonumber \]

    Thus, we obtain an implicit solution. Writing the solution as \(x^2+y^2=A\), we see that this is a family of circles for \(A > 0\) and the origin for \(A = 0\). Plots of some solutions in this family are shown in Figure 1.2.

    The second type of first order equation encountered is the linear first order differential equation in the form

    \[y'(x) + p(x)y(x) = q(x). \label{1.7} \]

Figure 1.1. Plots of solutions from the 1-parameter family of solutions of Example 1.1 for several initial conditions.

In this case one seeks an integrating factor, \(\mu(x)\), a function by which one can multiply the equation so that the left side becomes a perfect derivative. Doing so, one obtains

    \[\dfrac{d}{dx}[\mu(x)y(x)] = \mu(x)q(x). \label{1.8} \]

    The integrating factor that works is \(\mu(x)=\exp \left(\int^{x} p(\xi) d \xi\right)\). One can show this by expanding the derivative in Equation (1.8),

    \[\mu(x) y^{\prime}(x)+\mu^{\prime}(x) y(x)=\mu(x) q(x), \label{1.9} \]

    and comparing this equation to the one obtained from multiplying (1.7) by \(\mu(x)\):

    \[\mu(x) y^{\prime}(x)+\mu(x) p(x) y(x)=\mu(x) q(x). \label{1.10} \]

    Note that these last two equations would be the same if

    \[\dfrac{d\mu(x)}{dx} = \mu(x)p(x). \nonumber \]

    This is a separable first order equation whose solution is the above given form for the integrating factor,

Figure 1.2. Plots of solutions of Example 1.2 for several initial conditions.

    \[\mu(x)=\exp \left(\int^{x} p(\xi) d \xi\right). \label{1.11} \]

    Equation (1.8) is easily integrated to obtain

    \[y(x)=\dfrac{1}{\mu(x)}\left[\int^{x} \mu(\xi) q(\xi) d \xi+C\right]. \label{1.12} \]
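The two formulas (1.11) and (1.12) can be packaged into a short routine. The sketch below is one possible SymPy implementation; the helper name linear_first_order is ours, and the symbol \(C\) plays the role of the arbitrary constant in (1.12).

```python
import sympy as sp

x, C = sp.symbols('x C')

def linear_first_order(p, q):
    """Solve y' + p(x) y = q(x) using the integrating factor of Eq. (1.11)."""
    mu = sp.exp(sp.integrate(p, x))                          # mu(x) = exp(int p dx)
    return sp.simplify((sp.integrate(mu * q, x) + C) / mu)   # Eq. (1.12)

# The next example, Example 1.3, has standard form p(x) = 1/x, q(x) = 1
print(linear_first_order(1/x, sp.Integer(1)))                # x/2 + C/x
```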

    Example 1.3. \(xy' + y = x\), \(x > 0, y(1) = 0\).

One first notes that this is a linear first order differential equation. Solving for \(y'\), one can see that the equation is not separable. However, it is not yet in the standard form (1.7). So, we first rewrite the equation as

    \[\dfrac{d y}{d x}+\dfrac{1}{x} y=1. \label{1.13} \]

    Noting that \(p(x)=\dfrac{1}{x}\), we determine the integrating factor

    \[\mu(x)=\exp \left[\int^{x} \dfrac{d \xi}{\xi}\right]=e^{\ln x}=x \nonumber \]

Multiplying equation (1.13) by \(\mu(x) = x\), we actually get back the original equation! In this case we have found that \(xy'+y\) must have been the derivative of something to start. In fact, \((xy)' = xy' + y\). Therefore, equation (1.8) becomes

    \[(xy)' = x. \nonumber \]

    Integrating one obtains

    \(xy = \dfrac{1}{2}x^2 + C\),

    or

    \[y(x) = \dfrac{1}{2}x + \dfrac{C}{x}. \nonumber \]

    Inserting the initial condition into this solution, we have \(0 = \dfrac{1}{2} + C\). Therefore, \(C = -\dfrac{1}{2}\). Thus, the solution of the initial value problem is \(y(x) = \dfrac{1}{2}(x - \dfrac{1}{x})\).
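The same initial value problem can also be handed directly to SymPy's dsolve as a check.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
y = sp.Function('y')

# x*y' + y = x with y(1) = 0, as in Example 1.3
ode = sp.Eq(x*y(x).diff(x) + y(x), x)
print(sp.dsolve(ode, y(x), ics={y(1): 0}))   # y(x) = x/2 - 1/(2*x)
```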

Example 1.4. \((\sin x)y' + (\cos x)y = x^2\).

    Actually, this problem is easy if you realize that

\[\dfrac{d}{dx}((\sin x)y) = (\sin x)y' + (\cos x)y. \nonumber \]

    But, we will go through the process of finding the integrating factor for practice.

First, rewrite the original differential equation in standard form:

\[y' + (\cot x)y = x^2 \csc x. \nonumber \]

    Then, compute the integrating factor as

\[\mu(x)=\exp \left(\int^{x} \cot \xi \, d \xi\right)=e^{\ln (\sin x)}=\sin x. \nonumber \]

    Using the integrating factor, the original equation becomes

    \[\dfrac{d}{d x}((\sin x) y)=x^{2}. \nonumber \]

    Integrating, we have

    \[y \sin x=\dfrac{1}{3} x^{3}+C. \nonumber \]

    So, the solution is

    \[y=\left(\dfrac{1}{3} x^{3}+C\right) \csc x. \nonumber \]

    There are other first order equations that one can solve for closed form solutions. However, many equations are not solvable, or one is simply interested in the behavior of solutions. In such cases one turns to direction fields. We will return to a discussion of the qualitative behavior of differential equations later in the course.
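A direction field is easy to produce numerically. The sketch below uses NumPy and Matplotlib (neither appears elsewhere in this text) to draw the field for the illustrative right-hand side \(y' = x - y^2\), chosen here purely as an example.

```python
import numpy as np
import matplotlib.pyplot as plt

# Direction field for y' = f(x, y); this f is an illustrative choice, not from the text.
def f(x, y):
    return x - y**2

X, Y = np.meshgrid(np.linspace(-3, 3, 25), np.linspace(-3, 3, 25))
S = f(X, Y)
L = np.sqrt(1 + S**2)                  # normalize so every arrow has the same length
plt.quiver(X, Y, 1/L, S/L, angles='xy')
plt.xlabel('x'); plt.ylabel('y')
plt.title("Direction field for y' = x - y**2")
plt.show()
```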

    1.1.2 Second Order Linear Differential Equations

    Second order differential equations are typically harder than first order. In most cases students are only exposed to second order linear differential equations. A general form for a second order linear differential equation is given by

    \[a(x)y''(x)+b(x)y'(x)+c(x)y(x) = f(x). \label{1.14} \]

    One can rewrite this equation using operator terminology. Namely, one first defines the differential operator \(L = a(x)D^2 + b(x)D + c(x)\), where \(D = \dfrac{d}{dx}\). Then equation (1.14) becomes

    \[Ly = f \label{1.15} \]

    The solutions of linear differential equations are found by making use of the linearity of \(L\). Namely, we consider the vector space consisting of real-valued functions over some domain. Let \(f\) and \(g\) be vectors in this function space. \(L\) is a linear operator if for two vectors \(f\) and \(g\) and scalar \(a\), we have that

    a. \(L(f+g) = Lf + Lg\)

    b. \(L(af) = aLf\)

    One typically solves (1.14) by finding the general solution of the homogeneous problem,

    \[Ly_h = 0 \nonumber \]

    and a particular solution of the nonhomogeneous problem,

    \[Ly_p = f. \nonumber \]

    Then the general solution of (1.14) is simply given as \(y = y_h + y_p\). This is true because of the linearity of \(L\). Namely,

    \[\begin{aligned}
Ly &= L(y_h + y_p) \\
    &= Ly_h + Ly_p \\
    &= 0 + f = f
    \end{aligned} \label{1.16} \]

There are methods for finding a particular solution of a differential equation. These range from pure guessing to the Method of Undetermined Coefficients to the Method of Variation of Parameters. We will review some of these methods later.

    Determining solutions to the homogeneous problem, \(Ly_h = 0\), is not always easy. However, others have studied a variety of second order linear equations and have saved us the trouble for some of the differential equations that often appear in applications.

Again, linearity is useful in producing the general solution of a homogeneous linear differential equation. If \(y_1\) and \(y_2\) are solutions of the homogeneous equation, then the linear combination \(y = c_1y_1 + c_2y_2\) is also a solution of the homogeneous equation. In fact, if \(y_1\) and \(y_2\) are linearly independent, then \(y = c_1y_1 + c_2y_2\) is the general solution of the homogeneous problem. As you may recall, linear independence is established if the Wronskian of the solutions is not zero. In this case, we have

    \[W(y_1,y_2) = y_1(x)y'_2(x) - y'_1(x)y_2(x) \neq 0. \label{1.17} \]
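As a quick symbolic check of (1.17), the SymPy sketch below computes the Wronskian of the two exponential solutions that will appear in the next subsection; it vanishes only when the two exponents coincide.

```python
import sympy as sp

x, r1, r2 = sp.symbols('x r1 r2')
y1, y2 = sp.exp(r1*x), sp.exp(r2*x)

# W(y1, y2) = y1*y2' - y1'*y2, Eq. (1.17)
W = sp.simplify(y1*sp.diff(y2, x) - sp.diff(y1, x)*y2)
print(W)   # (r2 - r1)*exp((r1 + r2)*x): nonzero exactly when r1 != r2
```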

    1.1.3 Constant Coefficient Equations

    The simplest and most seen second order differential equations are those with constant coefficients. The general form for a homogeneous constant coefficient second order linear differential equation is given as

    \[ay''(x) + by'(x) + cy(x) = 0 \label{1.18} \]

    where \(a,b\), and \(c\) are constants.

    Solutions to (1.18) are obtained by making a guess of \(y(x) = e^{rx}\). Inserting this guess into (1.18) leads to the characteristic equation

    \[ar^2 + br + c = 0 \label{1.19} \]

    The roots of this equation in turn lead to three types of solution depending upon the nature of the roots as shown below.

    Example 1.5. \(y'' - y' - 6y = 0\); \(y(0) = 2, y'(0) = 0\).

    The characteristic equation for this problem is \(r^2 - r - 6 = 0\). The roots of this equation are found as \(r = −2, 3\). Therefore, the general solution can be quickly written down:

    \[y(x) = c_1e^{-2x} + c_2e^{3x}. \nonumber \]

    Note that there are two arbitrary constants in the general solution. Therefore, one needs two pieces of information to find a particular solution. Of course, we have the needed information in the form of the initial conditions.

    One also needs to evaluate the first derivative

    \[y'(x) = -2c_1e^{-2x} + 3c_2e^{3x} \nonumber \]

    in order to attempt to satisfy the initial conditions. Evaluating \(y\) and \(y′\) at \(x = 0\) yields

    \[\begin{aligned}
    2 &= c_1 + c_2 \\
    0 &= -2c_1 + 3c_2
    \end{aligned} \label{1.20} \]

    These two equations in two unknowns can readily be solved to give \(c_1 = 6/5\) and \(c_2 = 4/5\). Therefore, the solution of the initial value problem is obtained as \(y(x) = \dfrac{6}{5}e^{-2x} + \dfrac{4}{5}e^{3x}\).
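The same initial value problem can be handed to SymPy's dsolve (an outside check, not part of the text), which reproduces the constants found above.

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# y'' - y' - 6y = 0 with y(0) = 2, y'(0) = 0, as in Example 1.5
ode = sp.Eq(y(x).diff(x, 2) - y(x).diff(x) - 6*y(x), 0)
sol = sp.dsolve(ode, y(x), ics={y(0): 2, y(x).diff(x).subs(x, 0): 0})
print(sol)   # y(x) = 6*exp(-2*x)/5 + 4*exp(3*x)/5
```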

    Classification of Roots of the Characteristic Equation for Second Order Constant Coefficient ODEs
    1. Real, distinct roots \(r_1, r_2\). In this case the solutions corresponding to each root are linearly independent. Therefore, the general solution is simply \(y(x) = c_1e^{r_1x} + c_2e^{r_2x}\).
    2. Real, equal roots \(r_1 = r_2 = r\). In this case the solutions corresponding to each root are linearly dependent. To find a second linearly independent solution, one uses the Method of Reduction of Order. This gives the second solution as \(xe^{rx}\). Therefore, the general solution is found as \(y(x) = (c_1 + c_2 x)e^{rx}\). [This is covered in the appendix to this chapter].
    3. Complex conjugate roots \(r_1, r_2 = \alpha \pm i \beta\). In this case the solutions corresponding to each root are linearly independent. Making use of Euler's identity, \(e^{i \theta} = \cos(\theta) + i \sin(\theta)\), these complex exponentials can be rewritten in terms of trigonometric functions. Namely, one has that \(e^{\alpha x} \cos(\beta x)\) and \(e^{\alpha x} \sin(\beta x)\) are two linearly independent solutions. Therefore, the general solution becomes \(y(x) = e^{\alpha x}(c_1 \cos(\beta x) + c_2 \sin(\beta x))\). [This is covered in the appendix to this chapter.]
    Example 1.6. \(y'' + 6y' + 9y = 0\).

    In this example we have \(r^2 + 6r + 9 = 0\). There is only one root, \(r=-3\). Again, the solution is easily obtained as \(y(x) = (c_1 + c_2 x)e^{-3x}\).

    Example 1.7. \(y'' + 4y = 0\).

The characteristic equation in this case is \(r^2 + 4 = 0\). The roots are pure imaginary, \(r = \pm 2i\), and the general solution consists purely of sinusoidal functions: \(y(x) = c_1 \cos(2x) + c_2 \sin(2x)\).

    Example 1.8. \(y'' + 2y' + 4y = 0\).

    The characteristic equation in this case is \(r^2 + 2r + 4 = 0\). The roots are complex, \(r = -1 \pm \sqrt{3}i \) and the general solution can be written as \(y(x) = [c_1 \cos(\sqrt{3}x) + c_2 \sin(\sqrt{3}x)] e^{-x}\).

    One of the most important applications of the equations in the last two examples is in the study of oscillations. Typical systems are a mass on a spring, or a simple pendulum. For a mass \(m\) on a spring with spring constant \(k > 0\), one has from Hooke’s law that the position as a function of time, \(x(t)\), satisfies the equation

    \[mx'' + kx = 0. \nonumber \]

    This constant coefficient equation has pure imaginary roots (\(\alpha = 0\)) and the solutions are pure sines and cosines. Such motion is called simple harmonic motion.

    Adding a damping term and periodic forcing complicates the dynamics, but is nonetheless solvable. The next example shows a forced harmonic oscillator.

    Example 1.9. \(y'' + 4y = \sin x\).

    This is an example of a nonhomogeneous problem. The homogeneous problem was actually solved in Example 1.7. According to the theory, we need only seek a particular solution to the nonhomogeneous problem and add it to the solution of the last example to get the general solution.

    The particular solution can be obtained by purely guessing, making an educated guess, or using the Method of Variation of Parameters. We will not review all of these techniques at this time. Due to the simple form of the driving term, we will make an intelligent guess of \(y_p(x) = A \sin x\) and determine what \(A\) needs to be. Recall, this is the Method of Undetermined Coefficients which we review in the next section. Inserting our guess in the equation gives \((−A + 4A) \sin x = \sin x\). So, we see that \(A = 1/3\) works. The general solution of the nonhomogeneous problem is therefore \(y(x) = c_1 \cos(2x) + c_2 \sin(2x) + \dfrac{1}{3} \sin x\).
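One can again confirm the result symbolically. The sketch below lets SymPy solve the forced equation, and the particular term \(\frac{1}{3}\sin x\) appears alongside the homogeneous solution.

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# y'' + 4y = sin(x), as in Example 1.9
ode = sp.Eq(y(x).diff(x, 2) + 4*y(x), sp.sin(x))
print(sp.dsolve(ode, y(x)))
# y(x) = C1*sin(2*x) + C2*cos(2*x) + sin(x)/3
```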

    1.1.4 Method of Undetermined Coefficients

    To date, we only know how to solve constant coefficient, homogeneous equations. How does one solve a nonhomogeneous equation like that in Equation (1.14),

    \[a(x)y''(x) + b(x)y'(x) + c(x)y(x) = f(x) \label{1.21} \]

    Recall, that one solves this equation by finding the general solution of the homogeneous problem,

    \[Ly_h = 0 \nonumber \]

    and a particular solution of the nonhomogeneous problem,

    \[Ly_p = f. \nonumber \]

    Then the general solution of (1.14) is simply given as \(y = y_h + y_p\). So, how do we find the particular solution?

You could guess a solution, but that is not usually possible without a little bit of experience. So we need some other methods. There are two main methods. In the first, the Method of Undetermined Coefficients, one makes an intelligent guess based on the form of \(f(x)\). In the second, the Method of Variation of Parameters, one systematically develops the particular solution. We will come back to the Method of Variation of Parameters later in the book.

    Let’s solve a simple differential equation highlighting how we can handle nonhomogeneous equations.

    Example 1.10. Consider the equation

    \[y'' + 2y' - 3y = 4 \label{1.22} \]


    The first step is to determine the solution of the homogeneous equation.

    Thus, we solve

    \[y_h^{''} + 2y_h^{'} - 3y_h = 0 \label{1.23} \]

    The characteristic equation is \(r^2 + 2r - 3 = 0\). The roots are \(r = 1, -3\). So, we can immediately write the solution

    \[y_h(x) = c_1e^x + c_2e^{-3x}. \nonumber \]

The second step is to find a particular solution of (1.22). What possible function can we insert into this equation such that only a 4 remains? If we try something proportional to \(x\), then we are left with a linear function after inserting \(x\) and its derivatives. Perhaps a constant function is enough, you might think. Simply trying \(y = 4\) does not work, but we could try an arbitrary constant, \(y = A\).

    Let's see. Inserting \(y = A\) into (1.22), we obtain

    \[-3A = 4. \nonumber \]

    Ah ha! We see that we can choose \(A = -\dfrac{4}{3}\) and this works. So, we have a particular solution \(y_p(x) = -\dfrac{4}{3}\). This step is done.

    Combining our two solutions, we have the general solution to the original nonhomogeneous equation (1.22). Namely,

    \[y(x) = y_h(x) + y_p(x) = c_1e^x + c_2e^{-3x} - \dfrac{4}{3}. \nonumber \]

    Insert this solution into the equation and verify that it is indeed a solution. If we had been given initial conditions, we could now use them to determine our arbitrary constants.

    What if we had a different source term? Consider the equation

    \[y'' +2y' - 3y = 4x \label{1.24} \]

    The only thing that would change is our particular solution. So, we need a guess.

We know a constant function does not work by the last example. So, let's try \(y_p = Ax\). Inserting this function into Equation (1.24), we obtain

    \[2A - 3Ax = 4x. \nonumber \]

    Picking \(A = -4/3\) would get rid of the \(x\) terms, but will not cancel everything. We still have a constant left. So, we need something more general.

    Let's try a linear function, \(y_p(x) = Ax + B\). Then we get after substitution into (1.24)

    \[2A - 3(Ax + B) = 4x. \nonumber \]

    Equating the coefficients of the different powers of \(x\) on both sides, we find a system of equations for the undetermined coefficients:

    \[\begin{aligned}
    2A - 3B &= 0 \\
    -3A &= 4
    \end{aligned} \label{1.25} \]

    These are easily solved to obtain

    \[\begin{aligned}
    A &= -\dfrac{4}{3} \\
    B &= \dfrac{2}{3}A = -\dfrac{8}{9}.
    \end{aligned} \label{1.26} \]

    So, our particular solution is

    \[y_p(x) = -\dfrac{4}{3}x - \dfrac{8}{9}. \nonumber \]

    This gives the general solution to the nonhomogeneous problem as

    \[y(x) = y_h(x) + y_p(x) = c_1e^x + c_2e^{-3x} - \dfrac{4}{3}x - \dfrac{8}{9}. \nonumber \]
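The bookkeeping of matching coefficients can also be automated. The following SymPy sketch forms the residual for the guess \(y_p = Ax + B\) in (1.24) and solves the resulting system, reproducing (1.26).

```python
import sympy as sp

x, A, B = sp.symbols('x A B')
yp = A*x + B

# Residual of y'' + 2y' - 3y - 4x for the guess y_p = A*x + B
residual = sp.expand(sp.diff(yp, x, 2) + 2*sp.diff(yp, x) - 3*yp - 4*x)
eqs = [residual.coeff(x, 1), residual.coeff(x, 0)]   # coefficients of x^1 and x^0
print(sp.solve(eqs, [A, B]))                         # {A: -4/3, B: -8/9}
```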

    There are general forms that you can guess based upon the form of the driving term, \(f(x)\). Some examples are given in Table 1.1.4. More general applications are covered in a standard text on differential equations. However, the procedure is simple. Given \(f(x)\) in a particular form, you make an appropriate guess up to some unknown parameters, or coefficients. Inserting the guess leads to a system of equations for the unknown coefficients. Solve the system and you have your solution. This solution is then added to the general solution of the homogeneous differential equation.

Table 1.1.4. Forms of the driving term \(f(x)\) and the corresponding guess for \(y_p(x)\):

\(f(x) = a_nx^n + a_{n-1}x^{n-1} + \cdots + a_1x + a_0\): guess \(A_nx^n + A_{n-1}x^{n-1} + \cdots + A_1x + A_0\)

\(f(x) = ae^{bx}\): guess \(Ae^{bx}\)

\(f(x) = a \cos \omega x + b \sin \omega x\): guess \(A \cos \omega x + B \sin \omega x\)

    Example 1.11. As a final example, let's consider the equation

    \[y'' + 2y' - 3y = 2e^{-3x}. \label{1.27} \]

    According to the above, we would guess a solution of the form \(y_p = Ae^{-3x}\).

    Inserting our guess, we find

    \[0 = 2e^{-3x}. \nonumber \]

    Oops! The coefficient, \(A\), disappeared! We cannot solve for it. What went wrong?

The answer lies in the general solution of the homogeneous problem. Note that \(e^x\) and \(e^{-3x}\) are solutions to the homogeneous problem. So, a multiple of \(e^{-3x}\) will not get us anywhere. It turns out that there is one further modification of the method. If our driving term contains terms that are solutions of the homogeneous problem, then we need to multiply our guess by the smallest power of \(x\) for which the product is no longer a solution of the homogeneous problem. Namely, we guess \(y_p(x) = Axe^{-3x}\). We compute the derivatives of our guess, \(y'_p = A(1-3x)e^{-3x}\) and \(y_p^{''} = A(9x - 6)e^{-3x}\). Inserting these into the equation, we obtain

    \([(9x - 6) + 2(1 - 3x) - 3x]Ae^{-3x} = 2e^{-3x}\),

    or

    \[-4A = 2. \nonumber \]

    So, \(A = -1/2\) and \(y_p(x) = -\dfrac{1}{2}xe^{-3x}\).
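As a check of the modified guess, one can insert \(y_p = Axe^{-3x}\) symbolically; the residual collapses to \(-4Ae^{-3x}\), as found above.

```python
import sympy as sp

x, A = sp.symbols('x A')
yp = A*x*sp.exp(-3*x)

# Left side of y'' + 2y' - 3y for the modified guess
residual = sp.simplify(sp.diff(yp, x, 2) + 2*sp.diff(yp, x) - 3*yp)
print(residual)                                        # -4*A*exp(-3*x)
print(sp.solve(sp.Eq(residual, 2*sp.exp(-3*x)), A))    # [-1/2]
```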

    Modified Method of Undetermined Coefficients

    In general, if any term in the guess \(y_p(x)\) is a solution of the homogeneous equation, then multiply the guess by \(x^k\), where \(k\) is the smallest positive integer such that no term in \(x^ky_p(x)\) is a solution of the homogeneous problem.

    1.1.5 Cauchy-Euler Equations

Another class of solvable linear differential equations that is of interest is the class of Cauchy-Euler equations. These are given by

    \[ax^2y''(x) + bxy'(x) + cy(x) = 0. \label{1.28} \]

    Note that in such equations the power of \(x\) in each of the coefficients matches the order of the derivative in that term. These equations are solved in a manner similar to the constant coefficient equations.

    One begins by making the guess \(y(x) = x^r\). Inserting this function and its derivatives,

    \(y'(x) = rx^{r-1}, \quad y''(x) = r(r-1)x^{r-2}\),

    into Equation (1.28), we have

    \[[ar(r-1) + br + c]x^r = 0. \nonumber \]

    Since this has to be true for all \(x\) in the problem domain, we obtain the characteristic equation

    \[ar(r-1) + br + c = 0. \label{1.29} \]

    Just like the constant coefficient differential equation, we have a quadratic equation and the nature of the roots again leads to three classes of solutions. These are shown below. Some of the details are provided in the next section.

    Classification of Roots of the Characteristic Equation for Cauchy-Euler Differential Equations
    1. Real, distinct roots \(r_1, r_2\). In this case the solutions corresponding to each root are linearly independent. Therefore, the general solution is simply \(y(x) = c_1x^{r_1} + c_2x^{r_2}\).
    2. Real, equal roots \(r_1 = r_2 = r\). In this case the solutions corresponding to each root are linearly dependent. To find a second linearly independent solution, one uses the Method of Reduction of Order. This gives the second solution as \(x^r \ln|x|\). Therefore, the general solution is found as \(y(x) = (c_1 + c_2 \ln|x|)x^r\).
3. Complex conjugate roots \(r_1,r_2 = \alpha \pm i \beta\). In this case the solutions corresponding to each root are linearly independent. Writing \(x^{\alpha \pm i\beta} = x^{\alpha}e^{\pm i\beta \ln|x|}\) and using Euler's identity, these complex powers can be rewritten in terms of trigonometric functions. Namely, one has that \(x^{\alpha} \cos(\beta \ln|x|)\) and \(x^{\alpha} \sin(\beta \ln|x|)\) are two linearly independent solutions. Therefore, the general solution becomes \(y(x) = x^{\alpha}(c_1 \cos(\beta \ln|x|) + c_2 \sin(\beta \ln|x|))\).
    Example 1.12. \(x^2y'' + 5xy' + 12y = 0\)

As with the constant coefficient equations, we begin by writing down the characteristic equation. Doing a simple computation,

    \[\begin{aligned}
    0 &= r(r-1) + 5r + 12 \\
    &= r^2 + 4r + 12 \\
    &= (r+2)^2 + 8, \\
    -8 &= (r+2)^2, \end{aligned} \label{1.30} \]

one determines that the roots are \(r = -2 \pm 2 \sqrt{2}i\). Therefore, the general solution is \(y(x) = [c_1 \cos(2 \sqrt{2} \ln|x|) + c_2 \sin(2\sqrt{2} \ln|x|)]x^{-2}\).
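SymPy's dsolve also recognizes Cauchy-Euler equations, so this result can be checked directly (declaring \(x > 0\) so that \(\ln x\) is well defined).

```python
import sympy as sp

x = sp.symbols('x', positive=True)
y = sp.Function('y')

# x^2 y'' + 5x y' + 12y = 0, as in Example 1.12
ode = sp.Eq(x**2*y(x).diff(x, 2) + 5*x*y(x).diff(x) + 12*y(x), 0)
print(sp.dsolve(ode, y(x)))
# y(x) = (C1*sin(2*sqrt(2)*log(x)) + C2*cos(2*sqrt(2)*log(x)))/x**2
```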

Example 1.13. \(t^2y'' + 3ty' + y = 0, \quad y(1) = 0, y'(1) = 1\).

    For this example the characteristic equation takes the form

    \(r(r-1)+3 r+1=0\),

    or

    \[r^{2}+2 r+1=0. \nonumber \]

    There is only one real root, \(r=-1\). Therefore, the general solution is

    \[y(t)=\left(c_{1}+c_{2} \ln |t|\right) t^{-1} \nonumber \]

    However, this problem is an initial value problem. At \(t=1\) we know the values of \(y\) and \(y^{\prime}\). Using the general solution, we first have that

    \[0=y(1)=c_{1} \nonumber \]

    Thus, we have so far that \(y(t)=c_{2} \ln |t| t^{-1}\). Now, using the second condition and

    \[y^{\prime}(t)=c_{2}(1-\ln |t|) t^{-2} \nonumber \]

    we have

\[1=y^{\prime}(1)=c_{2}. \nonumber \]

    Therefore, the solution of the initial value problem is \(y(t)=\ln |t| t^{-1}\).
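The initial value problem can likewise be checked symbolically; the sketch below reproduces \(y(t) = \ln|t| \, t^{-1}\) for \(t > 0\).

```python
import sympy as sp

t = sp.symbols('t', positive=True)
y = sp.Function('y')

# t^2 y'' + 3t y' + y = 0 with y(1) = 0, y'(1) = 1, as in Example 1.13
ode = sp.Eq(t**2*y(t).diff(t, 2) + 3*t*y(t).diff(t) + y(t), 0)
sol = sp.dsolve(ode, y(t), ics={y(1): 0, y(t).diff(t).subs(t, 1): 1})
print(sol)   # y(t) = log(t)/t
```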

    Nonhomogeneous Cauchy-Euler Equations

    We can also solve some nonhomogeneous Cauchy-Euler equations using the Method of Undetermined Coefficients. We will demonstrate this with a couple of examples.

Example 1.14. Find the solution of \(x^2y'' - xy' - 3y = 2x^2\).

    First we find the solution of the homogeneous equation. The characteristic equation is \(r^{2}-2 r-3=0\). So, the roots are \(r=-1,3\) and the solution is \(y_{h}(x)=c_{1} x^{-1}+c_{2} x^{3}\).

    We next need a particular solution. Let's guess \(y_{p}(x)=A x^{2}\). Inserting the guess into the nonhomogeneous differential equation, we have

\[\begin{aligned}
2 x^{2} &=x^{2} y^{\prime \prime}-x y^{\prime}-3 y \\
&=2 A x^{2}-2 A x^{2}-3 A x^{2} \\
&=-3 A x^{2}
\end{aligned} \label{1.31} \]

    So, \(A=-2 / 3\). Therefore, the general solution of the problem is

    \[y(x)=c_{1} x^{-1}+c_{2} x^{3}-\dfrac{2}{3} x^{2}. \nonumber \]

    Example 1.15. Find the solution of \(x^2y'' - xy' - 3y = 2x^3\).

    In this case the nonhomogeneous term is a solution of the homogeneous problem, which we solved in the last example. So, we will need a modification of the method. We have a problem of the form

    \[a x^{2} y^{\prime \prime}+b x y^{\prime}+c y=d x^{r} \nonumber \]

    where \(r\) is a solution of \(a r(r-1)+b r+c=0\). Let's guess a solution of the form \(y=A x^{r} \ln x\). Then one finds that the differential equation reduces to \(A x^{r}(2 a r-a+b)=d x^{r}\). [You should verify this for yourself.]

    With this in mind, we can now solve the problem at hand. Let \(y_{p}= A x^{3} \ln x\). Inserting into the equation, we obtain \(4 A x^{3}=2 x^{3}\), or \(A=1 / 2\). The general solution of the problem can now be written as

    \[y(x)=c_{1} x^{-1}+c_{2} x^{3}+\dfrac{1}{2} x^{3} \ln x \nonumber \]
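As a final symbolic check, substituting \(y_p = \frac{1}{2}x^3 \ln x\) into the left side returns \(2x^3\), confirming the particular solution.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
yp = sp.Rational(1, 2)*x**3*sp.log(x)

# Left side of x^2 y'' - x y' - 3y evaluated at the particular solution
lhs = x**2*sp.diff(yp, x, 2) - x*sp.diff(yp, x) - 3*yp
print(sp.simplify(lhs))   # 2*x**3
```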


This page titled 1.1: Review of the First Course is shared under a CC BY-NC-SA 3.0 license and was authored, remixed, and/or curated by Russell Herman via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.