
9.0: Frobenius’ Method


    Series solutions of O.D.E.s (Frobenius’ method)

    Let us look at a very simple (ordinary) differential equation, \[y''(t) = t\,y(t),\] with initial conditions \(y(0) = a\), \(y'(0)=b\). Let us assume that there is a solution that is analytic near \(t=0\). This means that near \(t=0\) the function has a Taylor series

    \[y(t) = c_0 + c_1 t + \ldots = \sum_{k=0}^\infty c_k t^k.\]

    (We shall ignore questions of convergence.) Let us proceed

    \[\begin{align} y'(t) &= c_1 + 2c_2 t +\ldots = \sum_{k=1}^\infty k c_k t^{k-1}, \nonumber\\ y''(t) &= 2c_2+3\cdot 2\,c_3\, t +\ldots = \sum_{k=2}^\infty k(k-1) c_k t^{k-2}, \nonumber\\ t\,y(t) &= c_0t + c_1 t^2 + \ldots = \sum_{k=0}^\infty c_k t^{k+1}.\end{align}\]

    Combining this we have \[\begin{align} y''-ty &= [2c_2+3\cdot 2\,c_3\, t +\ldots] - [c_0t + c_1 t^2 + \ldots] \nonumber\\ &= 2c_2+(3\cdot2 c_3-c_0)t+\ldots\nonumber\\ &= 2c_2+\sum_{k=3}^\infty\left\{k(k-1)c_k-c_{k-3}\right\}t^{k-2}.\end{align}\]

    Here we have collected terms of equal power of \(t\). The reason is simple: we are requiring a power series to equal \(0\). The only way that can work is if each power of \(t\) in the power series has zero coefficient. (Compare a finite polynomial.) We thus find \[c_2=0,\;\;k(k-1) c_k = c_{k-3}.\] The last relation is called a recurrence or recursion relation, which we can use to bootstrap from given values, in this case \(c_0\) and \(c_1\). Once we know these two numbers, we can determine \(c_3\), \(c_4\) and \(c_5\):

    \[c_3= \frac{1}{6}c_0,\;\;\;c_4= \frac{1}{12}c_1,\;\;\;c_5=\frac{1}{20}c_2=0.\]
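    This bootstrap is easy to automate. Here is a minimal sketch in plain Python (the function name `series_coefficients` is my own choice, not from any library), using exact fractions to avoid rounding:

    ```python
    from fractions import Fraction

    def series_coefficients(c0, c1, kmax):
        """Taylor coefficients of y'' = t*y from the recurrence
        k*(k-1)*c_k = c_{k-3}, together with c_2 = 0."""
        c = [Fraction(c0), Fraction(c1), Fraction(0)]
        for k in range(3, kmax + 1):
            c.append(c[k - 3] / (k * (k - 1)))
        return c

    # With c_0 = 1 and c_1 = 1: c_3 = 1/6, c_4 = 1/12, c_5 = 0, ...
    print(series_coefficients(1, 1, 8))
    ```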

    The coefficients \(c_3\), \(c_4\) and \(c_5\) in turn determine \(c_6\), \(c_7\), \(c_8\), etc. It is not too hard to find an explicit expression for the \(c\)’s:

    \[\begin{align} c_{3m} &= \frac{3m-2}{(3m)(3m-1)(3m-2)} c_{3(m-1)} \nonumber\\ &= \frac{3m-2}{(3m)(3m-1)(3m-2)} \frac{3m-5}{(3m-3)(3m-4)(3m-5)} c_{3(m-2)} \nonumber\\ &= \frac{(3m-2)(3m-5)\cdots 1}{(3m)!} c_0, \nonumber\\ c_{3m+1} &= \frac{3m-1}{(3m+1)(3m)(3m-1)} c_{3(m-1)+1} \nonumber\\ &= \frac{3m-1}{(3m+1)(3m)(3m-1)} \frac{3m-4}{(3m-2)(3m-3)(3m-4)} c_{3(m-2)+1} \nonumber\\ &= \frac{(3m-1)(3m-4)\cdots 2}{(3m+1)!} c_1, \nonumber\\ c_{3m+2} &= 0.\end{align}\]

    The general solution is thus

    \[y(t) = a \left[1+\sum_{m=1}^\infty c_{3m}t^{3m}\right] + b \left[t+\sum_{m=1}^\infty c_{3m+1}t^{3m+1}\right],\] where the coefficients in the first bracket are evaluated with \(c_0=1\) and those in the second with \(c_1=1\).
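    As a sanity check, one can compare a truncated series against direct numerical integration of \(y''=ty\). A sketch assuming `scipy` is available; the evaluation point and truncation order are arbitrary choices:

    ```python
    from scipy.integrate import solve_ivp

    a, b = 1.0, 0.5                 # y(0) = a, y'(0) = b
    c = [a, b, 0.0]                 # bootstrap the Taylor coefficients
    for k in range(3, 31):
        c.append(c[k - 3] / (k * (k - 1)))

    t1 = 0.8
    series = sum(ck * t1**k for k, ck in enumerate(c))

    # Integrate y'' = t*y as the first-order system (y, y')' = (y', t*y)
    sol = solve_ivp(lambda t, y: [y[1], t * y[0]], (0.0, t1), [a, b],
                    rtol=1e-10, atol=1e-12)
    print(series, sol.y[0, -1])     # the two numbers should agree closely
    ```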

    The technique sketched here can be proven to work for any differential equation \[y''(t)+p(t)y'(t)+q(t)y(t)=f(t)\] provided that \(p(t)\), \(q(t)\) and \(f(t)\) are analytic at \(t=0\). Thus if \(p\), \(q\) and \(f\) have a power series expansion, so does \(y\).

    Singular points

    As usual there is a snag. Most equations of interest are of a form where \(p\) and/or \(q\) are singular at some point \(t_0\) (usually \(t_0=0\)). Any point \(t_0\) where \(p(t)\) or \(q(t)\) is singular is called (surprise!) a singular point. Of most interest is a special class of singular points called regular singular points, where the differential equation can be given as \[(t-t_0)^2 y''(t) + (t-t_0) \alpha(t) y'(t) + \beta(t)y(t) = 0,\] with \(\alpha\) and \(\beta\) analytic at \(t=t_0\). Let us assume that this point is \(t_0=0\). Frobenius’ method consists of the following technique: in the equation \[x^2 y''(x) + x \alpha(x) y'(x) + \beta(x)y(x) = 0,\] we assume a generalised series solution of the form \[y(x)=x^\gamma \sum_{n=0}^\infty c_n x^n .\] Equating powers of \(x\) we find \[\gamma(\gamma-1) c_0 x^\gamma + \alpha_0 \gamma c_0 x^\gamma + \beta_0c_0 x^\gamma = 0,\] etc. The equation for the lowest power of \(x\) can be rewritten as \[\gamma(\gamma-1) + \alpha_0\gamma + \beta_0 = 0.\] This is called the indicial equation. It is a quadratic equation in \(\gamma\) that usually has two (complex) roots. Let me call these \(\gamma_1\), \(\gamma_2\). If \(\gamma_1-\gamma_2\) is not an integer, one can prove that the two series solutions for \(y\) with these two values of \(\gamma\) are independent solutions.
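    Since the indicial equation is just a quadratic in \(\gamma\), it is easy to solve symbolically. A small sketch with `sympy` (the helper `indicial_roots` is my own, not a library routine):

    ```python
    import sympy as sp

    g = sp.symbols('gamma')

    def indicial_roots(alpha0, beta0):
        """Roots of gamma*(gamma - 1) + alpha0*gamma + beta0 = 0."""
        return sp.solve(g * (g - 1) + alpha0 * g + beta0, g)

    # The example below has alpha(t) = 3/2 and beta(t) = t,
    # so alpha_0 = 3/2 and beta_0 = 0:
    print(indicial_roots(sp.Rational(3, 2), 0))   # [-1/2, 0]
    ```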

    Let us look at an example: \[t^2 y''(t) + \frac{3}{2} t y'(t) + t\,y(t) = 0.\] Here \(\alpha(t)=3/2\), \(\beta(t)=t\), so \(t=0\) is indeed a regular singular point. The indicial equation is \[\gamma(\gamma-1)+\frac{3}{2}\gamma = \gamma^2+\gamma/2 = 0,\] which has roots \(\gamma_1=0\), \(\gamma_2=-1/2\). This gives two independent solutions \[\begin{align} y_1(t)&= \sum_{k}c_kt^k,\nonumber\\ y_2(t)&= t^{-1/2}\sum_{k}d_kt^k.\nonumber\end{align}\]

    Independent solutions:
    Independent solutions are really very similar to independent vectors: two or more functions are independent if none of them can be written as a combination of the others. Thus \(x\) and \(1\) are independent, while \(1\), \(1+x\) and \(2+x\) are dependent, since \(2+x = (1+x) + 1\).
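    For differentiable functions the Wronskian gives a practical test: if it is nonzero at some point, the functions are independent (a vanishing Wronskian does not by itself prove dependence). A sketch with `sympy`:

    ```python
    import sympy as sp
    from sympy import wronskian

    x = sp.symbols('x')

    # Nonzero Wronskian => linearly independent
    print(wronskian([sp.S(1), x], x))             # 1
    # 1, 1+x and 2+x are dependent; their Wronskian vanishes
    print(wronskian([sp.S(1), 1 + x, 2 + x], x))  # 0
    ```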

    Special cases

    For the two special cases I will just state the form of the solutions, since studying them requires a substantial amount of algebra.

    Two equal roots

    If the indicial equation has two equal roots, \(\gamma_1=\gamma_2\), we have one solution of the form \[y_1(t) = t^{\gamma_1} \sum_{n=0}^\infty c_n t^n.\] The other solution takes the form \[y_2(t) = y_1(t)\ln t +t^{\gamma_1+1} \sum_{n=0}^\infty d_n t^n.\] Notice that this last solution is always singular at \(t=0\), whatever the value of \(\gamma_1\)!

    Two roots differing by an integer

    If the roots of the indicial equation differ by an integer, \(\gamma_1-\gamma_2=n>0\), we have one solution of the form \[y_1(t) = t^{\gamma_1} \sum_{k=0}^\infty c_k t^k.\] The other solution takes the form \[y_2(t) = ay_1(t)\ln t +t^{\gamma_2} \sum_{k=0}^\infty d_k t^k.\] The constant \(a\) is determined by substitution, and in a few relevant cases is even \(0\), so that the solutions can be of the generalised series form.

    Example 1

    Find two independent solutions of \[t^2y''+ty'+ty=0\] near \(t=0\). The indicial equation is \(\gamma^2=0\), so we get one solution of the series form \[y_1(t) = \sum_n c_n t^n.\] We find \[\begin{align} t^2y''_1 &= \sum_n n(n-1) c_n t^n \nonumber\\ ty'_1 &= \sum_n n c_n t^n \nonumber\\ ty_1 &= \sum_nc_n t^{n+1} =\sum_{n'}c_{n'-1} t^{n'}\end{align}\] We add terms of equal power in \(t\), \[\begin{array}{rclclclclcl} t^2y''_1 &= 0&+& 0 t&+&2c_2t^2&+&6c_3t^3&+&\ldots\\ ty'_1 &= 0&+& c_1t&+&2c_2t^2&+&3c_3t^3&+&\ldots\\ ty_1 &= 0&+& c_0t&+&c_1t^2&+&c_2t^3&+&\ldots\\ \hline t^2y''+ty'+ty&= 0&+&(c_1+c_0)t&+&(4c_2+c_1)t^2&+&(9c_3+c_2)t^3&+&\ldots \end{array}\] Both of these ways give \[t^2y''+ty'+ty=\sum_{n=1}^\infty (c_n n^2+c_{n-1})t^n,\] and lead to the recurrence relation \[c_n = -\frac{1}{n^2} c_{n-1},\] which (taking \(c_0=1\)) has the solution \[c_n = (-1)^n \frac{1}{n!^2},\] and thus \[y_1(t) = \sum_{n=0}^\infty (-1)^n \frac{1}{n!^2} t^n.\] Let us look at the second solution \[y_2(t) =\ln(t) y_1(t)+\underbrace{t\sum_{n=0}^\infty d_n t^n}_{y_3(t)}.\] Here I replace the power series by a symbol, \(y_3\), for convenience. We find \[\begin{align} y_2' &= \ln(t) y_1' + \frac{y_1(t)}{t}+y_3'\nonumber\\ y_2'' &= \ln(t) y_1'' + \frac{2y'_1(t)}{t}-\frac{y_1(t)}{t^2}+y_3''\end{align}\] Taking all this together, we have \[\begin{align} t^2y_2''+ty_2'+ty_2 &= \ln(t)\left(t^2 {y_1}''+t {y_1}'+t{y_1}\right) -y_1+2ty'_1+y_1 + t^2 {y_3}''+t {y_3}'+ty_3 \nonumber\\ &= 2t{y_1}'+t^2{y_3}''+t{y_3}'+ty_3=0.\end{align}\] If we now substitute the series expansions for \(y_1\) and \(y_3\) we get, for the coefficient of \(t^{n+1}\), \[2(n+1)c_{n+1}+(n+1)^2 d_n+d_{n-1}=0,\] which can be manipulated to the form

    Here there is some missing material
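    Even with the closing steps missing, the first solution can be checked numerically: the series for \(y_1\) is the Bessel function \(J_0(2\sqrt{t})\) in disguise (substitute \(x=2\sqrt{t}\) in the standard series for \(J_0\)). A small sketch, assuming `scipy` is available:

    ```python
    import math
    from scipy.special import j0

    def y1(t, nmax=40):
        """Partial sum of y1(t) = sum_n (-1)^n t^n / (n!)^2."""
        return sum((-1)**n * t**n / math.factorial(n)**2 for n in range(nmax))

    # y1(t) should reproduce J_0(2*sqrt(t)):
    for t in (0.5, 1.0, 4.0):
        print(y1(t), j0(2.0 * math.sqrt(t)))
    ```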

    Example 2

    Find two independent solutions of \[t^2y''+t^2y'-ty=0\] near \(t=0\).

    The indicial equation is \(\gamma(\gamma-1)=0\), so that we have two roots differing by an integer: \(\gamma_1=1\), \(\gamma_2=0\). The solution for \(\gamma_1=1\) is \(y_1=t\), as can be checked by substitution. The other solution should be found in the form \[y_2(t) = at\ln t + \sum_{k=0}^\infty d_k t^k.\] We find \[\begin{align} y_2' &= a+a\ln t + \sum_{k=1}^\infty kd_k t^{k-1}, \nonumber \\ y_2'' &= a/t + \sum_{k=2}^\infty k(k-1)d_k t^{k-2}. \nonumber \\\end{align}\] We thus find \[\begin{align} t^2y''_2+t^2y'_2-ty_2= a(t+t^2)+ \sum_{k=1}^\infty \left[d_k k(k-1)+d_{k-1}(k-2)\right] t^k.\end{align}\] Collecting powers of \(t\) gives \[d_0 = a,\;\;\;2 d_2+a=0,\;\;\;d_k = -\frac{k-2}{k(k-1)}d_{k-1}\;\;(k>2),\] while \(d_1\) remains free (it multiplies a copy of \(y_1\), so we may set \(d_1=0\)). On fixing \(d_0=1\) (and hence \(a=1\), \(d_2=-1/2\)) we find \[y_2(t) = 1 + t \ln t + \sum_{k=2}^\infty (-1)^{k+1}\frac{(k-2)!}{(k-1)!\,k!} t^k.\]
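    As a closing check, one can substitute a truncated version of this \(y_2\) back into the equation with `sympy`; only the truncation error should survive. A minimal sketch (the cutoff \(K\) is an arbitrary choice):

    ```python
    import sympy as sp

    t = sp.symbols('t', positive=True)
    K = 8

    # Truncated second solution: y2 = 1 + t*ln(t) + sum_{k=2}^K d_k t^k,
    # with d_k = (-1)**(k+1) * (k-2)! / ((k-1)! * k!)
    y2 = 1 + t * sp.log(t) + sum(
        (-1)**(k + 1) * sp.factorial(k - 2)
        / (sp.factorial(k - 1) * sp.factorial(k)) * t**k
        for k in range(2, K + 1))

    residual = t**2 * sp.diff(y2, t, 2) + t**2 * sp.diff(y2, t) - t * y2
    print(sp.expand(residual))   # a single O(t**(K+1)) term from the truncation
    ```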