Mathematics LibreTexts

Taylor Expansion II

In this second chapter on Taylor series, we will study the case where the \(n\)th term in the series for an infinitely differentiable function does not go to zero. In such cases we must restrict the values of \(x\) to those for which the series converges rather than diverges. We will also take a look at the series for the exponential, logarithm, and sine functions.

Let us begin with another look at the series for an infinitely differentiable function, the square root of \(x\). The theory we develop here holds for any function raised to a positive or negative fractional exponent.

\[ \sqrt{x}\approx 1+\dfrac{(x-1)}{2}-\dfrac{(x-1)^2}{8}+\dfrac{(x-1)^3}{16}-\dfrac{5(x-1)^4}{128}+\dfrac{7(x-1)^5}{256}-\dfrac{21(x-1)^6}{1024}+\dfrac{231(x-1)^7}{14336} \]

Remember we let \(a=1\) to find the square root of any number \(x\). Take a close look at the fractions preceding each term:

\[\dfrac{1}{2},\dfrac{1}{8},\dfrac{1}{16},\dots,\dfrac{21}{1024},\dfrac{231}{14336} . \]

It seems that they are slowly going to zero, but now look at the terms of the Taylor Series:

\[(x-1)^1,(x-1)^2,(x-1)^3,\dots,(x-1)^n . \]

Since each term contains a power of the number \((x-1)\), we can conclude that if the absolute value of this number is greater than 1, then the terms of the series will grow larger and larger and approach infinity in the infinite term.

However, if the absolute value of \((x-1)\) is less than 1, then the value of each term decreases as the number of terms in the series increases. Therefore we can say that the series is valid only for:

\[\begin{align} &|x-1|<1 \\ \iff & -1<x-1<1 \\ \iff & 0<x<2 .\end{align}\]

This tells us that for \(x\) in this range of values, the infinite term will go to zero and the series will converge. Remember, the more times you multiply a fraction by itself, the smaller it becomes.
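This behavior is easy to check numerically. The following sketch (in Python; the number of terms and the two sample values of \(x\) are arbitrary choices) builds the binomial coefficients \(\binom{1/2}{k}\) recursively and compares the size of successive terms for one \(x\) inside the interval and one outside it:

```python
# Sizes of the terms c_k * (x-1)^k of the sqrt(x) series about a = 1,
# where c_k = C(1/2, k) is built recursively: c_{k+1} = c_k*(1/2-k)/(k+1).
coeff = 1.0
inside, outside = [], []
for k in range(12):
    inside.append(abs(coeff) * abs(1.5 - 1.0) ** k)   # x = 1.5, inside (0, 2)
    outside.append(abs(coeff) * abs(3.0 - 1.0) ** k)  # x = 3, outside (0, 2)
    coeff *= (0.5 - k) / (k + 1)

print(inside[-1] < inside[0])    # True: terms shrink toward zero
print(outside[-1] > outside[0])  # True: terms grow without bound
```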

Now that we have established the series on a certain interval, we can extend our reach to all positive real numbers. For any \(x>2\), the reciprocal \(\frac{1}{x}\) lies between 0 and 2, so we can find the square root of any number greater than 2 by applying the series to its reciprocal and then taking the reciprocal of the result. If we had to find \(\sqrt{47}\), we would rewrite it as \(\sqrt{47}=\dfrac{1}{\sqrt{1/47}}\). Since \(0<\frac{1}{47}<2\), we find the square root of \(\frac{1}{47}\) by letting \(x=\frac{1}{47}\) in the series, then take the reciprocal of that value.
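The reciprocal trick can be sketched in Python. The function names and the term count below are arbitrary choices; a generous number of terms is used because the series converges slowly when its argument is close to 0:

```python
import math

def sqrt_series(x, n_terms=400):
    """Taylor series of sqrt(x) about a = 1; converges for 0 < x < 2.
    Coefficients C(1/2, k) are built recursively as the sum accumulates."""
    total, coeff = 0.0, 1.0
    for k in range(n_terms):
        total += coeff * (x - 1.0) ** k
        coeff *= (0.5 - k) / (k + 1)
    return total

def sqrt_any(x):
    """sqrt of any x > 0: outside (0, 2), apply the series to 1/x
    and take the reciprocal of the result."""
    if 0 < x < 2:
        return sqrt_series(x)
    return 1.0 / sqrt_series(1.0 / x)

print(sqrt_any(47), math.sqrt(47))  # the two values agree closely
```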

Now let us move on to finding Taylor series for the exponential and logarithmic functions. The Taylor series for \(y=e^x\) can be found easily, since all of its derivatives are the same, \(e^x\). The series is then:

\[f(x)=e^x=e^a+e^a(x-a)+e^a\dfrac{(x-a)^2}{2!}+e^a\dfrac{(x-a)^3}{3!}+e^a\dfrac{(x-a)^4}{4!}+\dots .  \]

The easiest value to choose for \(a\) is 0, since \(e^0=1\):

\[f(x)=e^x=1+x+\dfrac{x^2}{2!}+\dfrac{x^3}{3!}+\dfrac{x^4}{4!}+\dfrac{x^5}{5!}+\dfrac{x^6}{6!}+\dfrac{x^7}{7!}+\dots \]

Since the limit as \(n\) goes to infinity of \(\frac{x^n}{n!}\) is zero, regardless of what value \(x\) is, the series is valid for any value of \(x\).
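This limit can be checked numerically. In the sketch below (Python; the value \(x=50\) and the checkpoints are arbitrary choices), each term \(x^n/n!\) is built from the previous one, since it is just the previous term times \(x/n\):

```python
# Track x^n / n! as n grows, for a deliberately large x.
# Each term is the previous one times x/n, so once n exceeds x
# every extra factor is below 1 and the terms collapse toward zero.
x = 50.0
term = 1.0
checkpoints = {}
for n in range(1, 201):
    term *= x / n            # term now equals x^n / n!
    if n in (10, 50, 100, 200):
        checkpoints[n] = term

print(checkpoints)   # rises at first, then shrinks toward zero
```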
Letting \(x=1\) and using only the first eight terms gives us the value for \(e\):

\[e\approx 1+1+\dfrac{1}{2!}+\dfrac{1}{3!}+\dfrac{1}{4!}+\dfrac{1}{5!}+\dfrac{1}{6!}+\dfrac{1}{7!} \]

\[\implies e\approx 2.718253968 .\]

The calculator value for \(e\) is 2.718281828, which corresponds to an error of about 0.001% using only eight terms. The more terms used, the more accurate the answer.
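The computation above can be reproduced in a couple of lines of Python:

```python
import math

# Sum of the first eight terms 1/k!, k = 0..7, of the series for e^x at x = 1.
e_approx = sum(1.0 / math.factorial(k) for k in range(8))
print(e_approx)           # ≈ 2.718253968
print(math.e - e_approx)  # error of roughly 2.8e-5
```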

Now that we have found the series for \(y=e^x\), we can find the Taylor series for \(y=\ln(x)\), which is also an infinitely differentiable function. First let us find its derivatives:

\[\begin{align} &f(x)=\ln(x), &f'(x)=\dfrac{1}{x}, &&f''(x)=-\dfrac{1}{x^2}, &&f'''(x)=\dfrac{2!}{x^3} \\ &f^{(4)}(x)=-\dfrac{3!}{x^4}, &f^{(5)}=\dfrac{4!}{x^5}, &&f^{(6)}=-\dfrac{5!}{x^6}, &&f^{(7)}=\dfrac{6!}{x^7} . \end{align}\]

Using the first seven derivatives we write the following Taylor series:

\[\begin{align} f(x)=\ln(x) &=\ln(a)+\dfrac{1}{a}(x-a)-\dfrac{1}{a^2}\dfrac{(x-a)^2}{2!}+\dfrac{2!}{a^3}\dfrac{(x-a)^3}{3!}-\dfrac{3!}{a^4}\dfrac{(x-a)^4}{4!} \\ &+\dfrac{4!}{a^5}\dfrac{(x-a)^5}{5!}-\dfrac{5!}{a^6}\dfrac{(x-a)^6}{6!}+\dfrac{6!}{a^7}\dfrac{(x-a)^7}{7!}.  \end{align}\]

Letting \(a\) equal 1 and simplifying factorials:

\[f(x)=\ln(x)=(x-1)-\dfrac{(x-1)^2}{2}+\dfrac{(x-1)^3}{3}-\dfrac{(x-1)^4}{4}+\dfrac{(x-1)^5}{5}-\dfrac{(x-1)^6}{6}+\dfrac{(x-1)^7}{7}-\dots .\]

There is one small problem here. Though the natural log of \(x\) is defined for all values of \(x\) greater than zero, the Taylor series is only valid for \(0<x<2\). If \(x\) were three, for example, the series on the right would diverge, as the terms \(\frac{2^n}{n}\) get larger and larger.

For an infinitely differentiable function such as \(y=\sqrt{x}\), we assumed that the last term in:

\[f(x)=\sum_{k=0}^{n-1} f^{(k)}(a)\dfrac{(x-a)^k}{k!}+f^{(n)}(c)\dfrac{(x-a)^n}{n!} \]

\(f^{(n)}(c)\dfrac{(x-a)^n}{n!}\) went to zero as \(n\) went to infinity. This was based on the fact that

\[\lim_{n\rightarrow \infty}c\cdot \dfrac{x^n}{n!}=c\lim_{n\rightarrow \infty} \dfrac{x^n}{n!}=c\cdot 0 = 0 . \]

However in the case of the series for the natural log of \(x\), the last few terms become:

\[\dfrac{(n-1)!}{a^n}\cdot \dfrac{(x-a)^n}{n!}=\dfrac{1}{a^n}\cdot \dfrac{(x-a)^n}{n} .\]

Taking the limit as \(n\) goes to infinity:

\[\lim_{n\rightarrow \infty} \dfrac{1}{a^n}\cdot \dfrac{(x-a)^n}{n}= \lim_{n\rightarrow\infty} \dfrac{1}{n}\left( \dfrac{x-a}{a} \right)^n  .\]

This limit clearly goes to infinity whenever \(\left|\dfrac{x-a}{a} \right|\) is greater than 1. Conversely, if \(\left| \dfrac{x-a}{a} \right|\) is less than 1, that is \(-a<x-a<a\), or \(0<x<2a\), then the last term goes to zero and the Taylor series holds true. With this restriction, the Taylor series for \(\ln(x)\) converges for \(0<x<2a\). Since \(a=1\), the series converges for:

\[\begin{align}\left|\dfrac{x-a}{a} \right|&<1 \\ |x-1|&<1 \\ 0<x&<2 . \end{align} \]

Hence the series:

\[f(x)=\ln(x)=(x-1)-\dfrac{(x-1)^2}{2}+\dfrac{(x-1)^3}{3}-\dfrac{(x-1)^4}{4}+\dfrac{(x-1)^5}{5}-\dfrac{(x-1)^6}{6}+\dfrac{(x-1)^7}{7}-\dots .\]

is valid for \(x\) between 0 and 2. Despite the restricted values of \(x\), we can still calculate \(\ln(x)\) for any \(x>0\) by taking reciprocals. If we wanted to find \(\ln(40)\), where \(40>2\), we would instead calculate \(\ln\left(\dfrac{1}{40}\right)\), since

\[\begin{align} \ln \left(\dfrac{1}{40}\right)&=-\ln(40) \\ \therefore \ln(x)&= -\ln \left( \dfrac{1}{x} \right) \\ \text{for } &x>1  .\end{align}\]
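Sketching the reciprocal trick in Python (the function names and term count are arbitrary choices; a generous term count is used because the series converges slowly when its argument is close to 0):

```python
import math

def ln_series(x, n_terms=1000):
    """Taylor series of ln(x) about a = 1; converges for 0 < x < 2."""
    return sum((-1) ** (k + 1) * (x - 1.0) ** k / k
               for k in range(1, n_terms + 1))

def ln_any(x):
    """ln of any x > 0, using ln(x) = -ln(1/x) when x lies outside (0, 2)."""
    if 0 < x < 2:
        return ln_series(x)
    return -ln_series(1.0 / x)

print(ln_any(40), math.log(40))  # the two values agree closely
```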

The Taylor series for \(\sin(x)\) and \(\cos(x)\) are also quite easy to find. We know the derivative of \(\sin(x)\) is \(\cos(x)\) and the derivative of \(\cos(x)\) is \(-\sin(x)\), and we can evaluate these functions at \(a=0\), where \(\sin(0)=0\) and \(\cos(0)=1\). First find the first few derivatives:

\[\begin{align} &f(x)=\sin(x), &f'(x)=\cos(x), &&f''(x)=-\sin(x), &&f'''(x)=-\cos(x) \\ &f^{(4)}(x)=\sin(x), &f^{(5)}(x)=\cos(x), &&f^{(6)}(x)=-\sin(x), &&f^{(7)}(x)=-\cos(x) .\end{align}\]

Using these, the Taylor series through the first eight terms is:

\[\begin{align} &f(x)=\sin(x)= \sin(a)+\cos(a)(x-a)-\sin(a)\dfrac{(x-a)^2}{2!} -\cos(a)\dfrac{(x-a)^3}{3!}  +\sin(a)\dfrac{(x-a)^4}{4!} \\ &+\cos(a)\dfrac{(x-a)^5}{5!} -\sin(a)\dfrac{(x-a)^6}{6!}-\cos(a)\dfrac{(x-a)^7}{7!}. \end{align}\]

Letting \(a=0\),

\[\begin{align} f(x)&=\sin(x)=0+x-0-1\cdot \dfrac{x^3}{3!}+0+1\cdot \dfrac{x^5}{5!}-0 -1\cdot \dfrac{x^7}{7!}+0 +\dots \\ \sin(x)&=x-\dfrac{x^3}{3!}+\dfrac{x^5}{5!}-\dfrac{x^7}{7!}+\dfrac{x^9}{9!}-\dfrac{x^{11}}{11!}+\dots .\end{align}\]

Since the infinite term in this series goes to zero as \(n\) goes to infinity, the series is convergent for all values of \(x\).

Differentiating the series gives us the Taylor series for \(\cos(x)\):

\[\cos(x)=1-\dfrac{x^2}{2!}+\dfrac{x^4}{4!}-\dfrac{x^6}{6!}+\dfrac{x^8}{8!}-\dfrac{x^{10}}{10!}+\dots  \]
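Both series can be sketched and checked against the library functions (in Python; ten terms is an arbitrary cut-off):

```python
import math

def sin_series(x, n_terms=10):
    """Taylor series of sin(x) about 0: x - x^3/3! + x^5/5! - ..."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

def cos_series(x, n_terms=10):
    """Term-by-term derivative of the sine series."""
    return sum((-1) ** k * x ** (2 * k) / math.factorial(2 * k)
               for k in range(n_terms))

print(sin_series(1.0), math.sin(1.0))          # close agreement
print(cos_series(math.pi), math.cos(math.pi))  # close agreement
```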

This result is remarkable. It allows us to define the sine and cosine functions mathematically in terms of an infinite series. You might be wondering how the series is defined for any arc length: \(\pi,\; 2\pi,\; 30\pi\), etc.

From the graph of the circle it is clear that its arc length is continuous and passes through the same points infinitely many times as it completes its rounds. For this reason our integral for the inverse sine function could only be solved using imaginary numbers. Our study of the sine function began with simplicity, rose to reason, climaxed in the abstract, fell into the imaginary, and now ends in perfection.

Our study of Taylor series showed us how, by integrating a function \(f^{(n)}(x)\) \(n\) times, we were able to express \(f(x)\) in terms of its derivatives evaluated at a point \(a\), with the last term evaluated at some point \(c\) between \(a\) and \(x\). The series took the form

\[f(x)=f(a)+f'(a)(x-a)+f''(a)\dfrac{(x-a)^2}{2!}+f'''(a)\dfrac{(x-a)^3}{3!}+\dots+f^{(n)}(c)\dfrac{(x-a)^n}{n!} \]

Taylor's Theorem thus states:

\[f(x)=\sum_{k=0}^{n-1} f^{(k)}(a)\dfrac{(x-a)^k}{k!}+f^{(n)}(c)\dfrac{(x-a)^n}{n!} \]

The important point to realize here is that \(n\) stands for an integer, such that a finite differentiable function can be expressed as a series of its \(n\) derivatives evaluated at some point \(a\). The last term, \(f^{(n)}(c)\dfrac{(x-a)^n}{n!}\), can be found regardless of the value of \(c\), since the last derivative of a finite differentiable function is always a constant.

As we saw, expressing finite differentiable functions in terms of a series of powers of \(x\) turned out to be impractical. It is here where we saw the value of extending Taylor's theorem to infinitely differentiable functions.  This would mean we would have an infinite series with infinite derivatives. The only way to prove that such a series could be written and would hold true for any value of \(x\) was by taking the limit of:

\[\lim_{n\rightarrow \infty}f^{(n)}(a)\dfrac{(x-a)^n}{n!}.  \]

If the limit was zero, then we knew that \(f^{(n)}(a)\dfrac{(x-a)^n}{n!}\), the last term of the series, also goes to zero, and therefore the series converges for all values of \(x\). If the limit did not go to zero, then we had to restrict the values of \(x\) so that the series would converge for those values. We could then extend our results to all the real numbers.

  • Integrated by Justin Marshall.