# 11.3: The Integral Test and Estimates of Sums

It is generally quite difficult, often impossible, to determine the value of a series exactly. In many cases it is possible at least to determine whether or not the series converges, and so we will spend most of our time on this problem.

If all of the terms \( a_n\) in a series are non-negative, then clearly the sequence of partial sums \( s_n\) is non-decreasing. This means that if we can show that the sequence of partial sums is bounded, the series must converge. We know that if the series converges, the terms \( a_n\) approach zero, but this does not mean that \( a_n\ge a_{n+1}\) for every \(n\). Many useful and interesting series do have this property, however, and they are among the easiest to understand. Let's look at an example.

Example 11.3.1

Show that \[\sum_{n=1}^\infty {1\over n^2}\] converges.

**Solution**

The terms \( 1/n^2\) are positive and decreasing, and since \(\lim_{x\to\infty} 1/x^2=0\), the terms \( 1/n^2\) approach zero. We seek an upper bound for all the partial sums, that is, we want to find a number \(N\) so that \(s_n\le N\) for every \(n\). The upper bound is provided courtesy of integration, and is inherent in figure __11.3.1__.

The figure shows the graph of \( y=1/x^2\) together with some rectangles that lie completely below the curve and that all have base length one. Because the heights of the rectangles are determined by the height of the curve, the areas of the rectangles are \( 1/1^2\), \( 1/2^2\), \( 1/3^2\), and so on---in other words, exactly the terms of the series. The partial sum \( s_n\) is simply the sum of the areas of the first \(n\) rectangles. Because the rectangles all lie between the curve and the \(x\)-axis, any sum of rectangle areas is less than the corresponding area under the curve, and so of course any sum of rectangle areas is less than the area under the entire curve, that is, all the way to infinity. There is a bit of trouble at the left end, where there is an asymptote, but we can work around that easily. Here it is:

\[ s_n={1\over 1^2}+{1\over 2^2}+{1\over 3^2}+\cdots+{1\over n^2} < 1 + \int_1^n {1\over x^2}\,dx < 1+\int_1^\infty {1\over x^2}\,dx =1+1=2, \]

recalling that we computed this improper integral in section __9.7__. Since the sequence of partial sums \( s_n\) is increasing and bounded above by 2, we know that \(\lim_{n\to\infty}s_n=L < 2\), and so the series converges to some number less than 2. In fact, it is possible, though difficult, to show that \( L=\pi^2/6\approx 1.6\).
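As a quick numerical sanity check of this bound (a Python sketch, not part of the argument; the helper name `partial_sum` is ours):

```python
import math

def partial_sum(N):
    """Return s_N = 1/1^2 + 1/2^2 + ... + 1/N^2."""
    return sum(1.0 / n**2 for n in range(1, N + 1))

# Every partial sum stays below the bound 1 + integral_1^infty dx/x^2 = 2,
# and the sums creep up toward pi^2/6.
for N in (10, 100, 1000, 10000):
    s = partial_sum(N)
    assert s < 2
    print(N, s)

print("pi^2/6 =", math.pi**2 / 6)
```

Even ten thousand terms stay comfortably under the bound of 2, while visibly approaching \(\pi^2/6\).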

We already know that \(\sum 1/n\) diverges. What goes wrong if we try to apply this technique to it? Here's the calculation:

\[ s_n={1\over 1}+{1\over 2}+{1\over 3}+\cdots+{1\over n} < 1 + \int_1^n {1\over x}\,dx < 1+\int_1^\infty {1\over x}\,dx =1+\infty. \]

The problem is that the improper integral doesn't converge. Note well that this does *not* prove that \(\sum 1/n\) diverges, just that this particular calculation fails to prove that it converges. A slight modification, however, allows us to prove in a second way that \(\sum 1/n\) diverges.

Example 11.3.2

Consider a slightly altered version of figure __11.3.1__, shown in figure __11.3.2__.

**Solution**

The rectangles this time are above the curve, that is, each rectangle completely contains the corresponding area under the curve. This means that

\[ s_n = {1\over 1}+{1\over 2}+{1\over 3}+\cdots+{1\over n} > \int_1^{n+1} {1\over x}\,dx = \left.\ln x\right|_1^{n+1}=\ln(n+1).\]

As \(n\) gets bigger, \(\ln(n+1)\) goes to infinity, so the sequence of partial sums \( s_n\) must also go to infinity: the harmonic series diverges.
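The lower bound \(s_n>\ln(n+1)\) is easy to watch numerically (a Python sketch; `harmonic` is our own helper name):

```python
import math

def harmonic(N):
    """Return the N-th harmonic partial sum 1/1 + 1/2 + ... + 1/N."""
    return sum(1.0 / n for n in range(1, N + 1))

# Each partial sum exceeds ln(N+1), which itself grows without bound.
for N in (10, 100, 1000, 10000):
    assert harmonic(N) > math.log(N + 1)
    print(N, harmonic(N), math.log(N + 1))
```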

The important fact that clinches this example is that \(\lim_{n\to\infty} \int_1^{n+1} {1\over x}\,dx = \infty,\) which we can rewrite as \(\int_1^\infty {1\over x}\,dx = \infty.\) So these two examples taken together indicate that we can prove that a series converges or prove that it diverges with a single calculation of an improper integral. This is known as the **integral test**, which we state as a theorem.

Theorem 11.3.3: The Integral Test

Suppose that \(f(x)>0\) and is decreasing on the infinite interval \([k,\infty)\) (for some \(k\ge1\)) and that \( a_n=f(n)\). Then the series

\[\sum_{n=1}^\infty a_n\]

converges if and only if the improper integral

\[\int_{k}^\infty f(x)\,dx\]

converges.

The two examples we have seen are called \(p\)-series; a \(p\)-series is any series of the form \( \sum 1/n^p\). If \(p\le0\), \(\lim_{n\to\infty} 1/n^p\not=0\), so the series diverges. For positive values of \(p\) we can determine precisely which series converge.

Theorem 11.3.4

A \(p\)-series with \(p>0\) converges if and only if \(p>1\).

**Proof**

We use the integral test; we have already done \(p=1\), so assume that \(p\not=1\).

\[\int_1^{\infty} {1\over x^p}\,dx=\lim_{D\to\infty} \left.{x^{1-p}\over 1-p}\right|_{1}^D=\lim_{D\to\infty} {D^{1-p}\over 1-p}-{1\over 1-p}. \]

If \(p>1\) then \(1-p < 0\) and \(\lim_{D\to\infty}D^{1-p}=0\), so the integral converges. If \(0 < p < 1\) then \(1-p>0\) and \(\lim_{D\to\infty}D^{1-p}=\infty\), so the integral diverges. By the integral test, then, the \(p\)-series converges exactly when \(p>1\).
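The two cases in the proof can be watched directly from the antiderivative (a Python sketch; `tail_integral` is a hypothetical helper evaluating \((D^{1-p}-1)/(1-p)\)):

```python
def tail_integral(p, D):
    """Evaluate integral_1^D x^(-p) dx = (D^(1-p) - 1)/(1 - p), valid for p != 1."""
    return (D**(1 - p) - 1) / (1 - p)

# p = 2 > 1: as D grows the integral settles down to 1/(p-1) = 1.
for D in (10, 1000, 10**6):
    print(D, tail_integral(2, D))

# p = 1/2 < 1: the integral grows without bound (like 2*sqrt(D)).
for D in (10, 1000, 10**6):
    print(D, tail_integral(0.5, D))
```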

Example 11.3.5

Show that \[\sum_{n=1}^\infty {1\over {n^3}}\] converges.

**Solution**

We could of course use the integral test, but now that we have the theorem we may simply note that this is a \(p\)-series with \(p>1\).

Example 11.3.6

Show that

\[\sum_{n=1}^\infty {5\over n^4}\]

converges.

**Solution**

We know that if

\[ \sum_{n=1}^\infty 1/n^4\]

converges then

\[ \sum_{n=1}^\infty 5/n^4\]

also converges, by theorem __11.2.2__. Since \( \sum_{n=1}^\infty 1/n^4\) is a convergent \(p\)-series, \( \sum_{n=1}^\infty 5/n^4\) converges as well.

Example 11.3.7

Show that

\[\sum_{n=1}^\infty {5\over \sqrt{n}}\]

diverges.

**Solution**

This also follows from theorem __11.2.2__: Since \(\sum_{n=1}^\infty {1\over \sqrt{n}}\) is a \(p\)-series with \(p=1/2 < 1\), it diverges, and so does \(\sum_{n=1}^\infty {5\over \sqrt{n}}\).

Since it is typically difficult to compute the value of a series exactly, a good approximation is frequently required. In a real sense, a good approximation is only as good as we know it is, that is, while an approximation may in fact be good, it is only valuable in practice if we can guarantee its accuracy to some degree. This guarantee is usually easy to come by for series with decreasing positive terms.

Example 11.3.8

Approximate \[ \sum 1/n^2\] to two decimal places.

**Solution**

Referring to figure __11.3.1__, if we approximate the sum by \( \sum_{n=1}^N 1/n^2\), the error we make is the total area of the remaining rectangles, all of which lie under the curve \( 1/x^2\) from \(x=N\) out to infinity. So we know the true value of the series is larger than the approximation, and no bigger than the approximation plus the area under the curve from \(N\) to infinity. Roughly, then, we need to find \(N\) so that

\[\int_N^\infty {1\over x^2}\,dx < 1/100.\]

We can compute the integral: \(\int_N^\infty {1\over x^2}\,dx = {1\over N},\) so \(N=100\) is a good starting point. Adding up the first 100 terms gives approximately \(1.634983900\), and that plus \(1/100\) is \(1.644983900\), so approximating the series by the value halfway between these will be at most \(1/200=0.005\) in error. The midpoint is \(1.639983900\), but while this is correct to \(\pm0.005\), we can't tell if the correct two-decimal approximation is \(1.63\) or \(1.64\).

We need to make \(N\) big enough to reduce the guaranteed error, perhaps to around \(0.004\) to be safe, so we would need \(1/N\approx 0.008\), or \(N=125\). Now the sum of the first 125 terms is approximately \(1.636965982\), and that plus \(0.008\) is \(1.644965982\) and the point halfway between them is \(1.640965982\). The true value is then \(1.640965982\pm 0.004\), and all numbers in this range round to \(1.64\), so \(1.64\) is correct to two decimal places. We have mentioned that the true value of this series can be shown to be \( \pi^2/6\approx1.644934068\) which rounds down to \(1.64\) (just barely) and is indeed below the upper bound of \(1.644965982\), again just barely. Frequently approximations will be even better than the "guaranteed" accuracy, but not always, as this example demonstrates.
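The whole bracketing procedure fits in a few lines (a Python sketch; `estimate` is our own name for the midpoint-of-the-bracket estimate):

```python
import math

def estimate(N):
    """Bracket sum 1/n^2 between s_N and s_N + 1/N; return (midpoint, half-width)."""
    s = sum(1.0 / n**2 for n in range(1, N + 1))
    return s + 0.5 / N, 0.5 / N

mid, err = estimate(125)
print(mid, "+/-", err)                    # about 1.640965982 +/- 0.004
assert abs(mid - math.pi**2 / 6) <= err   # the true value lies in the bracket, just barely
```

Note how close the call is: the distance from the midpoint to \(\pi^2/6\) is about \(0.00397\), only just inside the guaranteed half-width of \(0.004\).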