Let $a : \mathbb{N} \to \mathbb{R}$ be a positive decreasing function such that $\lim_{n \to \infty} a(n) = 0$. Then the alternating series

$$\sum_{n=0}^{\infty} (-1)^n a(n)$$
converges. Depending on the rate at which $a$ decays it may however take many terms to get a decent approximation of the sum. Examples of such slowly converging series are

$$\sum_{n=0}^{\infty} \frac{(-1)^n}{n+1} = \ln 2 \qquad \text{and} \qquad \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1} = \frac{\pi}{4}.$$
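The slow convergence is easy to observe numerically. As an illustration (a plain Python sketch; the function names are mine), the alternating harmonic series $\sum_{n \geq 0} (-1)^n/(n+1) = \ln 2$ still has an error of roughly $1/(2N)$ after $N$ terms:

```python
import math

def alt_partial_sum(a, n_terms):
    """Partial sum of the alternating series sum_{n>=0} (-1)^n a(n)."""
    return sum((-1) ** n * a(n) for n in range(n_terms))

# The alternating harmonic series converges to ln 2, but slowly:
# the error after N terms is roughly 1/(2N).
a = lambda n: 1.0 / (n + 1)
for n_terms in (10, 100, 1000):
    print(n_terms, abs(alt_partial_sum(a, n_terms) - math.log(2)))
```

Even 1000 terms give only about three correct digits.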
A well known method to accelerate the convergence of such alternating series uses the binomial transform of the function $a$ as follows. For $k \geq 0$ let $\Delta^k a$ be the $k$-th difference of $a$:

$$(\Delta^k a)(n) = \sum_{j=0}^{k} (-1)^j \binom{k}{j}\, a(n+j).$$
Then it turns out that

$$\sum_{n=0}^{\infty} (-1)^n a(n) = \sum_{k=0}^{\infty} \frac{(\Delta^k a)(0)}{2^{k+1}}$$
and the right hand side may converge much faster. The binomial transform for the two examples above can be computed explicitly to obtain

$$\ln 2 = \sum_{k=0}^{\infty} \frac{1}{(k+1)\,2^{k+1}} \qquad \text{and} \qquad \frac{\pi}{4} = \sum_{k=0}^{\infty} \frac{2^{k-1}}{(2k+1)\binom{2k}{k}}.$$
Both transformed series indeed converge much faster than the original ones. You may recognize the first as the series for $-\ln(1-x)$ taken at $x = \tfrac{1}{2}$. The second one is harder to spot but it turns out to be the series of

$$\frac{x \arcsin x}{\sqrt{1-x^2}}$$

taken at $x = \tfrac{1}{\sqrt{2}}$.
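The transform is easy to check numerically. A sketch in Python (helper names are mine), computing $(\Delta^k a)(0)$ directly from its defining sum and accumulating the transformed series $\sum_k (\Delta^k a)(0)/2^{k+1}$:

```python
import math

def delta_k_at_zero(a, k):
    # (Delta^k a)(0) = sum_{j=0}^{k} (-1)^j C(k, j) a(j)
    return sum((-1) ** j * math.comb(k, j) * a(j) for j in range(k + 1))

def accelerated_sum(a, n_terms):
    # sum_{k=0}^{n_terms-1} (Delta^k a)(0) / 2^(k+1)
    return sum(delta_k_at_zero(a, k) / 2.0 ** (k + 1) for k in range(n_terms))

# 20 transformed terms already give ln 2 to about 8 digits,
# compared with ~1000 original terms for 3 digits.
print(accelerated_sum(lambda n: 1.0 / (n + 1), 20))
print(accelerated_sum(lambda n: 1.0 / (2 * n + 1), 30))  # close to pi/4
```

Note that this direct approach needs all of $a(0), \dots, a(k)$ for every transformed term, which is exactly the inconvenience discussed next.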

The binomial transform is a great tool if you can find an explicit expression for it such as in both examples above. This is however not a trivial transform in general. If you do not know an explicit expression for the binomial transform then it is also not very convenient to compute: $(\Delta^k a)(0)$ depends on the values $a(0), a(1), \dots, a(k)$ for every $k$.

There is however another much simpler method that only involves the repeated difference $(\Delta^k a)(n)$ for a single *fixed* value of $k$. This difference is now easy to compute (numerically) since $(\Delta^k a)(n)$ depends only on the $k+1$ consecutive values $a(n), \dots, a(n+k)$. It is convenient for the sake of notation to extend $a$ to $\mathbb{Z}$ by setting $a(n) = 0$ for all $n < 0$. Then $(\Delta^k a)(n)$ is defined for all $n \in \mathbb{Z}$ and vanishes for $n < -k$.
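Computing $(\Delta^k a)(n)$ by $k$ rounds of pairwise differences over the window $a(n), \dots, a(n+k)$ can be sketched as follows (function names are mine):

```python
def kth_difference(a, k, n):
    """(Delta^k a)(n), computed by k rounds of pairwise differences
    of the k + 1 consecutive values a(n), ..., a(n + k)."""
    vals = [a(n + j) for j in range(k + 1)]
    for _ in range(k):
        vals = [vals[i] - vals[i + 1] for i in range(len(vals) - 1)]
    return vals[0]

def extend(a):
    """Extend a to the integers by a(n) = 0 for n < 0."""
    return lambda n: a(n) if n >= 0 else 0.0

# For a(n) = 1/(n+1) one has (Delta^k a)(n) = k! / ((n+1)...(n+k+1)):
a = extend(lambda n: 1.0 / (n + 1))
print(kth_difference(a, 2, 0))   # approximately 1/3
print(kth_difference(a, 2, -5))  # 0.0, since -5 < -k
```

In contrast to the full binomial transform, a single such difference never looks at more than $k+1$ consecutive values of $a$.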

Now the following equalities hold for every fixed $k \geq 1$, in which the second and third sums are taken over $n \in \mathbb{Z}$:

$$\sum_{n=0}^{\infty} (-1)^n a(n) \;=\; \frac{1}{2^{k-1}} \sum_{n \in \mathbb{Z}} (\Delta^k a)(2n) \;=\; -\frac{1}{2^{k-1}} \sum_{n \in \mathbb{Z}} (\Delta^k a)(2n+1).$$
If all terms $(\Delta^k a)(n)$ are positive for $n \geq 0$ then the partial sums of the second series increase and those of the third decrease from $n = 0$ onward. This means that these partial sums are lower and upper bounds for the sum of the series, which is useful to estimate the remaining error of these partial sums.
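Assuming the pairing identity $\sum_{n \geq 0} (-1)^n a(n) = 2^{1-k} \sum_{n \in \mathbb{Z}} (\Delta^k a)(2n) = -2^{1-k} \sum_{n \in \mathbb{Z}} (\Delta^k a)(2n+1)$, with $a$ extended by zero for negative arguments, the lower and upper bound partial sums can be sketched as follows (names are mine):

```python
import math

def kth_diff(a, k, n):
    # (Delta^k a)(n) from the k + 1 values a(n), ..., a(n + k)
    vals = [a(n + j) for j in range(k + 1)]
    for _ in range(k):
        vals = [vals[i] - vals[i + 1] for i in range(len(vals) - 1)]
    return vals[0]

def bounds(a, k, terms):
    """Lower/upper bounds for sum_{n>=0} (-1)^n a(n), for fixed k >= 1,
    assuming (Delta^k a)(n) > 0 for n >= 0."""
    ext = lambda n: a(n) if n >= 0 else 0.0
    scale = 2.0 ** (1 - k)
    # second series: even arguments 2n; terms vanish for 2n < -k
    lower = scale * sum(kth_diff(ext, k, 2 * n)
                        for n in range(-(k // 2), terms))
    # third series: odd arguments 2n + 1, with an overall minus sign
    upper = -scale * sum(kth_diff(ext, k, 2 * n + 1)
                         for n in range(-((k + 1) // 2), terms))
    return lower, upper

lo, hi = bounds(lambda n: 1.0 / (n + 1), 4, 10)
print(lo, math.log(2), hi)  # lo < ln 2 < hi
```

With $k = 4$ and only ten positive terms on each side, the two bounds already pin down $\ln 2$ to a few parts in a million, and the gap between them bounds the remaining error.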

As an example take a fixed $k$. The series for $\ln 2$ and $\frac{\pi}{4}$ now become