Alternating Series
A series is called alternating if the sign of each term $a_n$ switches from positive to negative, or vice versa, as we move to the next term $a_{n+1}$.
A practical example
Here is a classic example of an alternating series
$$ \sum_{n=1}^{\infty} \frac{(-1)^{n-1}}{n} $$
This is the alternating harmonic series.
If we write out the first few terms, the alternating pattern becomes immediately clear
$$ 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + ... + (-1)^{n-1} \frac{1}{n} $$
This series converges.
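As a quick numerical illustration, we can watch the partial sums settle down as we add more terms. This is a sketch in Python; the limit it approaches, $\ln 2$, is a known result that is established elsewhere, not something the test itself provides.

```python
import math

def partial_sum(n):
    """Partial sum of the alternating harmonic series up to the n-th term."""
    return sum((-1) ** (k - 1) / k for k in range(1, n + 1))

# The partial sums settle toward a limit (the known value is ln 2 ≈ 0.6931).
for n in (10, 100, 1000, 10000):
    print(n, partial_sum(n))
```

Each extra digit of accuracy requires roughly ten times as many terms, which already hints that this series converges rather slowly.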

That said, not every alternating series converges. Some may diverge, so we need a reliable way to check.
Convergence test for alternating series
To determine whether an alternating series converges, we use a specific test designed for this type of structure.
An alternating series $\sum_{n=1}^{\infty} (-1)^{n-1} a_n$, with $a_n \ge 0$, converges if the sequence $a_n$ is decreasing and tends to zero.
This result is known as the Leibniz test.
Example
Let us apply the test to the alternating harmonic series
$$ \sum_{n=1}^{\infty} \frac{(-1)^{n-1}}{n} $$
The sequence $a_n$ consists of the absolute values of the terms of the series
$$ a_1 = 1, \quad a_2 = \frac{1}{2}, \quad a_3 = \frac{1}{3}, \quad a_4 = \frac{1}{4}, \quad \dots $$
This sequence is clearly decreasing
$$ a_n = \frac{1}{n} \ge \frac{1}{n+1} = a_{n+1} $$
So the first condition is satisfied.
Note. If the sequence were not decreasing, the test would not apply, and we could draw no conclusion from it.
Now check the second condition
$$ \lim_{n \rightarrow \infty} a_n = \lim_{n \rightarrow \infty} \frac{1}{n} = 0 $$
The sequence tends to zero, so the second condition is also satisfied.
We can conclude that the alternating harmonic series converges.
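The two conditions can also be checked numerically. The sketch below tests them for the first thousand terms of $a_n = 1/n$; of course a finite check is an illustration, not a proof.

```python
def a(n):
    """Absolute value of the n-th term of the alternating harmonic series."""
    return 1 / n

N = 1000

# Condition 1: the sequence is decreasing (verified for the first N terms).
is_decreasing = all(a(n) >= a(n + 1) for n in range(1, N))

# Condition 2: the terms shrink toward zero (a finite check only).
is_small = a(N) < 1e-2

print(is_decreasing, is_small)
```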
Note. This test guarantees convergence, but it does not tell us the value of the sum; it only tells us that the partial sums settle on some number. For this particular series, the limit turns out to be $\ln 2$.

Why the test works
To understand why this result holds, consider how the partial sums behave.
In the alternating harmonic series, the even-indexed partial sums increase
$$ s_2 \le s_4 \le s_6 \le ... \le s_{2k} $$
while the odd-indexed partial sums decrease
$$ s_1 \ge s_3 \ge s_5 \ge ... \ge s_{2k+1} $$
This creates two sequences: one increasing, one decreasing.

Both sequences are monotone and bounded, since every even-indexed partial sum lies below every odd-indexed one, so by the monotone sequence theorem, they both converge.
The key point is that the gap between them shrinks to zero
$$ s_{2k+1} - s_{2k} = a_{2k+1} $$
and since
$$ \lim_{k \rightarrow \infty} a_{2k+1} = 0 $$
the two limits must coincide.
Therefore, the entire sequence of partial sums converges, and so does the series.
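The squeezing of the two sequences of partial sums can be seen directly in a short Python sketch: the even-indexed sums climb, the odd-indexed sums descend, and the gap between consecutive ones is exactly $a_{2k+1}$.

```python
def partial_sum(n):
    """Partial sum of the alternating harmonic series up to the n-th term."""
    return sum((-1) ** (k - 1) / k for k in range(1, n + 1))

evens = [partial_sum(2 * k) for k in range(1, 6)]       # s_2, s_4, ..., s_10
odds = [partial_sum(2 * k + 1) for k in range(0, 5)]    # s_1, s_3, ..., s_9

# Even-indexed partial sums increase; odd-indexed ones decrease.
assert all(x <= y for x, y in zip(evens, evens[1:]))
assert all(x >= y for x, y in zip(odds, odds[1:]))

# The gap s_{2k+1} - s_{2k} equals a_{2k+1} = 1/(2k+1), shrinking to zero.
for k in (1, 10, 100):
    print(k, partial_sum(2 * k + 1) - partial_sum(2 * k))
```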
Corollary: estimating the error
If an alternating series converges, then the absolute error between the sum $s$ and the $n$-th partial sum $s_n$ is at most the next term $a_{n+1}$: $$ | s_n - s | \le a_{n+1} $$
This is extremely useful in practice.
It tells us that when we approximate the series by stopping after n terms, the error we make is no larger than the first term we leave out.
Example
Consider again the alternating harmonic series
$$ 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \frac{1}{6} + ... $$
Take the partial sum with n=2
$$ s_2 = 1 - \frac{1}{2} = 0.5 $$
The next term is
$$ a_3 = \frac{1}{3} $$
So the error satisfies
$$ | s_2 - s | \le \frac{1}{3} $$
This means that the true sum lies within a distance of at most $1/3 \approx 0.333$ from $0.5$.
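Since the value of this particular sum is known to be $\ln 2$ (a standard result, not something the error bound provides), we can check in Python that the actual error is indeed inside the guaranteed bound.

```python
import math

s2 = 1 - 1 / 2       # the partial sum s_2 = 0.5
s = math.log(2)      # true sum of the series (a known result, ≈ 0.6931)
a3 = 1 / 3           # the first omitted term

# The actual error is well within the guaranteed bound of 1/3.
assert abs(s2 - s) <= a3
print(abs(s2 - s))   # about 0.1931
```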

How many terms do we need?
Suppose we want an approximation with an error smaller than 1 percent.
We use the inequality
$$ | s_n - s | \le a_{n+1} $$
and require
$$ \frac{1}{n+1} \le \frac{1}{100} $$
Solving this gives
$$ n + 1 \ge 100 \quad\Longrightarrow\quad n \ge 99 $$
So we need to compute the partial sum $s_{99}$ to guarantee this level of accuracy.
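We can verify this accuracy claim directly, again comparing against the known value $\ln 2$ of this particular sum:

```python
import math

def partial_sum(n):
    """Partial sum of the alternating harmonic series up to the n-th term."""
    return sum((-1) ** (k - 1) / k for k in range(1, n + 1))

s99 = partial_sum(99)
s = math.log(2)  # true sum of the series (a known result)

# Leibniz bound: the error of s_99 is at most a_100 = 1/100.
assert abs(s99 - s) <= 1 / 100
print(abs(s99 - s))
```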
Why the error bound holds
The even partial sums lie below the true sum, while the odd partial sums lie above it
$$ s_{2k} \le s \le s_{2k+1} $$
Subtracting $s_{2k}$ from each part
$$ 0 \le s - s_{2k} \le s_{2k+1} - s_{2k} $$
But
$$ s_{2k+1} - s_{2k} = a_{2k+1} $$
So the error is bounded by the first omitted term.
This simple idea is what makes alternating series especially useful in numerical approximations.
