Today: alternating series (and the Alternating Series Test), the ratio and root tests, and an introduction to power series.
That being said, a theorem below does give us some help in determining the value of (or at least bounds on the value of) a series of a type called an "alternating" series -- one whose terms alternate in sign. In particular, we want to look at alternating series like the alternating harmonic series,

$$1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots = \sum_{n=1}^{\infty} \frac{(-1)^{n-1}}{n}.$$
So it turns out that if the sizes of the terms are decreasing to 0, then the alternating series converges to a limit.
Now we usually write our alternating series in the following way, illustrated by Leibniz's test: we write the series as $\sum_{n=1}^{\infty} (-1)^{n-1} a_n$, where we assume that the $a_n$ are positive, and the factor $(-1)^{n-1}$ takes care of the "alternation". (Our textbook switches to $b_n$ for the positive part of the term.)
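Here is a standard statement of the result, in the notation above:

$$\text{If } a_n > 0, \quad a_{n+1} \le a_n \ \text{ for all (sufficiently large) } n, \quad \text{and} \quad \lim_{n\to\infty} a_n = 0,$$
$$\text{then the alternating series } \sum_{n=1}^{\infty} (-1)^{n-1} a_n \ \text{ converges, say to } s.$$

Furthermore, the error committed by stopping at the $n$-th partial sum $s_n$ is no larger than the first neglected term:

$$|s - s_n| \le a_{n+1}.$$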
Our textbook calls this the "Alternating Series Test" (p. 751). The "Furthermore" part our textbook calls the "Alternating Series Estimation Theorem" (p. 754).
We need to show that the terms of the series satisfy the Leibniz test:
Identify the terms $a_n$, and decide whether they satisfy the Leibniz test.
Remember that for convergence, the conditions of the test need be true only eventually: convergence is all about the tail, not the head of the sequence.
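As a quick illustration, consider the alternating harmonic series from above, with $a_n = \frac{1}{n}$:

$$a_n = \frac{1}{n} > 0, \qquad a_{n+1} = \frac{1}{n+1} < \frac{1}{n} = a_n, \qquad \lim_{n\to\infty} \frac{1}{n} = 0,$$

so all three conditions of the Leibniz test hold, and $\sum_{n=1}^{\infty} \frac{(-1)^{n-1}}{n}$ converges -- even though the harmonic series $\sum \frac{1}{n}$ itself diverges.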
For this one, we need the "Alternating Series Estimation Theorem" (p. 754):
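This is exactly the "Furthermore" part above: the error in stopping at $s_n$ is at most $a_{n+1}$. For the alternating harmonic series, for example,

$$|s - s_{100}| \le a_{101} = \frac{1}{101} < 0.01,$$

so the first 100 terms determine the sum to within $0.01$. (The sum turns out to be $\ln 2 \approx 0.6931$.)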
Next, the ratio test: suppose that $\left|\frac{a_{n+1}}{a_n}\right| \to \rho$ as $n \to \infty$. If $\rho < 1$, the series $\sum a_n$ converges (absolutely); if $\rho > 1$, it diverges; and if $\rho = 1$, the test is inconclusive. This result says that eventually the ratio of successive terms is effectively constant, $\left|\frac{a_{n+1}}{a_n}\right| \approx \rho$: the terms of the sequence approach a "common ratio" as $n \to \infty$. What kind of series looks like that? A geometric series:

$$c + cr + cr^2 + cr^3 + \cdots = \sum_{n=0}^{\infty} c\,r^n, \qquad \text{which converges exactly when } |r| < 1.$$
Examples:
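One typical computation (the series here is chosen just to illustrate the mechanics): for $\sum_{n=1}^{\infty} \frac{n}{2^n}$,

$$\left|\frac{a_{n+1}}{a_n}\right| = \frac{(n+1)/2^{n+1}}{n/2^n} = \frac{n+1}{2n} \longrightarrow \frac{1}{2} < 1,$$

so the series converges by the ratio test.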
The ratio test is effectively a self-referential comparison test: we compare terms of $\sum a_n$ with other terms of the same series (rather than with the terms of some other series).
Here's a test that is also self-referential, but which only looks at a single term -- the root test: suppose that $\sqrt[n]{|a_n|} \to L$ as $n \to \infty$. If $L < 1$, the series $\sum a_n$ converges (absolutely); if $L > 1$, it diverges; and if $L = 1$, the test is inconclusive.
This result says that eventually the absolute values of the terms are effectively equal to $L^n$: what kind of series looks like that? A geometric series!
No wonder the results of the tests look exactly the same... Too bad we didn't just use the same letter for the limits in both cases ($\rho$ versus $L$). Stewart is better -- he does!
Notice that, once again, limits of sequences play an important role! Series are just sums of sequences, after all; we're focused on how individual terms behave (root test), how successive terms behave (ratio test), or how partial sums behave.
Examples:
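Again, one illustrative computation (the series is chosen just to show the mechanics): for $\sum_{n=1}^{\infty} \left(\frac{2n+3}{3n+2}\right)^n$,

$$\sqrt[n]{|a_n|} = \frac{2n+3}{3n+2} \longrightarrow \frac{2}{3} < 1,$$

so the series converges by the root test.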
Now we turn to series whose terms are functions of $x$, rather than numbers: power series. The terms will be monomials -- constants times powers of $x$.
Now we realize that, with functions for terms, each specified value of $x$ gives rise to an infinite series, and we might immediately wonder whether that series converges.
So what kind of function has an infinite number of terms? How do you evaluate such a thing? What kinds of functions are these? It turns out that a lot of our old friends can be expressed this way.
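In general, a power series (centered at $0$) has the form

$$\sum_{n=0}^{\infty} c_n x^n = c_0 + c_1 x + c_2 x^2 + c_3 x^3 + \cdots,$$

where the $c_n$ are constants; more generally, a power series centered at $a$ has terms $c_n (x-a)^n$.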
Now what did we assert about $e^x$? That

$$e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots$$
For what values of x would this converge?
Let's take a look at how well a truncated power series -- a partial sum -- approximates the real function.
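Here's a minimal numerical sketch of that idea, using the exponential series above: we compare partial sums of $\sum x^k/k!$ with the true value of $e^x$ (the value $x = 2$ and the numbers of terms are chosen just for illustration).

```python
import math

def exp_partial_sum(x, n_terms):
    """Partial sum of the power series for e^x: the sum of x^k / k! for k = 0 .. n_terms - 1."""
    total = 0.0
    term = 1.0  # the k = 0 term, x^0 / 0!
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)  # build the next term x^(k+1)/(k+1)! from the current one
    return total

x = 2.0
for n in (1, 2, 4, 8, 16):
    approx = exp_partial_sum(x, n)
    print(f"{n:2d} terms: {approx:.10f}   error = {abs(math.exp(x) - approx):.2e}")
```

The errors shrink rapidly as more terms are included -- consistent with what the ratio test is about to tell us.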
Apply the ratio test for arbitrary x, and what do you discover?
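Here's the computation: with $a_n = \frac{x^n}{n!}$,

$$\left|\frac{a_{n+1}}{a_n}\right| = \left|\frac{x^{n+1}/(n+1)!}{x^n/n!}\right| = \frac{|x|}{n+1} \longrightarrow 0 < 1 \quad \text{for every fixed } x,$$

so the series converges (absolutely) no matter what $x$ is.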
So $e^x$ presents us with a case where we have an "infinite radius of convergence".
Consider Euler's formula, $e^{ix} = \cos(x) + i\,\sin(x)$, where $i = \sqrt{-1}$. From this (and the power series for $e^x$) we can now deduce power series for the functions cosine and sine. Let's do that.
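Sketching that deduction (assuming the Euler-formula route above): substitute $ix$ into the exponential series and collect real and imaginary parts,

$$e^{ix} = \sum_{n=0}^{\infty} \frac{(ix)^n}{n!} = \underbrace{\sum_{k=0}^{\infty} \frac{(-1)^k x^{2k}}{(2k)!}}_{\text{real part}} \; + \; i\,\underbrace{\sum_{k=0}^{\infty} \frac{(-1)^k x^{2k+1}}{(2k+1)!}}_{\text{imaginary part}},$$

so matching with $e^{ix} = \cos(x) + i\,\sin(x)$ gives

$$\cos(x) = \sum_{k=0}^{\infty} \frac{(-1)^k x^{2k}}{(2k)!}, \qquad \sin(x) = \sum_{k=0}^{\infty} \frac{(-1)^k x^{2k+1}}{(2k+1)!}.$$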
Let's look at an example where our radius of convergence is not so nice: consider the following function, together with the power series that one can show represents it.
Now: for what values of x do you think that this will converge?
Note that this is a variation on the theme of the ratio test: we don't consider the power terms, but only the coefficients.
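To make that concrete, here's how the computation goes for one standard series, used here purely as an illustration: $\ln(1+x) = \sum_{n=1}^{\infty} \frac{(-1)^{n-1} x^n}{n}$. Applying the ratio test to the terms,

$$\left|\frac{a_{n+1}}{a_n}\right| = \left|\frac{x^{n+1}/(n+1)}{x^n/n}\right| = \frac{n}{n+1}\,|x| \longrightarrow |x|,$$

so the series converges when $|x| < 1$ and diverges when $|x| > 1$: the radius of convergence is $1$. Notice that the limit of the coefficient ratios, $\frac{n}{n+1} \to 1$, is what determines that radius -- the powers of $x$ just come along for the ride.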