- In our work on series, we've emphasized two things:
- Does it converge?
- If so, can we approximate it (with
given precision) using a partial sum?
We generally don't know the exact value of a series -- in fact, we're
usually using the series to approximate a value, because a series is
essentially an infinite polynomial, and its partial sums can be
computed using only the operations +, -, *, and /.
- For example, there is no magic way of evaluating the sine of
1.235. The only way we have of evaluating it is by
approximating sine by a Taylor polynomial, and knowing that the
answer we give is accurate to within a given tolerance.
Perhaps you think "We can ask Mathematica!" But all
Mathematica is doing is using some Taylor polynomial to give
the answer to within a given tolerance.
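To make this concrete, here is a minimal sketch (in Python; the code and
the tolerance `1e-10` are my own choices, not part of the notes) of
approximating $\sin(1.235)$ by partial sums of its Taylor series, using
the alternating-series error bound to decide when to stop:
```python
import math

def sin_taylor(x, tol=1e-10):
    """Sum the Maclaurin series of sin(x) until the next (omitted)
    term is smaller than tol.  For an alternating series with
    decreasing terms, that omitted term bounds the error."""
    total = 0.0
    n = 0
    while True:
        total += (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
        next_term = x ** (2 * n + 3) / math.factorial(2 * n + 3)
        if next_term < tol:
            return total
        n += 1

print(sin_taylor(1.235))   # series approximation
print(math.sin(1.235))     # library value, for comparison
```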
- There are several important functions whose series (Taylor series
expansions about 0) you should know:
- $e^x$
- $\sin(x)$
- $\cos(x)$
- $\frac{1}{1-x}$
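For reference, written out about 0 (we'll use them below):
\[
e^x=\sum_{n=0}^{\infty}\frac{x^n}{n!},\qquad
\sin(x)=\sum_{n=0}^{\infty}\frac{(-1)^n x^{2n+1}}{(2n+1)!},\qquad
\cos(x)=\sum_{n=0}^{\infty}\frac{(-1)^n x^{2n}}{(2n)!},
\]
each valid for all $x$, and
\[
\frac{1}{1-x}=\sum_{n=0}^{\infty}x^n, \quad\text{valid for } |x|<1.
\]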
- We can obtain many others by integrating or differentiating
these. For example, the series for $\cos(x)$ is easily obtained
as the derivative of the series for $\sin(x)$:
\[
\sin(x)=\sum_{n=0}^{\infty}\frac{(-1)^n x^{2n+1}}{(2n+1)!}
\]
Differentiating term by term, we obtain
\[
\cos(x)=\sin'(x) = \sum_{n=0}^{\infty}\frac{(-1)^n x^{2n}}{(2n)!}
\]
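As a quick sanity check (a SymPy sketch of my own, not part of the
notes), we can differentiate a Taylor polynomial of $\sin(x)$ term by
term and see that it agrees with the corresponding Taylor polynomial of
$\cos(x)$:
```python
import sympy as sp

x = sp.symbols('x')

# degree-9 Taylor polynomial of sin about 0 (drop the O(x**10) term)
sin_poly = sp.series(sp.sin(x), x, 0, 10).removeO()

# differentiate the polynomial term by term
dsin_poly = sp.diff(sin_poly, x)

# degree-8 Taylor polynomial of cos about 0
cos_poly = sp.series(sp.cos(x), x, 0, 10).removeO()

print(sp.expand(dsin_poly - cos_poly))  # prints 0: they agree
```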
- We obtain others by composition: for example, the series for
$e^{-x^2}$ is given easily from the series for $e^x$:
\[
e^x=\sum_{n=0}^{\infty}\frac{x^n}{n!}
\]
so
\[
e^{-x^2}=
\sum_{n=0}^{\infty}\frac{(-x^2)^n}{n!}=
\sum_{n=0}^{\infty}\frac{(-1)^n x^{2n}}{n!}
\]
is the Taylor series for $e^{-x^2}$ about 0.
Recall that $e^{-x^2}$ -- the bell-shaped curve -- has no
elementary antiderivative, but we can use the Taylor series to
compute an area under the curve -- an integral -- to within a
given tolerance by integrating this series term-by-term. The
result is an alternating series of terms which eventually
decrease and head to zero in size (and so the series converges
by the AST).
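Here is a sketch of that idea (Python; the interval $[0,1]$ and the
tolerance are my own choices for illustration): integrating the series
term by term over $[0,1]$ gives the alternating series
$\sum_{n=0}^{\infty}\frac{(-1)^n}{(2n+1)\,n!}$, and we stop once the
next term drops below the tolerance.
```python
import math

def integral_exp_neg_x2(tol=1e-10):
    """Approximate the integral of exp(-x^2) over [0, 1] by integrating
    its Maclaurin series term by term.  By the AST, the first omitted
    term bounds the error once the terms are decreasing."""
    total = 0.0
    n = 0
    while True:
        total += (-1) ** n / ((2 * n + 1) * math.factorial(n))
        next_term = 1.0 / ((2 * n + 3) * math.factorial(n + 1))
        if next_term < tol:
            return total
        n += 1

print(integral_exp_neg_x2())   # about 0.7468241328
```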
- We've seen that series have the strange property that they
may converge for only a very narrow set of values -- which may
seem rather mysterious. For example
\[
\frac{1}{1-x} = \sum_{n=0}^\infty x^n
\]
is the Taylor series about 0, but it only converges in a
"radius" of 1 unit around 0. The problem is that $x=1$ makes
the function "explode" -- and that cripples the series even on
the other side -- at $x=-1$ -- where there doesn't appear to be
a problem. Very strange!
We can find the radius of convergence using the ratio test,
after which we check the endpoints (if finite) to create the
interval of convergence.
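For instance, applying the ratio test to the geometric series above
(just to illustrate the procedure):
\[
\lim_{n\to\infty}\left|\frac{x^{n+1}}{x^{n}}\right| = |x| < 1
\quad\Longrightarrow\quad R = 1,
\]
and at the endpoints $x=\pm 1$ the terms $(\pm 1)^n$ do not tend to $0$,
so both endpoints are excluded and the interval of convergence is
$(-1,1)$.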
- So we're going to be applying those tests from the last
exam, e.g. AST or comparisons.
- Remember the Taylor remainder theorem, which states that
if $T_n(x)$ is the $n^{th}$ Taylor polynomial of $f$ about center $x=a$,
then the error in using $T_n(x)$ to approximate $f(x)$ is
bounded by
\[
|R_n(x)| \le \frac{M}{(n+1)!}|x-a|^{n+1}
\]
where $M$ is a bound on the absolute value of the $(n+1)^{st}$
derivative $f^{(n+1)}$ over the symmetric interval
$[a-|x-a|,\,a+|x-a|]$ about $a$, the center of the Taylor series.
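A short sketch of the bound in action (Python, with choices of my own:
$f(x)=e^x$, center $a=0$, the point $x=1$, and the crude bound
$M=e$ on $f^{(n+1)}(t)=e^t$ for $|t|\le 1$):
```python
import math

def taylor_exp(x, n):
    """n-th Taylor polynomial of e^x about 0, evaluated at x."""
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x, a = 1.0, 0.0
M = math.e   # every derivative of e^x is e^t, and e^t <= e for |t| <= 1

for n in range(2, 11, 2):
    actual_error = abs(math.exp(x) - taylor_exp(x, n))
    bound = M / math.factorial(n + 1) * abs(x - a) ** (n + 1)
    print(n, actual_error, bound)   # the actual error stays below the bound
```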