You'll know by the end of the day. I plan to grade those in the afternoon.
We love polynomials because they're so simple; we use polynomials to approximate functions for the same reason that we use rectangles to approximate general areas -- because they're simple!
But the most important thing about polynomials is that they require only four operations for their evaluation: addition, subtraction, multiplication, and division -- and those are the four specialties of computers. They can do all of those, but they can't do sines or exponentials or arctans directly. So we approximate those very interesting and important functions with polynomials, and let the computers evaluate those instead.
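To make that concrete, here's a minimal Python sketch (my illustration, not from the lecture) that evaluates the degree-7 Taylor polynomial of $\sin x$ about $0$ using Horner's rule: once the coefficients $1/k!$ have been computed, the evaluation loop uses nothing but addition and multiplication.

```python
import math

# Coefficients of the degree-7 Taylor polynomial of sin(x) about 0:
#   sin(x) ~ x - x^3/3! + x^5/5! - x^7/7!
# (the divisions happen once, here; evaluation below needs only + and *)
coeffs = [0.0, 1.0, 0.0, -1.0 / math.factorial(3),
          0.0, 1.0 / math.factorial(5), 0.0, -1.0 / math.factorial(7)]

def horner(coeffs, x):
    """Evaluate sum(coeffs[k] * x**k) using only addition and multiplication."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result

x = 0.5
print(horner(coeffs, x))   # 0.4794255...
print(math.sin(x))         # 0.4794255..., for comparison
```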
Let's suppose that we're trying to approximate $g(x)$ with an $n^{th}$-degree polynomial at $x=a$:
\[ T_n(x) \equiv \sum_{k=0}^{n}\frac{g^{(k)}(a)}{k!}(x-a)^k \]
defined by matching up the function value and first $n$ derivatives of $g$ at $x=a$ with those of $T_n$.
\[ g(x) \approx T_n(x) = \sum_{k=0}^{n}\frac{g^{(k)}(a)}{k!}(x-a)^k \]
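For example, take $g(x)=e^x$ and $a=0$: every derivative of $e^x$ is $e^x$, so $g^{(k)}(0)=1$ for every $k$, and the cubic Taylor polynomial is
\[ T_3(x) = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} = 1 + x + \frac{x^2}{2} + \frac{x^3}{6}. \]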
But generally they're not the same; and so we expect that we're making errors, and we can define the Taylor remainder as
\[ R_n(x)=g(x)-T_n(x) \]
If $T_n(x)$ fit $g(x)$ perfectly on its domain, $R_n$ would be identically zero. But we don't expect that; we expect the error to grow as we move farther away from $x=a$.
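To see that growth, here's a small Python sketch (my own illustration, with $g = e^x$ and $a = 0$ chosen for convenience, continuing the example above) that tabulates the remainder $R_3(x) = e^x - T_3(x)$ at points increasingly far from $a$:

```python
import math

def T(n, x, a=0.0):
    """Degree-n Taylor polynomial of exp about x = a.
    Every derivative of exp is exp, so g^(k)(a) = e^a for all k."""
    return sum(math.exp(a) * (x - a)**k / math.factorial(k) for k in range(n + 1))

a, n = 0.0, 3
for x in [0.1, 0.5, 1.0, 2.0]:
    R = math.exp(x) - T(n, x, a)   # remainder R_n(x) = g(x) - T_n(x)
    print(f"x = {x:4.1f}   R_{n}(x) = {R: .6f}")
```

Running this prints a remainder of roughly $4\times 10^{-6}$ at $x=0.1$ but about $1.06$ at $x=2$: the error grows quickly with distance from $a$.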