Today:
I want to cover root-finding and interpolation, though I suspect we won't have enough time to get through all the interpolation issues.
Even though its order of convergence is lower, the routine may run in about the same time as Newton's method (it depends on the problem, of course).
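To make that concrete, here's a minimal sketch, assuming the lower-order routine in question is the secant method (the usual comparison with Newton's; the names `newton` and `secant` are mine): each Newton step evaluates both $f$ and $f'$, while each secant step needs only one new value of $f$, so a lower order of convergence can still mean a comparable runtime.

```mathematica
(* Sketch only: bare-bones Newton and secant iterations, to compare the
   work per step. Each Newton step costs one f and one f' evaluation;
   each secant step needs just one new f value (this version
   re-evaluates f for clarity). *)
newton[f_, df_, x0_, tol_ : 10^-12, maxIter_ : 50] :=
 Module[{x = N[x0], k = 0},
  While[Abs[f[x]] > tol && k < maxIter, x = x - f[x]/df[x]; k++];
  x]

secant[f_, x0_, x1_, tol_ : 10^-12, maxIter_ : 50] :=
 Module[{a = N[x0], b = N[x1], c, k = 0},
  While[Abs[f[b]] > tol && k < maxIter,
   c = b - f[b] (b - a)/(f[b] - f[a]);
   a = b; b = c; k++];
  b]

(* Both find the root of cos(x) - x, near 0.739: *)
newton[Cos[#] - # &, -Sin[#] - 1 &, 1.]
secant[Cos[#] - # &, 0., 1.]
```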
The authors observe that, for a heavily used and computationally expensive function, it may be quite a bit faster to compute it up front at a boatload of points, and then use an interpolator to give a pretty good approximation for general use.
That's the idea that's being explored here, for $e^x$. Suppose we want to compute it on the interval $[0,1]$ with an error of at most $10^{-6}$: how many points would a linear interpolator require?
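One way to count (an estimate, assuming equally spaced nodes and the standard piecewise-linear error bound): on a subinterval of width $h$,
$$
|e^x - p_1(x)| \le \frac{h^2}{8}\max_{[0,1]}\left|(e^x)''\right| = \frac{h^2\,e}{8},
$$
so we need $h \le \sqrt{8\times 10^{-6}/e} \approx 1.7\times 10^{-3}$: about $583$ subintervals, hence roughly $584$ tabulated points.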
Here's another picture of how we might build one of these.
I can think of several ways to fit this data with the Hermite interpolant. How might you go about it?
You might not be surprised to learn that just about the easiest formulation is Newton's, only we're going to repeat some abscissae:
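Concretely, the standard two-point setup (a sketch) uses the node list $x_1, x_1, x_2, x_2$, with the convention that a divided difference over a repeated point is the derivative there, $f[x_i,x_i] = f'(x_i)$. Newton's form then reads
$$
H_3(x) = f[x_1] + f[x_1,x_1]\,(x-x_1) + f[x_1,x_1,x_2]\,(x-x_1)^2 + f[x_1,x_1,x_2,x_2]\,(x-x_1)^2(x-x_2).
$$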
Since we'll need this over and over, we should go ahead and simplify those divided difference coefficients, using the assumption that $f'$ is known (use the recurrence relation at the top of p. 215 to do so).
This will have the effect of fitting the two endpoints, and the derivatives of $f$ at the two endpoints.
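Worked out with the recurrence (a sketch; check it against p. 215), writing $f[x_1,x_2] = \frac{f(x_2)-f(x_1)}{x_2-x_1}$:
$$
f[x_1,x_1] = f'(x_1), \qquad
f[x_1,x_1,x_2] = \frac{f[x_1,x_2] - f'(x_1)}{x_2 - x_1}, \qquad
f[x_1,x_1,x_2,x_2] = \frac{f'(x_2) - 2f[x_1,x_2] + f'(x_1)}{(x_2 - x_1)^2}.
$$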
I'd like to discuss my implementation of this in Mathematica.
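I can't reproduce the notebook here, but a minimal sketch along the same lines might look like this (`hermiteCubic` is my illustrative name, not necessarily the in-class version):

```mathematica
(* Sketch: the cubic Hermite interpolant on [x1, x2] in Newton form,
   using the simplified divided differences above. *)
hermiteCubic[f_, df_, x1_, x2_] :=
 Module[{d, c2, c3},
  d = (f[x2] - f[x1])/(x2 - x1);              (* f[x1, x2]         *)
  c2 = (d - df[x1])/(x2 - x1);                (* f[x1, x1, x2]     *)
  c3 = (df[x2] - 2 d + df[x1])/(x2 - x1)^2;   (* f[x1, x1, x2, x2] *)
  Function[x,
   f[x1] + df[x1] (x - x1) + c2 (x - x1)^2 + c3 (x - x1)^2 (x - x2)]]

(* Usage: p matches Sin and Cos at both endpoints of [0, Pi/2]. *)
p = hermiteCubic[Sin, Cos, 0., N[Pi/2]];
p[1.]   (* approximates Sin[1.] *)
```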
What error are we making when we use Hermite interpolation? Proposition 5.6 gives us the answer (p. 219): the error $f(x) - H_3(x)$ is equal to a derivative times some stuff,
$$
f(x) - H_3(x) = \frac{f^{(4)}(\xi)}{4!}\,(x - x_1)^2\,(x - x_2)^2,
$$
where $\xi$ is some unknown number between the minimum and maximum of $\{x_1,x_2,x\}$.
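A handy consequence (standard, and a useful sanity check): for $x \in [x_1,x_2]$ with $h = x_2 - x_1$, the factor $(x-x_1)^2(x-x_2)^2$ peaks at the midpoint with value $h^4/16$, so
$$
|f(x) - H_3(x)| \le \frac{h^4}{384}\max_{[x_1,x_2]}\left|f^{(4)}\right|.
$$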
While my Hermite spline does the job (approximating sine for all real numbers), one might accomplish the same thing with Taylor polynomials.
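For a quick comparison on one interval (a toy experiment of mine, not from the text): put the cubic Hermite from the sketch above on $[0,\pi/2]$ next to the degree-3 Taylor polynomial of sine centered at the midpoint, and sample the errors:

```mathematica
(* Toy comparison: cubic Hermite on [0, Pi/2] vs. the degree-3 Taylor
   polynomial of Sin about the midpoint Pi/4. *)
p = hermiteCubic[Sin, Cos, 0., N[Pi/2]];
t[x_] = Normal@Series[Sin[y], {y, Pi/4, 3}] /. y -> x;

Max@Table[Abs[p[x] - Sin[x]], {x, 0., Pi/2, 0.01}]  (* Hermite error *)
Max@Table[Abs[t[x] - Sin[x]], {x, 0., Pi/2, 0.01}]  (* Taylor error  *)
```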