MAT360 Section Summary: 3.2

Divided Differences

  1. Summary

    We start with the interpolating polynomial of degree n given in Newton form:

        P_n(x) = a_0 + a_1 (x - x_0) + a_2 (x - x_0)(x - x_1) + ... + a_n (x - x_0)(x - x_1)...(x - x_{n-1})

    where the coefficients a_0, a_1, ..., a_n are to be determined. If all the x_i = 0, then we have the Maclaurin expansion of the polynomial; if all x_i = c for some constant c, then we have the Taylor series expansion of P_n about c. In either event, the coefficients will be scaled derivatives at a single point.

    We're now interested in the case where the x_i are distinct (possibly simply equally spaced points); the coefficients will still be related to derivative information, but that information will be distributed across the x_i, rather than focused on a single point. The key to computing them will be divided differences.

    Divided differences are basically approximations to derivatives, as one can see from Theorem 3.6.

    This is a somewhat ``classical'' subject (old-fashioned?): one used to consult tables to evaluate functions, whereas today we've got computers which have been programmed to do the job for us. Textbooks used to include lots of tables (e.g. of trigonometric functions), and if you wanted, say, the sine of a value falling between two table entries, you'd look in the table, find the sine at the two neighboring entries, and interpolate! Nowadays, this is rarer, but you may have still had to do such things in a statistics class, for example, where tables of normal probabilities or t-distribution values are still used....

  2. Definitions

  3. Theorems/Formulas

    Theorem 3.6: Suppose that f ∈ C^n[a,b] and x_0, x_1, ..., x_n are distinct numbers in [a,b]. Then there exists a number ξ in (a,b) such that

        f[x_0, x_1, ..., x_n] = f^{(n)}(ξ) / n!
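As a quick numerical sanity check of Theorem 3.6 (a sketch only; the function name `divided_difference`, the choice f = sin, and the sample points are my own illustration, not from the text), the n-th divided difference over n+1 nearby points should land between the extreme values of f^{(n)}(ξ)/n! on the bracketing interval:

```python
import math

def divided_difference(xs, fs):
    """n-th divided difference f[x_0, ..., x_n], built up level by level."""
    table = list(fs)
    n = len(xs)
    for level in range(1, n):
        for i in range(n - level):
            table[i] = (table[i + 1] - table[i]) / (xs[i + level] - xs[i])
    return table[0]

xs = [1.0, 1.1, 1.2, 1.3]               # n = 3: four distinct points
fs = [math.sin(x) for x in xs]
dd = divided_difference(xs, fs)

# For f = sin, f'''(x) = -cos(x), so Theorem 3.6 says dd = -cos(ξ)/3!
# for some ξ in (1.0, 1.3); check dd lies between the endpoint values.
lo = -math.cos(1.0) / 6
hi = -math.cos(1.3) / 6
print(min(lo, hi) <= dd <= max(lo, hi))
```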

    Derivation of the coefficients of the Newton polynomial: Rather than define the divided differences as above, we could generate a recursive definition for them.

    Define the term f[x_0, x_1, ..., x_n] as the leading coefficient of P_n(x):

        f[x_0, x_1, ..., x_n] = the coefficient of x^n in P_n(x)

    Since

        P_n(x) = [ (x - x_0) Q(x) - (x - x_n) P_{n-1}(x) ] / (x_n - x_0)

    where P_{n-1}(x) is the (n-1)st interpolating polynomial to f at the knots x_0, x_1, ..., x_{n-1}, and Q(x) is the (n-1)st interpolating polynomial to f at the knots x_1, x_2, ..., x_n, we can compute the leading coefficient of P_n as a function of the leading coefficients of P_{n-1} and Q:

        leading coefficient of P_n = ( leading coefficient of Q - leading coefficient of P_{n-1} ) / (x_n - x_0)

    Hence,

        f[x_0, x_1, ..., x_n] = ( f[x_1, x_2, ..., x_n] - f[x_0, x_1, ..., x_{n-1}] ) / (x_n - x_0)

    When you have a recursive definition, you need a basement (a base case): here it is the leading coefficient of the constant interpolating polynomial,

        f[x_i] = f(x_i)
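The recursion and its basement can be transcribed directly (a sketch; the name `f_bracket` and the x² example are illustrative choices, not from the text). It is clear but not efficient, since subproblems are recomputed; the divided-difference table avoids that.

```python
def f_bracket(xs, fs):
    """Divided difference f[x_0, ..., x_n], straight from the recursion.

    Basement: f[x_i] = f(x_i).
    Recursion: f[x_0,...,x_n]
        = (f[x_1,...,x_n] - f[x_0,...,x_{n-1}]) / (x_n - x_0).
    """
    if len(xs) == 1:
        return fs[0]
    return (f_bracket(xs[1:], fs[1:]) - f_bracket(xs[:-1], fs[:-1])) / (xs[-1] - xs[0])

# f(x) = x^2 on three knots: f[x0, x1, x2] is the leading coefficient
# of the interpolating parabola, which must be 1.
xs = [0.0, 1.0, 3.0]
fs = [x**2 for x in xs]
print(f_bracket(xs, fs))
```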

    Conclusion: the Newton interpolating polynomial is given by

        P_n(x) = f[x_0] + sum_{k=1}^{n} f[x_0, x_1, ..., x_k] (x - x_0)(x - x_1)...(x - x_{k-1})

    To get the polynomial's coefficients, you simply read them off along the diagonal of the divided-difference table. Computation of the coefficients costs

        3n(n+1)/2

    operations (each of the n(n+1)/2 table entries takes two subtractions and a division), whereas evaluation involves 4n-1 operations. While this seems expensive compared to Horner's method, we generate a succession of estimates along the way (using the 0th- through nth-degree polynomials). So there is bang for the buck....
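The two stages — filling in the diagonal of the divided-difference table, then evaluating in nested (Horner-like) form — can be sketched as follows (the function names and the cubic test case are mine, not the authors'):

```python
def newton_coefficients(xs, fs):
    """Diagonal of the divided-difference table: coef[k] = f[x_0, ..., x_k]."""
    coef = list(fs)
    n = len(xs)
    for level in range(1, n):
        # sweep bottom-up so previous-level entries survive until used
        for i in range(n - 1, level - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - level])
    return coef

def newton_eval(xs, coef, x):
    """Nested evaluation of the Newton form, analogous to Horner's method."""
    p = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        p = coef[i] + (x - xs[i]) * p
    return p

# A degree-3 interpolant of f(x) = x^3 reproduces it exactly.
xs = [0.0, 1.0, 2.0, 4.0]
coef = newton_coefficients(xs, [x**3 for x in xs])
print(newton_eval(xs, coef, 3.0))   # 27.0
```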

  4. Properties/Tricks/Hints/Etc.

    This form of the interpolating polynomial is nice because we can easily increase the degree by adding an additional knot, without much additional work. We can reuse the interpolating polynomial of the previous degree. Furthermore, the knots don't have to be added at the end or beginning: this process was independent of the order of the x_i.
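A sketch of that incremental update (the `add_knot` helper and the x² example are my own illustration): if we keep the trailing diagonal of the table — the divided differences that end at the newest knot — then absorbing one more knot takes a single short pass, and the new Newton coefficient drops out as the last entry.

```python
def add_knot(xs, diag, coef, x_new, f_new):
    """Absorb one new knot. `diag` holds the divided differences ending
    at the current last knot x_m: diag[j] = f[x_{m-j}, ..., x_m].
    Appends to xs and coef in place; returns the updated diagonal."""
    new_diag = [f_new]
    for j in range(1, len(diag) + 1):
        new_diag.append((new_diag[j - 1] - diag[j - 1]) / (x_new - xs[-j]))
    xs.append(x_new)
    coef.append(new_diag[-1])     # the new leading coefficient f[x_0,...,x_{m+1}]
    return new_diag

# Build the coefficients for f(x) = x^2 one knot at a time.
xs, diag, coef = [0.0], [0.0], [0.0]
diag = add_knot(xs, diag, coef, 1.0, 1.0)
diag = add_knot(xs, diag, coef, 3.0, 9.0)
print(coef)   # same diagonal a full table on all three knots would give
```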

    By following the table of divided differences up from the function values, we get a look at the estimated value of the function using higher- and higher-degree polynomials. We hope that if the values are settling down, we're doing a pretty good job of approximating the function (and needn't proceed to even higher degree). If the values are not settling down, however, we know that with the Newton formulation we can go to higher degree and do even better.
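That ``watch the estimates settle'' idea amounts to taking partial sums of the Newton form, P_0(x), P_1(x), ..., P_n(x). A sketch (f = e^x and the knots here are illustrative choices, not the text's):

```python
import math

xs = [0.0, 0.3, 0.6, 0.9, 1.2]
fs = [math.exp(x) for x in xs]

# Fill in the table diagonal in place: coef[k] = f[x_0, ..., x_k].
coef = list(fs)
for level in range(1, len(xs)):
    for i in range(len(xs) - 1, level - 1, -1):
        coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - level])

# Successive partial sums of the Newton form at x = 0.5.
x = 0.5
estimate, product = coef[0], 1.0
for k in range(1, len(xs)):
    product *= (x - xs[k - 1])
    estimate += coef[k] * product
    print(k, estimate)               # degree-k estimate of e^0.5

print(abs(estimate - math.exp(x)))   # small once the values settle
```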

    If you're at the ``left hand side'' (near x_0) of the table, then it makes sense to make use of the forward differences; if you're at the right of the table, then use the backward differences. The only advantage of using one side versus the other is in watching the value of P(x) stabilize as successively higher-degree polynomials are used; using either form gives the same value for the highest-degree approximation. So the authors' injunction that ``The Newton formulas are not appropriate for approximating f(x) when x lies near the center of the table....'' is a little draconian.... Use them!




Fri Oct 14 11:20:12 EDT 2005