Today:
At least for those of you who used python or Mathematica.
If you java folk (Eden, Aubrey, and Brian) would stop by at some point with a laptop, we'll just do a few spot checks on your code.
And Aaron, I need a little help with your executable. I was going to run some tests, and had trouble getting it to run under my Windows....
Here are links to a couple example codes that worked fine:
Notice that I have a version of Muller's method in Mathematica that you're free to use.
Perhaps I'm so enthralled with Muller's method because I recall a line from my first numerical analysis textbook (Conte and de Boor), in which they had this to say about Muller's:
"A method of recent vintage, expounded by Muller, has been used on computers with remarkable success. This method may be used to find any prescribed number of zeros, real or complex, of an arbitrary function. The method is iterative, converges almost quadratically in the vicinity of a root, does not require the evaluation of the derivative of the function, and obtains both real and complex roots even when these roots are not simple."
Now the third edition of C&deB was from 1980, but the first edition was written in 1965 -- before we landed on the moon -- so by "recent vintage" they probably meant something from around that time. In fact, Muller wrote this up in 1956! Perhaps the use of "recent vintage" in textbooks is a mistake....:)
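Since that quote is the whole sales pitch, here is a minimal sketch of Muller's method in Python (the function and variable names are my own; this is one common formulation, fitting a quadratic through the three most recent iterates and stepping to the nearby root, using complex arithmetic so complex roots can emerge from real starting values):

```python
import cmath

def muller(f, x0, x1, x2, tol=1e-12, maxit=50):
    """Muller's method: fit a quadratic through the three most recent
    iterates and take the root of that quadratic nearest x2."""
    for _ in range(maxit):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        h1, h2 = x1 - x0, x2 - x1
        d1, d2 = (f1 - f0) / h1, (f2 - f1) / h2
        a = (d2 - d1) / (h2 + h1)      # second divided difference
        b = a * h2 + d2
        c = f2
        # pick the sign that gives the larger-magnitude denominator,
        # which selects the root of the quadratic closer to x2
        disc = cmath.sqrt(b * b - 4 * a * c)
        denom = b + disc if abs(b + disc) > abs(b - disc) else b - disc
        dx = -2 * c / denom
        x0, x1, x2 = x1, x2, x2 + dx
        if abs(dx) < tol:
            break
    return x2
```

Starting from purely real guesses on something like x^2 + 1, the square root of a negative discriminant pulls the iterates into the complex plane, which is exactly the behavior Conte and de Boor were praising.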
In what sense is a Taylor series polynomial an interpolant? Ordinarily we've been thinking of it as an approximation to a function.
On the other hand, we are often interpolating data (not just points, as in the case of Muller, but rather a point plus slope and higher derivatives at a single point).
So we might interpolate the point (0,0), with slope 0 and second derivative 0, by a quadratic polynomial.
This polynomial fits all that data exactly (and passes through exactly one known point): it is the Taylor series polynomial of degree 2 subject to those constraints at that point.
A tangent line is the Taylor series polynomial of degree 1, which fits a point and gets the slope right.
You have to be careful that you're not deceived by the form of the expression to the right. It is not saying that every function is a polynomial -- all the interesting stuff is buried in that Greek letter "xi" in the remainder term.
An important observation is made about the computation of these polynomials. It is the introduction of Horner's rule (or method).
This is generally how all polynomials should be evaluated.
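A minimal sketch of Horner's rule in Python (names are my own): rather than computing each power of x separately, we repeatedly multiply by x and add the next coefficient, which uses only n multiplications and n additions for a degree-n polynomial.

```python
def horner(coeffs, x):
    """Evaluate a polynomial by Horner's rule.
    coeffs are ordered highest degree first: [a_n, ..., a_1, a_0]."""
    result = 0.0
    for c in coeffs:
        result = result * x + c     # nested form: (...((a_n)x + a_{n-1})x + ...)
    return result

# p(x) = 2x^3 - 6x^2 + 2x - 1 evaluated at x = 3:
#   ((2*3 - 6)*3 + 2)*3 - 1 = 5
```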
Muller's method is a generalization of the Secant method (which is an approximation to Newton's method).
What if we go straight from Newton's method to a quadratic method?
Given a function f, suppose we know f' and f'' as well.
Given an initial guess to the true root, what quadratic would we use to seek the next approximation?
We might start with the "classic method", and see what conditions the coefficients a, b, and c must satisfy. We'll re-derive the Taylor series polynomial of degree 2.
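A sketch of where those conditions lead (my notation, assuming the quadratic is written in the shifted form a(x - x0)^2 + b(x - x0) + c): matching the value, slope, and second derivative of f at x0 forces c = f(x0), b = f'(x0), and a = f''(x0)/2 -- exactly the degree-2 Taylor polynomial. A quick numerical check with f = sin:

```python
from math import sin, cos

x0 = 0.5
c = sin(x0)          # matching p(x0)   = f(x0)
b = cos(x0)          # matching p'(x0)  = f'(x0)
a = -sin(x0) / 2     # matching p''(x0) = f''(x0), i.e. 2a = f''(x0)

def p(x):
    """Degree-2 Taylor polynomial for sin about x0, in shifted form."""
    return a * (x - x0)**2 + b * (x - x0) + c
```

Near x0 the quadratic should track sin to within the cubic remainder term.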
A hint for part b: logs of products
Even though the order of convergence is lower, the routine may run in about the same time as Newton's (it depends on the problem, of course).
Notice the increasing powers of x. Clearly there's a risk that the coefficients will be of vastly different magnitudes, which could cause severe roundoff errors. This is one of the problems with this formulation.
Many of you may have used this notion to find the coefficients of your Muller quadratic. The problem is one of solving three equations in three unknowns; and since the equations are linear, we have a linear system to solve. More generally:
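As a sketch of that linear-system view (function names are my own, and this is the monomial-basis formulation with the conditioning caveat noted above): each data point (x_i, y_i) gives one equation a*x_i^2 + b*x_i + c = y_i, so three points give a 3x3 system we can solve by Gaussian elimination.

```python
def gauss_solve(A, rhs):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(rhs)
    M = [row[:] + [r] for row, r in zip(A, rhs)]          # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))  # pivot row
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                        # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Quadratic a*x^2 + b*x + c through three samples of f(x) = x^3:
xs = [1.0, 2.0, 3.0]
ys = [x**3 for x in xs]
A = [[x * x, x, 1.0] for x in xs]     # one Vandermonde-style row per point
a, b, c = gauss_solve(A, ys)          # here: 6x^2 - 11x + 6
```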
We'll finish this off by having a look at the relationship between divided differences and the derivatives of f.
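A small numerical illustration of that relationship (my own sketch): the k-th divided difference over k+1 nodes equals f^(k)(xi)/k! for some xi in the interval spanned by the nodes, so for closely spaced nodes the second divided difference of exp should be close to exp(xi)/2.

```python
from math import exp

def divided_difference(f, xs):
    """Recursive Newton divided difference f[x0, ..., xn]."""
    if len(xs) == 1:
        return f(xs[0])
    return (divided_difference(f, xs[1:]) -
            divided_difference(f, xs[:-1])) / (xs[-1] - xs[0])

# f[x0, x1, x2] = f''(xi)/2 for some xi in [x0, x2]; with f = exp and
# tightly clustered nodes, this should be close to exp(1.01)/2
dd = divided_difference(exp, [1.0, 1.01, 1.02])
```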