Today:
I'm still looking over your homework. I'll have it for you next time.
Some of you "complained" about the amount of algebra. That's exactly why you want to use Mathematica! Yet only five of you included Mathematica files; this assignment would have been a good time to use it.
Perhaps I'm so enthralled with Muller's method because I recall a line from my first numerical analysis textbook (Conte and de Boor), which had this to say about Muller's method:
"A method of recent vintage, expounded by Muller, has been used on computers with remarkable success. This method may be used to find any prescribed number of zeros, real or complex, of an arbitrary function. The method is iterative, converges almost quadratically in the vicinity of a root, does not require the evaluation of the derivative of the function, and obtains both real and complex roots even when these roots are not simple."
Now, the third edition of C&deB is from 1980, but the first edition was written in 1965 -- before we landed on the moon -- so by "recent vintage" they probably meant something around that time. In fact, Muller published the method in 1956! Perhaps using "recent vintage" in a textbook is always a mistake... :)
Question: How would you write the Lagrange interpolator to two points?
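One way to answer that, sketched in Python rather than Mathematica (the helper name `lagrange2` is my own): with two points the Lagrange interpolant is just the line through them, written as a sum of the two basis polynomials.

```python
def lagrange2(x0, y0, x1, y1):
    """Linear Lagrange interpolant through (x0, y0) and (x1, y1):
    p(x) = y0 * (x - x1)/(x0 - x1) + y1 * (x - x0)/(x1 - x0).
    Each basis term is 1 at its own node and 0 at the other."""
    def p(x):
        return y0 * (x - x1) / (x0 - x1) + y1 * (x - x0) / (x1 - x0)
    return p

# the line through (1, 2) and (3, 6) is p(x) = 2x
line = lagrange2(1.0, 2.0, 3.0, 6.0)
```

Note that the interpolant reproduces the data exactly at the nodes, by construction.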
Even though its order of convergence is lower than Newton's, the routine may run in about the same time, since each step avoids a derivative evaluation (it depends on the problem, of course).
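To make the iteration concrete, here is a minimal sketch of Muller's method in Python (again, not the course's Mathematica code; the starting points and tolerances below are my own choices). It fits a quadratic through the three most recent points and takes the root nearer the newest point, using `cmath` so complex roots come out naturally:

```python
import cmath

def muller(f, x0, x1, x2, tol=1e-12, max_iter=50):
    """Muller's method: fit a quadratic through (x0, x1, x2) and iterate
    toward its root nearest x2. May return a complex root."""
    for _ in range(max_iter):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        h0, h1 = x1 - x0, x2 - x1
        d0, d1 = (f1 - f0) / h0, (f2 - f1) / h1
        a = (d1 - d0) / (h1 + h0)      # second divided difference
        b = a * h1 + d1                # slope of the quadratic at x2
        c = f2
        disc = cmath.sqrt(b * b - 4 * a * c)
        # pick the larger-magnitude denominator to avoid cancellation
        denom = b + disc if abs(b + disc) > abs(b - disc) else b - disc
        x3 = x2 - 2 * c / denom
        if abs(x3 - x2) < tol:
            return x3
        x0, x1, x2 = x1, x2, x3
    return x2

# finds a complex root of x^2 + 1 even from purely real starting points
root = muller(lambda x: x * x + 1, 0.5, 1.0, 1.5)
```

Notice the root formula is written as $x_3 = x_2 - 2c/(b \pm \sqrt{b^2 - 4ac})$ rather than the schoolbook quadratic formula, precisely to dodge subtractive cancellation.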
Notice the increasing powers of $x$. Clearly there's a risk that the coefficients will be of vastly different magnitudes, which can cause severe roundoff error. This is one of the problems with the monomial (power-basis) formulation.
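You can see the trouble numerically: the matrix of increasing powers of $x$ (the Vandermonde matrix) becomes ill-conditioned very quickly, because the columns $1, x, x^2, \ldots$ look more and more alike. A small sketch, assuming NumPy is available (node counts and interval are arbitrary choices of mine):

```python
import numpy as np

# Condition number of the monomial-basis (Vandermonde) system grows
# rapidly with the number of nodes, even on a modest interval.
for n in (5, 10, 15):
    x = np.linspace(1.0, 2.0, n)         # n equally spaced nodes on [1, 2]
    V = np.vander(x, increasing=True)    # columns x^0, x^1, ..., x^(n-1)
    print(n, np.linalg.cond(V))
```

A large condition number means that tiny perturbations in the data (or in rounding) can produce wildly different coefficients.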
Many of you may have used this idea to find the coefficients of your Muller quadratic. The problem is one of solving three equations in three unknowns, and since the equations are linear in the coefficients, we have a linear system to solve. More generally:
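As a sketch of that three-by-three system in Python (assuming NumPy; the sample data below is mine, taken from $f(x) = x^2 + 1$): fitting $p(x) = a_0 + a_1 x + a_2 x^2$ through three points means solving $V\mathbf{a} = \mathbf{y}$, where each row of $V$ is $[1, x_i, x_i^2]$.

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0])
ys = np.array([1.0, 2.0, 5.0])        # values of x^2 + 1 at the nodes
V = np.vander(xs, increasing=True)    # rows [1, x_i, x_i^2]
a = np.linalg.solve(V, ys)            # coefficients a0, a1, a2
```

Here the solver recovers $a_0 = 1$, $a_1 = 0$, $a_2 = 1$, i.e. the quadratic we sampled.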
We'll finish this off by having a look at the relationship between the divided differences and the derivatives of $f$.