Today: my answer begins with the word "derivatives", and with generalizing Newton's method, which is an example of a process called "fixed-point iteration". We'll pick up where we left off last time; our objective today is to finish Chapter 3 by getting through the Secant method.
Now here is how a problem looks using Newton's method:
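To make that concrete, here is a minimal sketch in Python; the example function f(x) = x^2 - 2 and the tolerance are my own choices, not from the notes:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: repeatedly follow the tangent line down to the axis."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # close enough to a root
            return x
        x = x - fx / fprime(x)     # tangent-line step: x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x

# Example: solve x^2 - 2 = 0, i.e. approximate sqrt(2)
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
```

Notice that each step needs both f and f' — a point we'll come back to when comparing with the Secant method.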
Newton's method as Fixed Point Iteration
Definition 3.3 (p. 87): If a method produces a sequence x_n that converges to a root α, and if

    lim_{n→∞} |x_{n+1} − α| / |x_n − α|^p = λ > 0,

then the method is convergent of order p, with asymptotic error constant λ.
We can show that Newton's convergence is quadratic (with Taylor polynomials).
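That Taylor-polynomial argument can be sketched in a few lines (this is the standard derivation, assuming a simple root, i.e. f'(α) ≠ 0):

```latex
% Expand f about x_n and evaluate at the root \alpha, where f(\alpha) = 0:
0 = f(x_n) + f'(x_n)(\alpha - x_n) + \tfrac{1}{2} f''(\xi_n)(\alpha - x_n)^2 .
% Divide through by f'(x_n) and use the Newton step x_{n+1} = x_n - f(x_n)/f'(x_n):
\alpha - x_{n+1} = -\frac{f''(\xi_n)}{2 f'(x_n)} (\alpha - x_n)^2 ,
% so, writing e_n = |x_n - \alpha|,
\lim_{n \to \infty} \frac{e_{n+1}}{e_n^2} = \frac{|f''(\alpha)|}{2\,|f'(\alpha)|} .
```

So Newton's method has order p = 2, with asymptotic error constant |f''(α)| / (2|f'(α)|), exactly in the sense of Definition 3.3.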
Question: What would you expect for bisection? You might think that it is linear, but interesting things can happen....
Let's look at some examples:
The rule is that the general fixed-point method x_{n+1} = g(x_n) has order of convergence 1 (linear), with asymptotic error constant λ = |g'(α)| (where α is the root, i.e. the fixed point g(α) = α).
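You can watch that rule in action numerically. A sketch (the choice g(x) = cos(x) and the iteration counts are mine): the ratios of successive errors settle down to |g'(α)| = sin(α) ≈ 0.674.

```python
import math

def fixed_point(g, x0, n):
    """Iterate x_{k+1} = g(x_k) and return the whole sequence."""
    xs = [x0]
    for _ in range(n):
        xs.append(g(xs[-1]))
    return xs

# g(x) = cos(x) has a fixed point alpha ~ 0.7390851
xs = fixed_point(math.cos, 1.0, 60)
alpha = xs[-1]                                  # converged value, used as the "exact" root
errs = [abs(x - alpha) for x in xs[:40]]
ratios = [errs[k + 1] / errs[k] for k in range(len(errs) - 1)]
# ratios[k] tends to |g'(alpha)| = sin(alpha): linear convergence
```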
Nonetheless, one can say that bisection narrows down the interval in which a root is found by one binary digit per step (that is, "linearly" -- the ratio of successive interval widths is exactly 1/2 -- even though the error in the midpoint approximation itself need not shrink so tidily from step to step).
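A quick sketch of that halving (again with my own example, f(x) = x^2 - 2 on [1, 2]):

```python
def bisect_widths(f, a, b, steps):
    """Run bisection, recording the bracketing interval width at each step."""
    widths = [b - a]
    for _ in range(steps):
        m = (a + b) / 2
        if f(a) * f(m) <= 0:   # sign change on the left half: root is in [a, m]
            b = m
        else:                  # otherwise the root is in [m, b]
            a = m
        widths.append(b - a)
    return widths

w = bisect_widths(lambda x: x**2 - 2, 1.0, 2.0, 20)
# Each width is exactly half the previous one: w[k+1] == w[k] / 2
```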
Finally we're getting to the secant method, and we'll then take it one step further and consider Muller's method (based on quadratic functions -- which I've asked you to derive and demonstrate as a homework assignment, due next Thursday).
Secant is Newton's method, where we approximate the derivative: in particular, we approximate the tangent line with a secant line, using a finite difference approximation to the derivative.
One thing that we notice right away is that this method requires two approximations to start -- to prime the pump.
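In formulas, the step is x_{n+1} = x_n − f(x_n)(x_n − x_{n−1}) / (f(x_n) − f(x_{n−1})). A Python sketch (the test function and starting pair are my own choices):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: Newton's method with f' replaced by the
    finite-difference slope through the last two iterates."""
    f0, f1 = f(x0), f(x1)      # two starting values "prime the pump"
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # secant-line step
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)                      # only ONE new f-evaluation
    return x1

root = secant(lambda x: x**2 - 2, 1.0, 2.0)
```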
To demonstrate its order of convergence, I won't start from scratch, but from a result which is sensible (at any rate): the secant errors satisfy

    e_{n+1} ≈ (|f''(α)| / (2|f'(α)|)) · e_n · e_{n−1},

and if we posit e_{n+1} ≈ A·e_n^p, then matching powers of e_n forces p^2 = p + 1, so p = (1 + √5)/2 ≈ 1.618 -- superlinear, but not quadratic.
It turns out that the Secant method may be preferable to Newton's, because it requires no derivative at all, and only one new function evaluation per iteration (Newton needs both f and f' at every step).
So even though the order of convergence is less (about 1.618 rather than 2), the routine may run in about the same time as Newton's (it depends on the problem, of course).
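That cost comparison is easy to check by counting calls. A sketch (the wrapper class, the test function, and the fixed iteration counts are all my own, for illustration):

```python
class Counter:
    """Wrap a function and count how many times it is called."""
    def __init__(self, f):
        self.f, self.calls = f, 0
    def __call__(self, x):
        self.calls += 1
        return self.f(x)

# Newton: two evaluations (f and f') per iteration
f = Counter(lambda x: x**2 - 2)
fp = Counter(lambda x: 2 * x)
x = 1.0
for _ in range(6):
    x = x - f(x) / fp(x)
newton_evals = f.calls + fp.calls   # 12 evaluations for 6 steps

# Secant: one NEW evaluation per iteration, after the two starting values
g = Counter(lambda x: x**2 - 2)
x0, x1 = 1.0, 2.0
g0, g1 = g(x0), g(x1)
for _ in range(6):
    x0, g0, x1 = x1, g1, x1 - g1 * (x1 - x0) / (g1 - g0)
    g1 = g(x1)
secant_evals = g.calls              # 8 evaluations for 6 steps
```

So per iteration Secant is cheaper; whether that beats Newton's faster order in wall-clock time depends on how expensive f and f' are.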