Today:
Let me get some help on going over the exam (here's your key):
If you sent me only an image-based version of your code, please also send it to me in text form (so that I can run your programs).
In that sense, there is something fundamental about the golden ratio in the universe (besides it appearing in your exam a lot...:).
Perhaps I'm so enthralled with Muller's method because I recall a line from my first numerical analysis textbook (Conte and de Boor), in which they had this to say about Muller's:
"A method of recent vintage, expounded by Muller, has been used on computers with remarkable success. This method may be used to find any prescribed number of zeros, real or complex, of an arbitrary function. The method is iterative, converges almost quadratically in the vicinity of a root, does not require the evaluation of the derivative of the function, and obtains both real and complex roots even when these roots are not simple."
Now the third edition of C&deB (which I used!) was from 1980, but the first edition was written in 1965 -- before we landed on the moon -- so "recent vintage" presumably meant something around that era. In fact, Muller wrote the method up in 1956! Perhaps using the expression "recent vintage" in a textbook is always a mistake....:)
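Since Conte and de Boor's quote lists everything about Muller's method except the algorithm itself, here is a minimal sketch in Python (the function name, starting values, and tolerances are my own choices): fit a parabola through the last three iterates, step to the parabola root nearest the newest iterate, and repeat. Using complex arithmetic throughout is what lets complex roots emerge even from real starting values, just as the quote promises.

```python
import cmath

def muller(f, x0, x1, x2, tol=1e-12, maxit=50):
    """Muller's method: fit a quadratic through three iterates and
    step to its root nearest the most recent iterate.  Complex
    arithmetic is used so complex roots can be found."""
    for _ in range(maxit):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        h1, h2 = x1 - x0, x2 - x1
        d1, d2 = (f1 - f0) / h1, (f2 - f1) / h2
        # coefficients of the parabola in powers of (x - x2)
        a = (d2 - d1) / (h2 + h1)
        b = a * h2 + d2
        c = f2
        disc = cmath.sqrt(b * b - 4 * a * c)
        # take the larger-magnitude denominator: that selects the
        # parabola root closest to x2 and avoids cancellation
        denom = b + disc if abs(b + disc) > abs(b - disc) else b - disc
        dx = -2 * c / denom
        x3 = x2 + dx
        if abs(dx) < tol:
            return x3
        x0, x1, x2 = x1, x2, x3
    return x2
```

For example, `muller(lambda x: x*x + 1, 0, 1, 2)` starts from three real points and still lands on one of the complex roots \(\pm i\).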
What do you think of the additional quadratic method, using a "tangent parabola" for root-finding? These are usually referred to as "higher-order Newton methods" (rather than "Long's method", which I kind of prefer...)
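One common way to realize the tangent-parabola idea (a sketch of the concept above, not a definitive implementation -- the names here are mine) is to replace \(f\) at each iterate by its second-order Taylor polynomial and step to that parabola's root nearest the current point. Choosing the sign in the quadratic formula to agree with the Newton step's direction picks out the nearer root.

```python
import math

def tangent_parabola(f, df, d2f, x, tol=1e-12, maxit=50):
    """Root-finding with a 'tangent parabola': at each step, solve
    f(x) + f'(x)*dx + f''(x)/2 * dx**2 = 0 for the correction dx
    nearest zero, falling back on a Newton step if needed."""
    for _ in range(maxit):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        disc = dfx * dfx - 2.0 * fx * d2fx
        if disc < 0 or d2fx == 0:
            dx = -fx / dfx          # plain Newton step as a fallback
        else:
            # sign chosen to match Newton's direction, which selects
            # the parabola root nearest the current iterate
            s = math.copysign(math.sqrt(disc), dfx)
            dx = -2.0 * fx / (dfx + s)
        x += dx
        if abs(dx) < tol:
            return x
    return x
```

Note that when \(f\) itself is a quadratic, the tangent parabola is exact: `tangent_parabola(lambda x: x*x - 2, lambda x: 2*x, lambda x: 2.0, 1.0)` lands on \(\sqrt{2}\) in a single step.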
A few words, perhaps, before we dig into methods. The point of an interpolator, as our author notes in the first couple of lines of Chapter 3, is to fit some points -- sometimes called "knots" -- exactly (and hope that it serves as a decent approximation elsewhere).
Polynomials are a great class for doing this, but there are things that polynomials can't do. They're terribly smooth, for example! (They're horrible at being horrible.)
And there are things that they do that we really don't want them doing. For example, they wiggle a lot when they get to high degree.
Our author spotlights a fairly horrible-looking function in section 3.1 (a recursively defined function that has an interesting scalloped shape). To be honest, I tried to figure out why we should include this section, and I'm at a loss. Perhaps it will become clearer later (and he does refer to the function briefly in section 3.2), but we're going to get right to the techniques. And we'll start with Lagrange's interpolating polynomials.
We want to write a linear function as a sum of two linear functions, each of which "takes care of one point, and gets out of the way at the other". What does this mean? These "basis" functions will have the property that
\(l(x,x_1,x_2)\) will be 1 at \(x_1\), and 0 at \(x_2\).
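In code, the two linear basis functions and the interpolant they build can be sketched like this (a minimal illustration; the function names are mine):

```python
def lagrange_basis_linear(x, x1, x2):
    """The two linear Lagrange basis functions on the knots x1, x2:
    l1 is 1 at x1 and 0 at x2; l2 is 1 at x2 and 0 at x1."""
    l1 = (x - x2) / (x1 - x2)
    l2 = (x - x1) / (x2 - x1)
    return l1, l2

def linear_interpolant(x, x1, y1, x2, y2):
    """P(x) = y1*l1(x) + y2*l2(x): each basis function 'takes care of'
    its own knot and vanishes at the other, so P hits both points."""
    l1, l2 = lagrange_basis_linear(x, x1, x2)
    return y1 * l1 + y2 * l2
```

For instance, `linear_interpolant(2, 1, 10, 3, 30)` returns `20.0`, the midpoint value on the line through \((1,10)\) and \((3,30)\).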
Then \[ f(x)-P_{n}(x)=\frac{f^{(n+1)}(\xi_{x})}{(n+1)!}(x-x_{0})(x-x_{1})\cdots(x-x_{n}). \]
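We can check this error formula numerically in the simplest case, \(n=1\) (my choice of test function and knots): interpolate \(f(x)=e^x\) at the knots \(x_0=0\) and \(x_1=1\), and compare the actual error at \(x=0.5\) against the bound obtained by replacing \(f''(\xi_x)\) with its maximum \(e\) on \([0,1]\).

```python
import math

f = math.exp
x0, x1, x = 0.0, 1.0, 0.5

# linear Lagrange interpolant through (x0, f(x0)) and (x1, f(x1))
P1 = f(x0) * (x - x1) / (x0 - x1) + f(x1) * (x - x0) / (x1 - x0)

actual = abs(f(x) - P1)
# |f''(xi)| = e**xi <= e on [0, 1], so the formula gives the bound
# (max |f''| / 2!) * |(x - x0)(x - x1)|
bound = math.e / math.factorial(2) * abs((x - x0) * (x - x1))

print(actual, bound)  # the actual error should not exceed the bound
```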