Algorithms and Convergence
There are three separate issues discussed in this section: the stability of algorithms, the growth of errors, and the rate of convergence.
An algorithm is a recipe for completing a task. As we've seen, algorithms that are equivalent from the purely mathematical standpoint may give radically different answers from a numerical perspective. So we want to make good choices when we create algorithms.
If an algorithm has the property that small changes in initial conditions produce small changes in the solution, then the algorithm is stable; otherwise it is unstable. Some algorithms are stable only for a certain range of initial data; these are categorized as conditionally stable.
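As a sketch of the difference (the example recurrence here is a standard illustration, not from the text above): the sequence $p_n = (1/3)^n$ satisfies the recurrence $p_n = \frac{10}{3}p_{n-1} - p_{n-2}$ exactly, but so does $3^n$, so any round-off in the starting data excites the growing $3^n$ mode and the recurrence is unstable, while direct evaluation is stable.

```python
def direct(n):
    """Stable: evaluate (1/3)**n directly."""
    return (1.0 / 3.0) ** n

def recurrence(n):
    """Unstable: p_n = (10/3) p_{n-1} - p_{n-2}, started from p_0 = 1, p_1 = 1/3.
    The stored p_1 is already rounded, and that tiny error is amplified like 3**n."""
    p_prev, p = 1.0, 1.0 / 3.0
    for _ in range(n - 1):
        p_prev, p = p, (10.0 / 3.0) * p - p_prev
    return p

for n in (5, 10, 20):
    print(n, direct(n), recurrence(n))
```

By n = 20 the recurrence's answer is dominated by the amplified round-off, even though both formulas agree exactly in real arithmetic.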
``Nice'' errors (if there can be such a thing!) have the property that errors introduced at the outset grow linearly, i.e. as
\[ E_n \approx C\,n\,E_0, \]
where $C>0$ is independent of $n$. If, on the other hand, the errors grow exponentially,
\[ E_n \approx C^n E_0, \]
where $C>1$ is independent of $n$, then we're probably going to be in trouble before we'd like! The good news is that your bank account grows unstably, if you like!
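The two growth laws above can be compared directly (the numbers below are invented for illustration, under the assumption of an initial error $E_0 = 10^{-6}$ and $C = 2$ in both formulas):

```python
E0 = 1e-6                  # hypothetical initial round-off error
C_lin, C_exp = 2.0, 2.0    # C > 0 for linear growth; C > 1 for exponential

for n in (1, 10, 50):
    linear = C_lin * n * E0        # E_n ~ C * n * E0
    exponential = C_exp ** n * E0  # E_n ~ C**n * E0
    print(f"n={n:3d}  linear={linear:.3e}  exponential={exponential:.3e}")
```

By n = 50 the linearly propagated error is still around $10^{-4}$, while the exponentially propagated error has exceeded $10^{8}$ — trouble well before we'd like.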
Suppose $\{\alpha_n\}$ converges to $\alpha$ and $\{\beta_n\}$ converges to $0$. If a constant $K>0$ exists with
\[ |\alpha_n - \alpha| \le K|\beta_n| \]
for large $n$, then $\{\alpha_n\}$ converges to $\alpha$ with rate of convergence $O(\beta_n)$.
Similarly, if a constant $K>0$ exists with
\[ |F(h) - L| \le K\,G(h) \]
for sufficiently small $h$, then $F(h) = L + O(G(h))$.
The comparison quantity is generally of the form of powers: $\beta_n = 1/n^p$ for sequences and $G(h) = h^p$ for functions, for some positive $p$.
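We can estimate such a rate numerically. As a sketch (the function is chosen here for illustration, not taken from the text): $F(h) = \sin(h)/h \to 1$ as $h \to 0$, and since $\sin(h)/h = 1 - h^2/6 + O(h^4)$ we expect $F(h) = 1 + O(h^2)$, so halving $h$ should cut the error by roughly a factor of $4$.

```python
import math

def F(h):
    """Example function converging to L = 1 with rate O(h**2)."""
    return math.sin(h) / h

L = 1.0
prev_err = None
for k in range(1, 6):
    h = 0.1 / 2 ** k
    err = abs(F(h) - L)                              # ~ h**2 / 6
    ratio = prev_err / err if prev_err else float("nan")
    print(f"h={h:.6f}  error={err:.3e}  ratio={ratio:.2f}")
    prev_err = err
```

The printed ratios settle near 4, confirming the $O(h^2)$ rate: the observed power $p$ is $\log_2(\text{ratio}) \approx 2$.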