(By the way, do you know that there are different sized infinities? "Countably infinite" is the smallest. What's the largest?)
One famous sequence is the Fibonacci sequence, which is defined recursively by naming the first two terms in the sequence explicitly and then describing a pattern for the rest of the terms: \[ \begin{array}{l} {F_1=1}\cr {F_2=1}\cr {F_n = F_{n-1}+F_{n-2} \ \ \textrm{for} \ n \ge 3} \end{array} \] This sequence appears throughout nature, and is named for the 13th-century Italian mathematician Leonardo Pisano, known as Fibonacci.
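The recursive definition translates directly into a few lines of Python (the function name `fib` is just for illustration):

```python
def fib(n):
    """Return the n-th Fibonacci number, with F_1 = F_2 = 1."""
    if n <= 2:
        return 1
    prev, curr = 1, 1  # F_1, F_2
    for _ in range(3, n + 1):
        prev, curr = curr, prev + curr  # F_k = F_{k-1} + F_{k-2}
    return curr

print([fib(n) for n in range(1, 11)])  # → [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```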
In other words, a sequence is convergent to \(L\) if we can make the difference between the terms and the limit \(L\) as small as we like (\(\epsilon\)) by merely looking far enough down the road (from \(N\) on).
Hence when I say that "The ratio of successive Fibonacci numbers approaches the golden mean.", what I mean is that the sequence of ratios of successive Fibonacci numbers, \(a_n \equiv \frac{F_{n+1}}{F_n}\), converges to the golden mean: \[ \lim_{n \to \infty} a_n=\varphi \]
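We can watch this convergence happen, in the \(\epsilon\)-and-\(N\) sense described above: pick a tolerance \(\epsilon\), and find an \(N\) beyond which every ratio is within \(\epsilon\) of \(\varphi\). A sketch (the variable names are just for illustration):

```python
import math

phi = (1 + math.sqrt(5)) / 2  # golden mean, the positive root of x^2 = x + 1

# Build Fibonacci numbers and the ratios a_n = F_{n+1}/F_n.
F = [1, 1]
for _ in range(30):
    F.append(F[-1] + F[-2])
ratios = [F[n + 1] / F[n] for n in range(len(F) - 1)]

# For eps = 1e-6, find the first N with |a_N - phi| < eps; since the
# errors shrink monotonically, every later term stays within eps too.
eps = 1e-6
N = next(n for n, a in enumerate(ratios, start=1) if abs(a - phi) < eps)
print(N, abs(ratios[-1] - phi) < eps)
```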
Hence we can say that the sequence of Fibonacci numbers themselves diverges to infinity: their ratios converge, but the numbers are growing -- exponentially, actually, and a closed-form formula for them is given by \[ F_n = Round\left[\frac{\varphi^n}{\sqrt{5}}\right] \]
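The rounding formula is easy to sanity-check against the recursion (a sketch; floating-point evaluation of \(\varphi^n/\sqrt{5}\) is accurate enough here that rounding recovers the integers exactly):

```python
import math

phi = (1 + math.sqrt(5)) / 2

# F_n by the recursion...
F = [1, 1]
for _ in range(28):
    F.append(F[-1] + F[-2])

# ...versus Round(phi^n / sqrt(5)) for n = 1, ..., 30.
rounded = [round(phi ** n / math.sqrt(5)) for n in range(1, 31)]
print(rounded == F)  # → True
```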
What I'm calling a "closed-form solution" (or formula) our authors would call "an explicit formula for the \(n^{th}\) term of the sequence."
I really don't like problems of the form of Example 5.1 and Checkpoint 5.1! (The correct answer to those is "What do you want it to be?")
The 5.2 example and checkpoint are fine, because we've made clear what the pattern is. Using ellipses (...) is dangerous -- we don't know what happens next, for sure....
A sequence $\{a_n\}$ is called bounded above if there is a number $M$ such that \[ a_n \le M \ \ \ \ \ {\textrm{for all}} \ n \ge 1 \] It is called bounded below if there is a number $m$ such that \[ m \le a_n \ \ \ \ \ {\textrm{for all}} \ n \ge 1 \] If it is bounded both above and below, then it is a bounded sequence.
Hence the sequence of ratios of Fibonacci numbers, \(\{\frac{F_{n+1}}{F_n}\}\), is bounded above (by 2), and bounded below (by 1). The sequence \(\{\frac{F_{n+1}}{F_n}\}\) is a bounded sequence.
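Those bounds are easy to verify numerically (a quick sketch):

```python
# Ratios of successive Fibonacci numbers stay between 1 and 2.
F = [1, 1]
for _ in range(40):
    F.append(F[-1] + F[-2])

ratios = [F[n + 1] / F[n] for n in range(len(F) - 1)]
print(all(1 <= a <= 2 for a in ratios))  # → True
print(min(ratios), max(ratios))          # → 1.0 2.0
```

The extremes occur right at the start: \(a_1 = F_2/F_1 = 1\) and \(a_2 = F_3/F_2 = 2\); after that the ratios bounce back and forth inside the interval.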
It is not monotonic, however. The values bounce above and below the limit, which is \(\varphi\).
It turns out that the \(n^{th}\) Fibonacci number \(F_n\) is given exactly by \[ F_n=\frac{\varphi^n - (-\varphi)^{-n}}{\sqrt{5}} \] That's an explicit function of \(n\).
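This exact formula (Binet's formula) can be checked against the recursion; we round the floating-point result to absorb round-off error (a sketch, with the helper name `binet` chosen just for this illustration):

```python
import math

phi = (1 + math.sqrt(5)) / 2

def binet(n):
    """Binet's expression for F_n, evaluated in floating point."""
    return (phi ** n - (-phi) ** (-n)) / math.sqrt(5)

F = [1, 1]
for _ in range(23):
    F.append(F[-1] + F[-2])

print(all(round(binet(n)) == F[n - 1] for n in range(1, 26)))  # → True
```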
While this expression isn't real-valued for non-integer exponents (the \((-\varphi)^{-x}\) term is the culprit), we can formally take the limit of \(\{\frac{F_{n+1}}{F_n}\}\), and we'll see that it approaches \(\varphi\): \[ \lim_{n \to \infty}\frac{F_{n+1}}{F_n} = \lim_{n \to \infty}\frac {\frac{\varphi^{n+1} - (-\varphi)^{-(n+1)}}{\sqrt{5}}} {\frac{\varphi^n - (-\varphi)^{-n}}{\sqrt{5}}} \] We can eliminate the \({\sqrt{5}}\), and pull a factor of \(\varphi\) from the numerator: \[ \lim_{n \to \infty}\frac {\varphi^{n+1} - (-\varphi)^{-(n+1)}} {\varphi^n - (-\varphi)^{-n}} = \lim_{n \to \infty}\varphi \frac {\varphi^{n} - (-1)^{n+1}\varphi^{-(n+2)}} {\varphi^n - (-1)^n\varphi^{-n}} \] which, upon dividing top and bottom by \(\varphi^n\), becomes \[ \lim_{n \to \infty}\varphi \frac {1 - (-1)^{n+1}\varphi^{-(2n+2)}} {1 - (-1)^n\varphi^{-2n}} = \varphi \lim_{n \to \infty} \frac {1 - \frac{(-1)^{n+1}}{\varphi^{2(n+1)}}} {1 - \frac{(-1)^n}{\varphi^{2n}}} = \varphi \] since both correction terms tend to 0 as \(n \to \infty\). We actually rely on the following theorem not explicitly mentioned in the text to finish that argument, which I hope you will accept intuitively!
Theorem 5.3 (Continuous Functions Defined on Convergent Sequences): Consider a sequence $\{a_n\}$ and suppose there exists a real number \(L\) such that the sequence $\{a_n\}$ converges to \(L\). Suppose \(f\) is a continuous function at \(L\). Then there exists an integer \(N\) such that \(f\) is defined at all values \(a_n\) for \(n \ge N\), and the sequence $\{f(a_n)\}$ converges to \(f(L)\).
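A numerical illustration of Theorem 5.3 (a sketch, not a proof): take \(a_n = 1/n \to 0\) and the continuous function \(f(x)=e^x\); the theorem says \(f(a_n) \to f(0) = 1\), and that's what we observe.

```python
import math

a = [1 / n for n in range(1, 10001)]   # a_n = 1/n -> 0
fa = [math.exp(x) for x in a]          # f(x) = e^x is continuous at 0
print(abs(fa[-1] - 1) < 1e-3)  # → True: f(a_n) is approaching f(0) = 1
```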
The "$n \ge n_0$" part says that the squeeze has to be on eventually, but not necessarily from the outset. It's the tail of the sequence that's important for the convergence, not the head.
Note, however, that a sequence can be bounded but not converge -- for example \(\{a_n=(-1)^n\}\).
The point of adding "eventually" is that it is only the tail of a sequence that matters for convergence: what happens for any finite bunch of terms at the beginning won't have any bearing on the convergence as \(n \to \infty\).
Sequences are like the natural numbers (1, 2, 3, $\ldots$): they have distinct ordered terms traipsing off into the far distance. We're interested in what happens as the terms traipse off. Do they approach a fixed value? Do they oscillate, bouncing back and forth? Do they get larger and larger, or smaller and smaller? Several interesting examples are included, such as the Fibonacci numbers (which came about from a silly rabbit population story problem in Fibonacci's Liber Abaci).
In this section we encounter many definitions, and a few theorems which help us to understand when a sequence converges (its terms approach a fixed value), or diverges (doesn't converge!). This is an issue of fundamental importance as we push on to our major objective: representing a function using an infinite sequence of functions! We start with numbers, of course, because that's a simpler case.
So the analogy we're working from is that \(\varphi\) can be approximated arbitrarily well by a sequence of ratios of Fibonacci numbers \(\{\frac{F_{n+1}}{F_n}\}\) as \(n \to \infty\).
We'll soon be doing the same thing for functions, rather than \(\varphi\)....
Let's take a look at some of the questions with your partner.
Naturally, this sounded a little odd to folks for a long time. It may even sound odd to you, and that's okay!
Zeno was a sophist -- like a lawyer -- who would take any case, and argue any position. He wrote several "paradoxes" that explored infinity, motion, etc., and tried to wrap his head around some very difficult ideas.
In one of his paradoxes, he showed that motion was impossible: thank God it appears that he was wrong!
He argued this way: to get to a wall, you have to first go half way; but to go half way, you have to go half of a half, or a quarter of the way; and before you go a quarter, you have to go an eighth; and so on, ad infinitum. So you can never get started, he argued.
Who can argue with that?:)
He would have found it odd that we might write \[ \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} +\ldots + \frac{1}{2^n} + \ldots = 1 \ (\textrm{Ouch!}) \] and so assert that the arrow would, indeed, get there. So we want to investigate this idea of adding up an infinity of numbers and yet getting a finite sum....
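The partial sums of Zeno's series creep up toward 1 without ever exceeding it, which is the behavior that resolves the paradox; a quick sketch:

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ... approach 1.
partial = 0.0
sums = []
for n in range(1, 31):
    partial += 1 / 2 ** n  # add the next half-of-what-remains
    sums.append(partial)

print(sums[:4])                  # → [0.5, 0.75, 0.875, 0.9375]
print(abs(sums[-1] - 1) < 1e-8)  # → True
```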