Last time: Taylor Series | Next time: More Vector Calculus |
Today:
The coefficients a_k are "designed" so that after k differentiations of the series, evaluating at x=c picks off only the kth derivative evaluated at x=c:
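The displayed formula is not preserved in these notes; as a sketch of the standard identity being described, differentiating the power series k times term by term and setting x = c wipes out every term except the one involving a_k:

\[
f(x) = \sum_{k=0}^{\infty} a_k (x-c)^k
\quad\Longrightarrow\quad
f^{(k)}(c) = k!\,a_k
\quad\Longrightarrow\quad
a_k = \frac{f^{(k)}(c)}{k!}.
\]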
If we can bound all derivatives of a function on an interval about x=c, then the Taylor series represents the function there:
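Again the displayed result is missing; a sketch of the standard statement (Taylor's inequality): if |f^{(n+1)}(t)| \le M for every t in the interval, then the remainder is squeezed to zero,

\[
|R_n(x)| = \left| f(x) - \sum_{k=0}^{n} \frac{f^{(k)}(c)}{k!}(x-c)^k \right|
\le \frac{M\,|x-c|^{n+1}}{(n+1)!} \longrightarrow 0
\quad\text{as } n \to \infty,
\]

so the Taylor series converges to f(x) on that interval.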
The expansion for the binomial series is a classic: it's another of the great achievements of Sir Isaac Newton, one of the founders of calculus:
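The expansion itself did not survive in these notes; the standard form of Newton's binomial series, for any real exponent m and |x| < 1, is

\[
(1+x)^m = \sum_{k=0}^{\infty} \binom{m}{k} x^k
= 1 + mx + \frac{m(m-1)}{2!}x^2 + \frac{m(m-1)(m-2)}{3!}x^3 + \cdots,
\qquad
\binom{m}{k} = \frac{m(m-1)\cdots(m-k+1)}{k!}.
\]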
What about its radius of convergence? Let's look at #84, p. 599.
So are many other concepts from physics, where quantities have a direction and a "strength" -- a rocket in flight, for example. It is moving in a certain direction, and it has a speed, whose value can be represented by the length of a vector.
in which case
Obviously this can be generalized to vectors in four dimensions, five dimensions, even six or seven -- indeed, to vectors in n dimensions.
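As a sketch of what "n dimensions" means here (the notation is assumed, not taken from the notes), a vector is just an ordered n-tuple of components, and the length formula extends in the obvious way:

\[
\mathbf{v} = \langle v_1, v_2, \ldots, v_n \rangle,
\qquad
|\mathbf{v}| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}.
\]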
examples:
example:
Here it is in three-space:
with the obvious changes to the formulas, because you now have three components instead of two:
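The formulas referred to are not preserved here; presumably the standard three-component versions, along these lines:

\[
\mathbf{v} = \langle v_1, v_2, v_3 \rangle,
\qquad
|\mathbf{v}| = \sqrt{v_1^2 + v_2^2 + v_3^2},
\qquad
\mathbf{a} + \mathbf{b} = \langle a_1 + b_1,\; a_2 + b_2,\; a_3 + b_3 \rangle,
\qquad
c\,\mathbf{a} = \langle c a_1,\; c a_2,\; c a_3 \rangle.
\]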
It's clear that we can turn any vector into a unit vector simply by scaling it:
example:
and, in three-space,
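The worked examples are missing from the notes; as a sketch (the particular vectors below are chosen here for illustration), scaling by the reciprocal of the length gives a unit vector, in two-space and in three-space alike:

\[
\mathbf{u} = \frac{\mathbf{v}}{|\mathbf{v}|};
\qquad
\mathbf{v} = \langle 3, 4 \rangle \;\Rightarrow\; |\mathbf{v}| = 5,\;
\mathbf{u} = \left\langle \tfrac{3}{5}, \tfrac{4}{5} \right\rangle;
\qquad
\mathbf{v} = \langle 1, 2, 2 \rangle \;\Rightarrow\; |\mathbf{v}| = 3,\;
\mathbf{u} = \left\langle \tfrac{1}{3}, \tfrac{2}{3}, \tfrac{2}{3} \right\rangle.
\]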
The triangle inequality basically says that the diagonal of a parallelogram is no longer than the sum of two adjacent sides:
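In symbols (the displayed inequality is missing from the notes):

\[
|\mathbf{a} + \mathbf{b}| \le |\mathbf{a}| + |\mathbf{b}|.
\]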
In the case of Figure 17 (for vectors at right angles to each other)
and the only time you'll have equality is when either a or b is zero.
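A sketch of that right-angle case: the Pythagorean theorem gives the diagonal exactly, and comparing squares shows it never exceeds the sum of the lengths,

\[
|\mathbf{a} + \mathbf{b}| = \sqrt{|\mathbf{a}|^2 + |\mathbf{b}|^2}
\le |\mathbf{a}| + |\mathbf{b}|,
\quad\text{since}\quad
(|\mathbf{a}| + |\mathbf{b}|)^2 = |\mathbf{a}|^2 + 2\,|\mathbf{a}|\,|\mathbf{b}| + |\mathbf{b}|^2,
\]

with equality only when |a||b| = 0, i.e. when a or b is the zero vector.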