Chapter 9
Probability and Integration
Chapter Summary
Chapter Review
Our investigation of reliability theory opened up a new area of application of the definite integral. This study introduced us to the exponential distribution of data and to more general continuous probability distributions. Each of these distributions may be characterized by the probability distribution function `F`, where `F(t)` is the probability that a random data value from a set with this distribution is less than `t`. Equivalently, the distribution may be characterized by the probability density function `f`, the derivative of `F`.
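As a concrete illustration (not part of the text), here is a minimal Python sketch of the relationship between `F` and `f` for an exponential distribution; the rate constant `0.5` is an arbitrary, assumed choice.

```python
import numpy as np

# Exponential distribution with an illustrative (assumed) rate of 0.5:
# distribution function F(t) = 1 - e^(-0.5 t), density f(t) = 0.5 e^(-0.5 t).
lam = 0.5

def F(t):
    """Probability that a random data value is less than t."""
    return 1.0 - np.exp(-lam * t)

def f(t):
    """Probability density: the derivative of F."""
    return lam * np.exp(-lam * t)

# Numerical check that f is (approximately) the derivative of F at t = 3.
t, h = 3.0, 1e-6
print((F(t + h) - F(t - h)) / (2 * h))  # central difference, about 0.1116
print(f(t))                             # exact density,      about 0.1116
```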
Our attempt to calculate the expected value, or mean, of the exponential distribution led us to investigate integrals from `0` to `oo`. These "improper" integrals are defined as limiting values of definite integrals `int_0^T g(t) dt` as `T` approaches infinity. We used this opportunity to introduce formal notation for this and similar limiting processes:
`int_0^oo g(t) dt = lim_(T rarr oo) int_0^T g(t) dt`.
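This limiting process can also be watched numerically. The sketch below (an illustration, not the text's method) approximates `int_0^T t * 0.5 e^(-0.5 t) dt` by trapezoid sums for growing `T`; the values approach `2`, the mean of an exponential distribution with the assumed rate `0.5`.

```python
import numpy as np

def definite_integral(T, lam=0.5, n=200_000):
    """Trapezoid approximation of int_0^T t * lam * e^(-lam t) dt."""
    t = np.linspace(0.0, T, n)
    g = t * lam * np.exp(-lam * t)
    dt = t[1] - t[0]
    return dt * (g.sum() - 0.5 * (g[0] + g[-1]))

# As T grows, the definite integrals approach the improper integral's value, 2.
for T in (5, 10, 20, 40, 80):
    print(T, definite_integral(T))
```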
The most important continuous probability distributions are the normal distributions. In order to reduce the study of this class of distributions to the study of a single distribution, we considered standardization of a finite data set: Each data value `v` is replaced by the value
`(v - text(mean))/(text(standard deviation))`.
This transforms a general normally distributed data set into one with the standard normal distribution.
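A short Python sketch (with made-up data, purely for illustration) shows the effect of this replacement: the standardized values have mean approximately `0` and standard deviation approximately `1`.

```python
import numpy as np

# Made-up data set, for illustration only.
data = np.array([61.0, 67.0, 70.0, 72.0, 75.0, 81.0])

# Replace each value v by (v - mean) / (standard deviation).
standardized = (data - data.mean()) / data.std()

print(standardized.mean())  # approximately 0
print(standardized.std())   # 1 (up to rounding)
```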
The probability density function for the standard normal distribution is `f(t) = c e^(-t^2//2)` for `-oo < t < oo`, where `c = 1//sqrt(2 pi) ~~ 0.3989`. Our attempt to describe the corresponding standard normal distribution function led us to a new function, the error function. The question of how a computer might evaluate this function sets the stage for our study, in Chapter 10, of the approximation of functions by polynomials.
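As a final illustration (again not part of the text), the sketch below uses Python's built-in `math.erf` as a stand-in for the approximations taken up in Chapter 10, together with the standard identity expressing the normal distribution function in terms of the error function.

```python
import math

# The constant c in the density c * e^(-t^2/2); it equals 1/sqrt(2*pi).
c = 1.0 / math.sqrt(2.0 * math.pi)
print(c)  # approximately 0.3989

def Phi(t):
    """Standard normal distribution function, via the standard identity
    Phi(t) = (1 + erf(t / sqrt(2))) / 2 (this identity is not derived in the text)."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

print(Phi(0.0))   # 0.5: half of a standard normal data set lies below 0
print(Phi(1.0))   # approximately 0.8413
print(Phi(-1.0))  # approximately 0.1587
```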