I've just received the monthly rain measurements for each of the cities, too, which is exciting. Haven't even looked at them yet.
I'll be assigning groups and towns Friday, after we discuss the fit to the Keeling data in class.
We finished off confidence intervals for parameters a bit too quickly (they're essentially equivalent to p-values, but often more informative).
The standard errors of the parameters pop out of the inverse matrix we compute, scaled by the mean squared error (the SSE divided by the residual degrees of freedom): the parameter variances sit on the diagonal of that product. Once we have those, we have everything we need for confidence intervals.
We're often interested to know whether we can exclude a certain value from a confidence interval -- e.g., can we conclude that the slope parameter $b$ in a linear regression $y(x)=a+bx$ is not 0? If so, we conclude that there is a non-zero slope, and the model suggests that $x$ drives values up or down, depending on the sign of $b$.
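The whole pipeline can be sketched numerically. This is a minimal illustration with made-up data (not the class data): build the design matrix, solve the normal equations, pull the standard errors off the diagonal of MSE times the inverse matrix, and check whether the 95% interval for the slope excludes 0.

```python
import numpy as np

# Hypothetical data: y = 2 + 3x plus noise (illustrative only)
rng = np.random.default_rng(0)
n = 20
x = np.linspace(0, 10, n)
y = 2 + 3 * x + rng.normal(0, 1.0, n)

# Design matrix for the model y = a + b x
X = np.column_stack([np.ones(n), x])

# Normal equations: beta = (X^T X)^{-1} X^T y
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y

# Mean squared error: SSE divided by residual degrees of freedom (n - 2)
residuals = y - X @ beta
mse = residuals @ residuals / (n - 2)

# Standard errors: square roots of the diagonal of MSE * (X^T X)^{-1}
se = np.sqrt(np.diag(mse * XtX_inv))

# 95% CI for the slope b; t critical value for 18 df is about 2.101
t_crit = 2.101
lo, hi = beta[1] - t_crit * se[1], beta[1] + t_crit * se[1]
print(f"b = {beta[1]:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
# If the interval excludes 0, we conclude the slope is nonzero.
```

Here the interval sits well away from 0, so we would conclude (unsurprisingly, given how the data were made) that $x$ drives $y$ upward.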
All of these pop out of the model we obtain via linear algebra.
Patrick proposed an exponential model for the Keeling data, but it is non-linear in the parameters: \[ y(t)=ae^{bt} \]
Yet we can still use linear regression to fit it: how so? Start by using Mathematica's LinearModelFit to find linear models that fit Stewart's Keeling data.
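The standard trick, sketched here with made-up exponential data rather than the Keeling data: taking logarithms gives $\ln y = \ln a + bt$, which is linear in the parameters $\ln a$ and $b$, so an ordinary linear fit to $(t, \ln y)$ recovers the exponential model.

```python
import numpy as np

# Hypothetical data following y = a * exp(b t) exactly (illustrative values)
a_true, b_true = 315.0, 0.004
t = np.arange(0, 50)
y = a_true * np.exp(b_true * t)

# Taking logs linearizes the model: ln y = ln a + b t.
# Fit a degree-1 polynomial to (t, ln y); polyfit returns [slope, intercept].
slope, intercept = np.polyfit(t, np.log(y), 1)
b_fit = slope
a_fit = np.exp(intercept)
print(a_fit, b_fit)
```

With noiseless data the fit recovers $a$ and $b$ essentially exactly; with real data, note that least squares on the log scale weights relative (percentage) errors rather than absolute ones.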
How do we interpret the power?
Does the exponential model suffer the same problem? What is the impact of a shift in time scale?
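One way to see the impact of a time shift on the exponential model: substituting $t = t' + t_0$ gives \[ y = ae^{b(t'+t_0)} = \left(ae^{bt_0}\right)e^{bt'}, \] so shifting the time origin only rescales $a$ while the growth rate $b$ (the parameter we usually care about) is unchanged. By contrast, if the power model under discussion is of the form $y(t)=at^b$ (an assumption here), a shift $t \to t + t_0$ does not preserve that form, so the fitted power depends on where we place $t=0$.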
Alexander also illustrates that one cannot simply invert the regression equation $y=a+bx$ to get the regression equation $x=\frac{y-a}{b}$. So it really matters in linear regression which variable is considered "independent" and which is considered "dependent".
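A quick numerical check of this point, with hypothetical data (not from the class): the slope from regressing $y$ on $x$ and the slope from regressing $x$ on $y$ multiply to $r^2$, not to 1, so the "inverted" line only agrees with $x=\frac{y-a}{b}$ when the correlation is perfect.

```python
import numpy as np

# Hypothetical noisy linear data (illustrative only)
rng = np.random.default_rng(1)
x = rng.normal(0, 1, 200)
y = 2 + 3 * x + rng.normal(0, 2, 200)

# Slope of the y-on-x regression vs. slope of the x-on-y regression
b_yx = np.polyfit(x, y, 1)[0]
b_xy = np.polyfit(y, x, 1)[0]

r = np.corrcoef(x, y)[0, 1]
print(b_yx * b_xy, r**2)  # the product of the slopes equals r^2
```

Since $r^2 < 1$ whenever there is scatter, $b_{xy} \neq 1/b_{yx}$, which is exactly why it matters which variable is treated as independent.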