If your group has any questions, feel free to stop by and talk to me!
Any questions on that?
The regression equations provided us "point estimates" for $a$ and $b$ in the model, but gave us no sense of how much confidence to place in those values. It would be nice to quantify that confidence (in particular, we would like to know that the slope parameter $b$ is actually significantly different from 0).
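As a rough sketch of what such a confidence measure looks like, here is the standard error of the slope computed by hand (the data here are made up for illustration; the formulas are the usual least-squares ones, with $n-2$ degrees of freedom for the residual variance):

```python
import numpy as np

# Hypothetical data, not from the notes.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

n = len(x)
Sxx = np.sum((x - x.mean())**2)
b = np.sum((x - x.mean()) * (y - y.mean())) / Sxx   # slope
a = y.mean() - b * x.mean()                          # intercept

# Residual variance (n - 2 degrees of freedom) and the slope's standard error:
residuals = y - (a + b * x)
s2 = np.sum(residuals**2) / (n - 2)
se_b = np.sqrt(s2 / Sxx)

# A 95% interval for b is roughly b ± t * se_b, where t comes from the
# t-distribution with n - 2 degrees of freedom.
print(b, se_b)
```

If the resulting interval excludes 0, we have evidence that the slope is significantly different from 0.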
This first contact introduces us to vector and matrix notation, and a few important operations (e.g. transpose and inverse). But the details will be left to a linear algebra course!
Matrices will be important in other models we consider, so let's be glad we meet them early.
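To make the matrix notation concrete, here is a minimal sketch of least squares in matrix form, assuming numpy and made-up data: stack a column of ones (for $a$) next to the $x$ values (for $b$) into a design matrix $X$, then solve the normal equations $(X^T X)\beta = X^T y$.

```python
import numpy as np

# Hypothetical data, not from the notes.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

# Design matrix: a column of ones (for a) next to x (for b).
X = np.column_stack([np.ones_like(x), x])

# Normal equations: (X^T X) beta = X^T y.  Note the transpose (X.T)
# and the matrix inverse hiding inside np.linalg.solve.
beta = np.linalg.solve(X.T @ X, X.T @ y)
a, b = beta
print(a, b)
```

The transpose and inverse are exactly the two matrix operations mentioned above; the linear-algebra details of why this solves the least-squares problem are left to a linear algebra course.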
How would you adjust things to get the quadratic fit?
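One answer, sketched with made-up data: the only change needed is an extra $x^2$ column in the design matrix, so the model becomes $y = a + bx + cx^2$ while the normal equations stay exactly the same.

```python
import numpy as np

# Hypothetical, roughly quadratic data (made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.2, 5.1, 10.2, 16.9])

# For y = a + b x + c x^2, just add an x^2 column to the design matrix;
# the normal equations (X^T X) beta = X^T y work unchanged.
X = np.column_stack([np.ones_like(x), x, x**2])
a, b, c = np.linalg.solve(X.T @ X, X.T @ y)
print(a, b, c)
```

The fit is still "linear" in the parameters $a$, $b$, $c$, which is why the same machinery applies.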
Alexander also illustrates that one cannot simply invert the regression equation $y=a+bx$ to get the regression equation $x=\frac{y-a}{b}$. So it really matters in linear regression which variable is considered "independent" and which is considered "dependent".
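A quick numerical check of this point, with made-up noisy data: the slope from regressing $x$ on $y$ is not the reciprocal of the slope from regressing $y$ on $x$ (in fact their product is $r^2$, which is less than 1 unless the correlation is perfect).

```python
import numpy as np

# Hypothetical noisy data, not from the notes.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.5, 5.5, 8.5, 9.5])

def slope(u, v):
    """Least-squares slope of v regressed on u."""
    return np.sum((u - u.mean()) * (v - v.mean())) / np.sum((u - u.mean())**2)

b_yx = slope(x, y)   # slope from regressing y on x
b_xy = slope(y, x)   # slope from regressing x on y

# If inverting y = a + b x worked, b_xy would equal 1 / b_yx.  It does not:
print(b_yx, 1 / b_yx, b_xy)
```

With this data the two candidate slopes differ noticeably, which is the point: minimizing vertical residuals and minimizing horizontal residuals are different problems.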