FOWKFOWKIE
New member · Joined Nov 27, 2021 · Messages: 1
Consider the simple linear regression model \(\displaystyle y=\beta_{0}+\beta_{1} x+\varepsilon \) with \(\displaystyle \varepsilon \sim \mathrm{NID}\left(0, \sigma^{2}\right)\). Then give a concise but clear proof of each of the following.
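For reference (standard notation, added for clarity, with $S_{xx}=\sum_{i}(x_{i}-\bar{x})^{2}$ and $S_{xy}=\sum_{i}(x_{i}-\bar{x})(y_{i}-\bar{y})$), the least squares estimators referred to below are

\[
\hat{\beta}_{1}=\frac{S_{xy}}{S_{xx}}, \qquad \hat{\beta}_{0}=\bar{y}-\hat{\beta}_{1}\bar{x}.
\]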
(1) Show that the least squares estimators $\hat{\beta}_{0}$ and $\hat{\beta}_{1}$ are uncorrelated.
(2) Show that the covariance between $\bar{y}$ and $\hat{\beta}_{1}$ is zero.
(3) Show that $\hat{y}_{0}$ is an unbiased predictor of $y_{0}$.
(4) Show that the prediction variance is $PV\left(\hat{y}_{0}\right)=\sigma^{2}\left(1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{S_{xx}}\right)$.
(5) Show that the maximum value of $r^{2}$ is less than 1 if the data contain repeated (different) observations on $y$ at the same value of $x$.
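As a quick numerical sanity check (not a proof), claim (2) can be illustrated by simulation: across many datasets generated from the model with a fixed design, the sample correlation between $\bar{y}$ and $\hat{\beta}_{1}$ should be near zero. All parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo illustration of claim (2): Cov(ybar, beta1_hat) = 0
# under y = b0 + b1*x + eps with NID errors and a fixed x design.
rng = np.random.default_rng(0)
n, reps = 25, 20000
b0, b1, sigma = 2.0, 1.5, 1.0          # arbitrary true parameters
x = np.linspace(0.0, 10.0, n)          # fixed design, reused in every replicate
Sxx = np.sum((x - x.mean()) ** 2)

eps = rng.normal(0.0, sigma, size=(reps, n))
y = b0 + b1 * x + eps                  # reps independent simulated datasets
ybar = y.mean(axis=1)                  # sample mean of y, one per dataset
beta1_hat = y @ (x - x.mean()) / Sxx   # least squares slope, one per dataset

corr = np.corrcoef(ybar, beta1_hat)[0, 1]
print(f"sample corr(ybar, beta1_hat) = {corr:.4f}")  # should be close to 0
```

The slope estimator is computed directly as $\hat{\beta}_{1}=\sum_i (x_i-\bar{x})y_i/S_{xx}$, which is the same linear-in-$y$ form one would use in the proof of (2).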