Need help with this demonstration

ewilliams

New member
Joined
Oct 26, 2019
Messages
4
Suppose we have a regression model that satisfies the conditions of normality, homoscedasticity, and independence of the residuals. I would like to prove that the variance of the estimators is:

[MATH]Var(\hat{\beta}_j) = \frac{\sigma^2}{nS^2_j(1-R^2_j)}; \quad j=1,...,t[/MATH]
where [MATH]R_j^2[/MATH] is the coefficient of determination from regressing [MATH]x_j[/MATH] on the other predictors, and [MATH]S^2_j = \frac{1}{n}\sum_{i=1}^n(x_{ij}-\bar{x}_j)^2[/MATH]
How should I start this?
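One possible starting point, a sketch rather than a full proof: write the OLS estimator as [MATH]\hat{\vec{\beta}}=(X^tX)^{-1}X^t\vec{y}[/MATH] and compute its covariance matrix under the stated assumptions:

```latex
% With \hat{\beta} = (X^t X)^{-1} X^t y and E[\epsilon \epsilon^t] = \sigma^2 I:
\hat{\beta} = \beta + (X^t X)^{-1} X^t \epsilon
\quad\Longrightarrow\quad
Var(\hat{\beta})
  = (X^t X)^{-1} X^t \, E[\epsilon \epsilon^t] \, X (X^t X)^{-1}
  = \sigma^2 (X^t X)^{-1}.
```

The claimed formula is then the [MATH]j[/MATH]-th diagonal entry of [MATH]\sigma^2(X^tX)^{-1}[/MATH], which can be reduced to [MATH]\sigma^2/(nS^2_j(1-R^2_j))[/MATH] using the partitioned-matrix inverse (or the Frisch–Waugh–Lovell theorem).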
 
You said "...regression model that fits the conditions of normal, homoscedasticity ...."

What is (or are) the condition(s) of homoscedasticity?
 
A homoscedastic regression model is one whose errors all have the same variance. As the Gauss–Markov theorem states, this is one of the conditions under which the OLS estimators are the best linear unbiased estimators.
 
Yes, but how would you express that criterion mathematically?
 
I am not finding the point of your joke
It's a big and not commonly used word. (At least it is unknown to me anyway.) I was pretending it was some kind of weird sound and I likened it to the sound you make when you sneeze.

It really wasn't a good joke. No worries.

-Dan
 
Yes, but how would you express that criterion mathematically?

Let's suppose we have a regression model [MATH]\vec{y}=X\vec{\beta}+\vec{\epsilon}[/MATH], where [MATH]\vec{\epsilon}[/MATH] is the error vector.
Assuming [MATH]E[\vec{\epsilon}]=\vec{0}[/MATH], the model is homoscedastic if it satisfies:
[MATH]Var[\vec{\epsilon}]=E[\vec{\epsilon}\vec{\epsilon}^{\,t}]=\sigma^2I[/MATH] (the components of [MATH]\vec{\epsilon}[/MATH] are uncorrelated and share the common variance [MATH]\sigma^2[/MATH])
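A quick numerical sanity check of the claimed variance formula (a sketch with simulated data; the sample size, mixing matrix, and [MATH]\sigma^2[/MATH] below are made up for illustration):

```python
import numpy as np

# Check Var(beta_hat_j) = sigma^2 / (n * S_j^2 * (1 - R_j^2)) against the
# exact OLS covariance sigma^2 (X^t X)^{-1} on a simulated design matrix.
rng = np.random.default_rng(0)
n, t = 200, 3
Z = rng.normal(size=(n, t)) @ np.array([[1.0, 0.5, 0.2],
                                        [0.0, 1.0, 0.4],
                                        [0.0, 0.0, 1.0]])   # correlated predictors
X = np.column_stack([np.ones(n), Z])                        # intercept column first
sigma2 = 4.0

# Exact covariance of the OLS estimator under homoscedasticity
cov = sigma2 * np.linalg.inv(X.T @ X)

exact_vals, formula_vals = [], []
for j in range(t):
    xj = Z[:, j]
    S2j = np.mean((xj - xj.mean()) ** 2)        # (1/n) sum (x_ij - xbar_j)^2
    # R_j^2: coefficient of determination from regressing x_j on the others
    others = np.column_stack([np.ones(n), np.delete(Z, j, axis=1)])
    coef, *_ = np.linalg.lstsq(others, xj, rcond=None)
    resid = xj - others @ coef
    R2j = 1.0 - resid @ resid / np.sum((xj - xj.mean()) ** 2)
    exact_vals.append(cov[j + 1, j + 1])        # skip the intercept entry
    formula_vals.append(sigma2 / (n * S2j * (1.0 - R2j)))

print(np.column_stack([exact_vals, formula_vals]))  # the two columns should agree
```

The agreement is exact (up to floating point) because the formula is just the diagonal of [MATH]\sigma^2(X^tX)^{-1}[/MATH] rewritten via the Frisch–Waugh–Lovell theorem.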
 