
I'm working through a proof of the least squares estimators, i.e. proving that the estimators of the regression coefficients are

$$\hat{\beta}_0=\bar{y}-\hat{\beta}_1 \bar x$$ and $$\hat{\beta}_1=\frac{S_{xy}}{S_x^2}$$

So far in my proof, I've got to this line: $$\sum_{i=1}^n y_i - n\beta_0 - \beta_1\sum_{i=1}^n x_i = 0 \tag{1}$$
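For context (this is the standard step, spelled out only because the question hinges on it), $(1)$ is the first-order condition obtained by setting the partial derivative of the sum of squared residuals with respect to $\beta_0$ equal to zero:

$$\frac{\partial}{\partial \beta_0}\sum_{i=1}^n\bigl(y_i-\beta_0-\beta_1 x_i\bigr)^2=-2\sum_{i=1}^n\bigl(y_i-\beta_0-\beta_1 x_i\bigr)=0.$$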

I'm told that $(1)$ leads to:

$$\hat{\beta}_0=\frac{1}{n}\bigg(\sum_{i=1}^n y_i -{\beta_1}\sum_{i=1}^n x_i\bigg) \tag{2}$$

$$\hat \beta_0 = \bar y - \hat \beta_1 \bar x \tag{3}$$

My problem is that I don't understand why $\beta_0$ became $\hat \beta_0$ in $(2)$, and why $\beta_1$ became $\hat \beta_1$ in $(3)$. To me, $(1)$ leads to this:

$${\beta_0}=\frac{1}{n}\bigg(\sum_{i=1}^n y_i -{\beta_1}\sum_{i=1}^nx_i\bigg)$$ $$\beta_0 = \bar y -\beta_1 \bar x$$

But of course, I then wouldn't know how this leads to the desired result. So why do the regression coefficients suddenly become their estimator counterparts in $(2)$ and $(3)$?

  • Are you trying to prove the linear regression equations? Commented May 5, 2018 at 17:04
  • @CogitoErgoCogitoSum Trying to prove the results shown at the start of my question. – Data, Commented May 5, 2018 at 17:21
  • Your first statement $\hat{\beta}_0=\bar{y}-\hat{\beta}_1$ is not true, as you can see at $(2)$. Please post your question more carefully. Commented May 5, 2018 at 18:40

1 Answer


The key point is that $\beta_0$ and $\beta_1$ are the unknown true parameters, and nothing guarantees that they satisfy any particular criterion. Equation $(1)$ says that $\beta_0$ happens to have the value that minimizes the sum of squared residuals (the least squares criterion), but in principle the true $\beta_0$ could have any value. The problem with the two hatless equations you propose is that they assert the true $\beta_i$ have the criterion-meeting values, which we don't actually know to be true. The hatted quantities are the estimators, which we define as the values the $\beta_i$ would have if they did meet the desired criterion.
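To make that concrete (using $Q$ and $b_0, b_1$ as notation I'm introducing here, not notation from your post), the criterion is

$$Q(b_0,b_1)=\sum_{i=1}^n\bigl(y_i-b_0-b_1 x_i\bigr)^2, \qquad (\hat\beta_0,\hat\beta_1)=\operatorname*{arg\,min}_{(b_0,\,b_1)}Q(b_0,b_1),$$

and setting $\partial Q/\partial b_0=0$ gives exactly your $(1)$. So whatever solves $(1)$ carries a hat by definition: it is a statement about the minimizer, not about the true coefficients.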

Of course, $(2)$ doesn't yet give us an estimator, because it still contains $\beta_1$, whose value is unknown. But once we similarly settle on $\beta_1=\widehat{\beta}_1$ (make sure, before going further, that you understand by what criterion we take the value $S_x^{-2}S_{xy}$ seriously), we can turn the not-quite-helpful $(2)$ into the helpful equation $(3)$.
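Here is a sketch of that step, assuming the usual definitions $S_{xy}=\sum_{i=1}^n(x_i-\bar x)(y_i-\bar y)$ and $S_x^2=\sum_{i=1}^n(x_i-\bar x)^2$ (your post doesn't define them, so this is my reading). The other first-order condition, $\partial Q/\partial b_1=0$, is

$$\sum_{i=1}^n x_i\bigl(y_i-b_0-b_1 x_i\bigr)=0.$$

Substituting $b_0=\bar y-b_1\bar x$ from the first condition and solving for $b_1$ gives

$$\hat\beta_1=\frac{\sum_{i=1}^n x_i y_i-n\bar x\bar y}{\sum_{i=1}^n x_i^2-n\bar x^2}=\frac{\sum_{i=1}^n(x_i-\bar x)(y_i-\bar y)}{\sum_{i=1}^n(x_i-\bar x)^2}=\frac{S_{xy}}{S_x^2},$$

which is the criterion under which the hat on $\hat\beta_1$ is justified.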


