
Consider the simple linear regression model $$y=\beta_0+\beta_1X+e.$$ We observe a sample of $n$ pairs of observations $(x_i,y_i)$, $i=1,2,\cdots,n$, so we can write $$y_i=\beta_0+\beta_1x_i+e_i,$$ where the $e_i \sim N(0,\sigma^2)$ are i.i.d. Using the least squares method, we obtain estimators of $\beta_0$ and $\beta_1$, which we denote by $b_0$ and $b_1$.

i.e. $$ b_1=\frac{S_{xy}}{S_{xx}} = \frac{\sum_{i=1}^n(x_i-\bar{x})(y_i-\bar{y})}{\sum_{i=1}^n(x_i-\bar{x})^2}, \qquad b_0=\bar{y}-b_1\bar{x},$$ with $\bar{x}=\frac{1}{n}\sum_{i=1}^nx_i$ and $\bar{y}=\frac{1}{n}\sum_{i=1}^ny_i$.
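It is worth recording a standard rewriting here (not in the original post, but used in most textbook treatments): since $\sum_{i=1}^n(x_i-\bar{x})\bar{y}=0$, the slope estimator is a linear function of the responses,
$$b_1=\sum_{i=1}^n k_i y_i,\qquad k_i=\frac{x_i-\bar{x}}{S_{xx}},\qquad \sum_{i=1}^n k_i=0,\quad \sum_{i=1}^n k_i x_i=1.$$
Substituting $y_i=\beta_0+\beta_1 x_i+e_i$ then gives $b_1=\beta_1+\sum_{i=1}^n k_i e_i$.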

The residual sum of squares of the model is defined as $$RSS=\sum_{i=1}^n \hat{e}_i^2=\sum_{i=1}^n\bigl(y_i-(b_0+b_1x_i)\bigr)^2.$$
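As a concrete illustration (a minimal sketch, not from the original post; the parameter values and seed are arbitrary assumptions), the estimators and the RSS can be computed directly from these formulas:

```python
import numpy as np

# Minimal sketch: compute b0, b1 and the RSS from the closed-form
# least-squares formulas, for data simulated from the stated model.
rng = np.random.default_rng(0)
n, beta0, beta1, sigma = 50, 1.0, 2.0, 0.5   # assumed values for illustration
x = rng.uniform(0.0, 10.0, size=n)
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=n)

x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar
rss = np.sum((y - (b0 + b1 * x)) ** 2)
print(b0, b1, rss)
```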

In this case, why are the residual sum of squares and $b_1$ (or $b_0$) independently distributed?

I ran into this question while reading about confidence interval estimation for the simple linear regression model when $\sigma^2$ is unknown (so the $t$-statistic is used).
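For context, this independence is exactly what makes the $t$-statistic work. Using the standard facts (not derived in the post) that $b_1\sim N(\beta_1,\sigma^2/S_{xx})$ and $RSS/\sigma^2\sim\chi^2_{n-2}$ independently, the ratio
$$\frac{b_1-\beta_1}{\sqrt{\dfrac{RSS}{(n-2)\,S_{xx}}}}=\frac{(b_1-\beta_1)\big/\sqrt{\sigma^2/S_{xx}}}{\sqrt{\dfrac{RSS/\sigma^2}{n-2}}}\sim t_{n-2}$$
is the pivot used to build the confidence interval; without the independence, the ratio would not have a $t$ distribution.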

  • Welcome to math.SE: since you are new, I wanted to let you know a few things about the site. In order to get the best possible answers, it is helpful if you say in what context you encountered the problem, and what your thoughts on it are; this will prevent people from telling you things you already know, and help them give their answers at the right level. Also, many find the use of imperative ("Prove", "Solve", etc.) to be rude when asking for help; please consider rewriting your post. Commented Mar 7, 2020 at 6:04

1 Answer

In fact they are independent; this is the standard result that justifies the $t$-statistic. The point is that functional dependence is not the same as statistical dependence: the RSS is certainly computed from $b_0$ and $b_1$ (we choose them to minimize it), but as random variables the RSS and the estimators are independent. Writing $b_1=\beta_1+\sum_i k_i e_i$ with $k_i=(x_i-\bar{x})/S_{xx}$, each residual $\hat{e}_i$ is also a linear function of the errors, and a direct computation gives $\operatorname{Cov}(b_1,\hat{e}_i)=0$ for every $i$. Because everything is jointly normal, zero covariance implies independence, so $b_1$ is independent of the whole residual vector $(\hat{e}_1,\dots,\hat{e}_n)$ and hence of $RSS=\sum_i\hat{e}_i^2$. The same argument applied to $b_0=\bar{y}-b_1\bar{x}$ shows that $b_0$ is independent of the RSS as well.
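To fill in the covariance computation (a standard calculation, spelled out here for completeness): from $\hat{y}_i=\bar{y}+b_1(x_i-\bar{x})$ one gets $\hat{e}_i=e_i-\bar{e}-(b_1-\beta_1)(x_i-\bar{x})$, and therefore
$$\operatorname{Cov}(b_1,\hat{e}_i)=\operatorname{Cov}\Bigl(\sum_j k_j e_j,\;e_i\Bigr)-\operatorname{Cov}\Bigl(\sum_j k_j e_j,\;\bar{e}\Bigr)-(x_i-\bar{x})\operatorname{Var}(b_1)$$
$$=\sigma^2 k_i-\frac{\sigma^2}{n}\sum_j k_j-(x_i-\bar{x})\frac{\sigma^2}{S_{xx}}=\sigma^2 k_i-0-\sigma^2 k_i=0,$$
using $\sum_j k_j=0$ and $\operatorname{Var}(b_1)=\sigma^2\sum_j k_j^2=\sigma^2/S_{xx}$.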

  • Indeed, many books state that the RSS (or SSE) is independent of both $b_0$ and $b_1$.
    – ashpool
    Commented Nov 30, 2023 at 13:16
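As an empirical sanity check (a hedged sketch, not part of the original thread; the design, parameters, and seed are arbitrary assumptions), one can simulate many samples from the model and verify that $b_1$ and the RSS are uncorrelated across replications, consistent with independence:

```python
import numpy as np

# Monte Carlo sketch: draw many samples from y = beta0 + beta1*x + e with a
# fixed design x, recompute (b1, RSS) each time, and check their correlation.
rng = np.random.default_rng(1)
n, beta0, beta1, sigma, reps = 30, 1.0, 2.0, 0.5, 20_000
x = rng.uniform(0.0, 10.0, size=n)       # fixed design across replications
x_c = x - x.mean()
sxx = np.sum(x_c ** 2)

b1s = np.empty(reps)
rsss = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=n)
    b1 = np.sum(x_c * (y - y.mean())) / sxx
    b0 = y.mean() - b1 * x.mean()
    resid = y - (b0 + b1 * x)
    b1s[r] = b1
    rsss[r] = np.sum(resid ** 2)

# Under the result above this sample correlation should be close to zero.
print(np.corrcoef(b1s, rsss)[0, 1])
```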
