
What is the standard error of the slope coefficient in a linear regression without an intercept, fitted by minimizing $\chi^2$?

I've determined $b$ and $\sigma_b$ as follows:

We want to fit a set of data points $(x_i, y_i)$, each with uncertainty $\sigma_i$, to the model $y = b x$. To find $b$, set $\frac{d \chi^2}{d b} = 0$:

$$ 0 = \frac{d \chi^2}{d b} = \frac{d}{d b} \sum _i \frac{(y_i - b x_i)^2}{\sigma_i^2} = -\sum_i 2 x_i \frac{y_i - b x_i}{\sigma_i^2} $$

$$ b = \frac{\sum _i \frac{x_i y_i}{\sigma_i^2}}{\sum_i \frac{x_i^2}{\sigma_i^2}} := \frac{S_{xy}}{S_{xx}} $$
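As a quick numerical sketch of this slope formula (made-up data, NumPy assumed), $b$ can be computed directly from the two weighted sums:

```python
import numpy as np

# Hypothetical sample data (x_i, y_i) with per-point uncertainties sigma_i
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
sigma = np.array([0.1, 0.1, 0.2, 0.2, 0.3])

# Weighted sums from the derivation above
S_xy = np.sum(x * y / sigma**2)
S_xx = np.sum(x**2 / sigma**2)

# Chi^2-minimizing slope for the through-the-origin model y = b x
b = S_xy / S_xx
print(b)
```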

To find $\sigma_b$ we must perform error propagation and set $\sigma_b^2 = \sum_i \sigma_i^2 \left( \frac{\partial b}{\partial x_i} \right)^2$

Using the quotient rule,

$$ \frac{\partial b}{\partial x_i} = \frac{y_i S_{xx} - 2 x_i S_{xy}}{S_{xx}^2 \sigma_i^2} $$

$$ \sigma_i^2 \left(\frac{\partial b}{\partial x_i} \right)^2 = \frac{y_i^2 S_{xx}^2 - 4 x_i y_i S_{xx} S_{xy} + 4 x_i^2 S_{xy}^2}{S_{xx}^4 \sigma_i^2} $$

$$ \sum _i \sigma_i^2 \left(\frac{\partial b}{\partial x_i} \right)^2 = \frac{S_{yy}}{S_{xx}^2} - 4 \frac{S_{xy}^2}{S_{xx}^3} + 4 \frac{S_{xy}^2}{S_{xx}^3} = \frac{S_{yy}}{S_{xx}^2} $$
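As a numerical check of the cancellation above (made-up data, NumPy assumed), one can evaluate the left-hand side directly from the quotient-rule derivative and compare it with $S_{yy}/S_{xx}^2$:

```python
import numpy as np

# Hypothetical data to check the algebra above numerically
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
sigma = np.array([0.1, 0.1, 0.2, 0.2, 0.3])

S_xy = np.sum(x * y / sigma**2)
S_xx = np.sum(x**2 / sigma**2)
S_yy = np.sum(y**2 / sigma**2)

# Left-hand side: sum_i sigma_i^2 (db/dx_i)^2, with db/dx_i
# taken from the quotient-rule expression above
db_dx = (y * S_xx - 2 * x * S_xy) / (S_xx**2 * sigma**2)
lhs = np.sum(sigma**2 * db_dx**2)

print(lhs, S_yy / S_xx**2)  # the cross terms cancel, so these agree
```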

From searching online, I'm fairly confident that my formula $b = \frac{S_{xy}}{S_{xx}}$ is correct.

Is $\sigma_b^2 = \frac{S_{yy}}{S_{xx}^2}$ correct?

Thank you!

edit: The reason I ask is that I used this formula on my data, and it fails the sanity check of being similar to the uncertainty of $b$ derived differently, via a simple $\chi^2$-weighted average and standard deviation applied to the formula $b = y/x$.
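For what it's worth, here is a small NumPy sketch (made-up data) comparing the two estimators. If each ratio $r_i = y_i/x_i$ is weighted by its own propagated inverse variance $w_i = x_i^2/\sigma_i^2$ (rather than by $1/\sigma_i^2$), the weighted mean of the ratios reproduces the regression slope $S_{xy}/S_{xx}$ exactly, so a mismatch in the sanity check may come from the choice of weights:

```python
import numpy as np

# Hypothetical data (x_i, y_i, sigma_i)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
sigma = np.array([0.1, 0.1, 0.2, 0.2, 0.3])

# Regression estimate: b = S_xy / S_xx
S_xy = np.sum(x * y / sigma**2)
S_xx = np.sum(x**2 / sigma**2)
b_reg = S_xy / S_xx

# Ratio estimate: weighted mean of r_i = y_i / x_i.
# Propagating sigma_i through r_i gives sigma_ri = sigma_i / x_i,
# so the inverse-variance weights are w_i = x_i**2 / sigma_i**2.
r = y / x
w = x**2 / sigma**2
b_ratio = np.sum(w * r) / np.sum(w)

print(b_reg, b_ratio)  # identical when the ratios are weighted by x_i^2/sigma_i^2
```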

  • What is a '$\chi^2$ linear regression'? Commented Feb 10, 2020 at 18:26
  • What I mean by that is a linear regression in which points are not weighted equally, but according to their known (possibly relative) uncertainties. That is, for each sample $(x_i, y_i, \sigma_i)$ we define $\chi_i = (y_i - f(x_i))/\sigma_i$, where, in this case of a linear regression through the origin, $f(x) = b x$ with one parameter $b$. (For a general linear regression not constrained to pass through the origin, $f$ is the two-parameter function $f(x) = a + b x$.) To solve for the parameter(s) of the model, we solve the system of equations obtained by setting $d \chi^2/dP = 0$ for each parameter $P$. Commented Feb 11, 2020 at 1:11
  • I'm not really sure how to interpret the $\sigma_b$ for my model; maybe it's normal for it to have a far higher, or a far lower, uncertainty than the weighted mean $\mu_b$ and the standard error of the weighted mean $\sigma_{\mu_b}$. (Those are computed by writing $b = y/x$ instead of $y = b x$, and simply setting $b$ equal to a weighted mean of the $y_i/x_i$ values over our sample set $(x_i, y_i, \sigma_i)$, weighted by the errors assigned to the measurements.) So many options in statistics! I just don't know which to choose. Commented Feb 11, 2020 at 1:18

1 Answer


Asked my professor, and he said I was actually supposed to take $\frac{\partial b}{\partial y_i}$ as my partial derivative, since the measured uncertainties $\sigma_i$ are on the $y_i$, not on the $x_i$. Since $\frac{\partial b}{\partial y_i} = \frac{x_i}{\sigma_i^2 S_{xx}}$, this gives

$$\sigma_b^2 = \sum_i \sigma_i^2 \left( \frac{\partial b}{\partial y_i} \right)^2 = \sum_i \frac{x_i^2}{\sigma_i^2 S_{xx}^2} = \frac{1}{S_{xx}}$$
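One way to check this result is a quick Monte Carlo sketch (made-up $x_i$ and $\sigma_i$, NumPy assumed): simulate many datasets with Gaussian noise of size $\sigma_i$ on the $y_i$, refit $b$ each time, and compare the scatter of the fitted slopes with $1/\sqrt{S_{xx}}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: fixed x_i and sigma_i, known true slope
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sigma = np.array([0.1, 0.1, 0.2, 0.2, 0.3])
b_true = 2.0

S_xx = np.sum(x**2 / sigma**2)
sigma_b_formula = np.sqrt(1.0 / S_xx)

# Monte Carlo: draw y_i = b_true * x_i + Gaussian noise, refit b each time
n_trials = 20000
noise = rng.normal(0.0, sigma, size=(n_trials, len(x)))
y = b_true * x + noise
b_fits = np.sum(x * y / sigma**2, axis=1) / S_xx

# The empirical scatter of the fitted slopes should match 1/sqrt(S_xx)
print(sigma_b_formula, b_fits.std())
```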

