Ordinary linear regression

Regression coefficients are computed as a linear combination of the observations; you can write this as a matrix multiplication.

$$\hat{\beta} = M \cdot y$$

where $M = (X^TX)^{-1}X^T$. Therefore you will get

$$\hat{\beta}_{y_1} = M \cdot (y_2+y_3) = M \cdot y_2 + M \cdot y_3 = \hat{\beta}_{y_2} + \hat{\beta}_{y_3}$$
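
As a quick numerical check of this identity, here is a minimal sketch in R that builds $M$ explicitly (the design matrix X and the matrix M are constructed here only for illustration and are not part of the original code; the data match the demonstration below):

set.seed(1)

x  = 1:10
y2 = rexp(10, 1/x)
y3 = rexp(10, 1/x)

X = cbind(1, x)                 # design matrix: intercept column plus x
M = solve(t(X) %*% X) %*% t(X)  # M = (X^T X)^{-1} X^T

M %*% (y2 + y3)                 # coefficients fitted to y1 = y2 + y3
M %*% y2 + M %*% y3             # sum of the separate fits: identical up to
                                # floating-point rounding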

Other types of regression

For other types of regression, this distributive property need not hold.
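
A sketch of why, using the standard GLM score equations (this notation is not in the original answer): the estimate $\hat{\beta}$ solves

$$\sum_{i} \frac{y_i - \mu_i}{V(\mu_i)}\,\frac{\partial \mu_i}{\partial \beta} = 0, \qquad \mu_i = g^{-1}(x_i^\top \beta)$$

Because the inverse link $g^{-1}$ is generally nonlinear (for the Gamma family, R's default is the inverse link, $\mu_i = 1/x_i^\top \beta$), the solution $\hat{\beta}$ is not a linear map of $y$, so the coefficients fitted to $y_2 + y_3$ need not equal the sum of the separate fits.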


Code demonstration

set.seed(1)

x  = 1:10
y2 = rexp(10, 1/x)   # exponential responses with mean x
y3 = rexp(10, 1/x)
y1 = y2 + y3

# Gamma GLM (default inverse link): the coefficients for y1 differ from
# the sum of the coefficients for y2 and y3
glm(y1 ~ x, family = Gamma)$coefficients
glm(y2 ~ x, family = Gamma)$coefficients +
  glm(y3 ~ x, family = Gamma)$coefficients
#(Intercept)           x 
# 0.140085438 -0.007893727 
#(Intercept)           x 
# 0.62397301 -0.03852544 

# Ordinary least squares: the coefficients add exactly
lm(y1 ~ x)$coefficients
lm(y2 ~ x)$coefficients +
  lm(y3 ~ x)$coefficients
#(Intercept)           x 
#  5.8173249   0.9380105 
#(Intercept)           x 
#  5.8173249   0.9380105 
