Suppose we consider the following regression model: $$Y = X\beta + \varepsilon,$$ where $\varepsilon \sim N(0, \sigma^2V)$ and $V$ is a known $n\times n$ non-singular, positive definite matrix.

The OLS estimator of $\beta$ is $\tilde \beta = (X'X)^{-1}X'Y$, whereas the GLS estimator of $\beta$ is $\hat \beta = (X'V^{-1}X)^{-1}X'V^{-1}Y$.
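Both estimators are unbiased; as a quick sketch, using $E(Y) = X\beta$,
\begin{align*}
E(\tilde \beta) &= (X'X)^{-1}X'E(Y) = (X'X)^{-1}X'X\beta = \beta,\\
E(\hat \beta) &= (X'V^{-1}X)^{-1}X'V^{-1}E(Y) = (X'V^{-1}X)^{-1}X'V^{-1}X\beta = \beta,
\end{align*}
so comparing them through their covariance matrices makes sense.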

I need to show that the variance of the GLS estimator, $\operatorname{Var}(\hat \beta)$, is smaller than the variance of the OLS estimator, $\operatorname{Var}(\tilde \beta)$. (I write $\operatorname{Var}(\cdot)$ for the covariance matrix to avoid confusion with the matrix $V$.) I was able to find that $\operatorname{Var}(\hat \beta) = \sigma^2(X'V^{-1}X)^{-1}$ and $\operatorname{Var}(\tilde \beta) = \sigma^2(X'X)^{-1}X'VX(X'X)^{-1}$.
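For completeness, here is the sketch I used, relying only on $\operatorname{Var}(AY) = A\,\operatorname{Var}(Y)\,A'$ together with $\operatorname{Var}(Y) = \sigma^2 V$ and the symmetry of $V^{-1}$:
\begin{align*}
\operatorname{Var}(\tilde \beta) &= (X'X)^{-1}X'\,\operatorname{Var}(Y)\,X(X'X)^{-1} = \sigma^2(X'X)^{-1}X'VX(X'X)^{-1},\\
\operatorname{Var}(\hat \beta) &= (X'V^{-1}X)^{-1}X'V^{-1}\,\operatorname{Var}(Y)\,V^{-1}X(X'V^{-1}X)^{-1} = \sigma^2(X'V^{-1}X)^{-1}.
\end{align*}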

So I am trying to show that the difference \begin{align} \operatorname{Var}(\tilde \beta) - \operatorname{Var}(\hat \beta) & = \sigma^2(X'X)^{-1}X'VX(X'X)^{-1} - \sigma^2(X'V^{-1}X)^{-1} \end{align} is a positive semi-definite matrix. That is where I am stuck; I have no idea how to proceed. Can anybody help?
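As a sanity check (not a proof), here is a small numerical sketch with a randomly generated $X$ and a randomly constructed positive definite $V$; the sizes and names are arbitrary assumptions, chosen only for illustration, and the smallest eigenvalue of the difference should come out non-negative (up to rounding):

```python
import numpy as np

# Numerical sanity check (not a proof): for a random X and a random positive
# definite V, the difference Var(OLS) - Var(GLS) should be positive semi-definite.
rng = np.random.default_rng(0)
n, p, sigma2 = 50, 3, 1.0            # arbitrary sizes, assumed for illustration

X = rng.normal(size=(n, p))
A = rng.normal(size=(n, n))
V = A @ A.T + n * np.eye(n)          # symmetric positive definite by construction

V_inv = np.linalg.inv(V)
XtX_inv = np.linalg.inv(X.T @ X)

var_ols = sigma2 * XtX_inv @ X.T @ V @ X @ XtX_inv   # sigma^2 (X'X)^{-1} X'VX (X'X)^{-1}
var_gls = sigma2 * np.linalg.inv(X.T @ V_inv @ X)    # sigma^2 (X'V^{-1}X)^{-1}

diff = var_ols - var_gls
eigvals = np.linalg.eigvalsh((diff + diff.T) / 2)    # symmetrize, then eigenvalues
print("smallest eigenvalue of Var(OLS) - Var(GLS):", eigvals.min())
```

Of course this only illustrates the claim for particular draws; I still need the algebraic argument.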

Please note that I know that in this case the GLS estimator is BLUE (Best Linear Unbiased Estimator) by the Gauss–Markov theorem, so it has minimum variance. But I need to prove this particular case directly.
