Matrix regression proof that $\hat \beta = (X' X)^{-1} X' Y = {\hat \beta_0 \choose \hat \beta_1} $
where $\hat\beta$ is the least squares estimator of $\beta$.
Attempt:
So I know ${\hat \beta_0 \choose \hat \beta_1} = {\overline{Y} - \hat \beta_1 \overline{X} \choose \frac{\sum_{i=1}^{n} (X_i - \overline{X})(Y_i - \overline{Y})}{\sum_{i=1}^{n}(X_i - \overline{X})^2}}$
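I think the setup is supposed to be the simple-regression design matrix with a column of ones (this is my guess at the intended $X$), in which case the pieces would be:

```latex
X = \begin{pmatrix} 1 & X_1 \\ \vdots & \vdots \\ 1 & X_n \end{pmatrix},
\qquad
X'X = \begin{pmatrix} n & \sum_{i=1}^{n} X_i \\ \sum_{i=1}^{n} X_i & \sum_{i=1}^{n} X_i^2 \end{pmatrix},
\qquad
X'Y = \begin{pmatrix} \sum_{i=1}^{n} Y_i \\ \sum_{i=1}^{n} X_i Y_i \end{pmatrix},
```

and then presumably one would invert the $2\times 2$ matrix with $\begin{pmatrix} a & b \\ b & c \end{pmatrix}^{-1} = \frac{1}{ac - b^2}\begin{pmatrix} c & -b \\ -b & a \end{pmatrix}$ and multiply out, but I'm not sure how to simplify the result into the centered sums above.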
I'm not really sure how to start, as I don't know which formulas can be used to reduce any of this. If this has been answered elsewhere, please mark this as a duplicate and link it; I tried searching but couldn't find it.
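To at least convince myself the two expressions should agree, I ran a small numerical check (the data here is made up, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)                    # predictor values X_i
y = 2.0 + 3.0 * x + rng.normal(size=n)    # response with noise

# Matrix form: solve (X'X) beta = X'Y with a column of ones prepended
X = np.column_stack([np.ones(n), x])
beta_matrix = np.linalg.solve(X.T @ X, X.T @ y)

# Componentwise form: the slope and intercept formulas from the question
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

print(np.allclose(beta_matrix, [b0, b1]))  # the two forms agree numerically
```

So the identity checks out numerically; what I'm missing is the algebraic proof.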