I have this question:
Show that the ridge estimator of $\beta$, denoted $\hat{\beta}_{Ridge}$, can be obtained as the solution to the constrained optimization problem:
Minimize w.r.t. ${\beta}$
$({\beta}-\hat{\beta}_{LS})'X'X({\beta}-\hat{\beta}_{LS})$ subject to $\beta'\beta \le d$
where $\hat{\beta}_{LS}$ is the ordinary least-squares estimator of $\beta$ and $d>0$ is a constant. I am aware of this thread https://stats.stackexchange.com/questions/69205/how-to-derive-the-ridge-regression-solution, but it doesn't formulate the problem in the same way.
Ok, so what I got is this:
$$\begin{aligned}
({\beta}-\hat{\beta}_{LS})'X'X({\beta}-\hat{\beta}_{LS}) + \lambda \beta' \beta
&= \beta' X'X \beta - \beta' X'X \hat{\beta}_{LS} - \hat{\beta}_{LS}' X'X \beta + \hat{\beta}_{LS}' X'X \hat{\beta}_{LS} + \lambda \beta' \beta \\
&= \beta' X'X \beta - 2\beta' X'X \hat{\beta}_{LS} + \hat{\beta}_{LS}' X'X \hat{\beta}_{LS} + \lambda \beta' \beta
\end{aligned}$$
Since $\hat{\beta}_{LS} = (X'X)^{-1}X'y$ we get:
$$\begin{aligned}
&\beta' X'X \beta - 2\beta'X'X(X'X)^{-1}X'y + y'X(X'X)^{-1}X'X(X'X)^{-1}X'y + \lambda \beta' \beta \\
&\quad = \beta'X'X\beta - 2\beta'X'y + y'X(X'X)^{-1}X'y + \lambda \beta' \beta
\end{aligned}$$
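To convince myself the expansion above is right, here is a quick numerical sanity check with made-up data (the dimensions, seed, and $\lambda$ are arbitrary choices, not part of the problem):

```python
import numpy as np

# Minimal check of the algebra above on random data:
# LHS: (beta - beta_ls)' X'X (beta - beta_ls) + lam * beta'beta
# RHS: beta'X'X beta - 2 beta'X'y + y'X(X'X)^{-1}X'y + lam * beta'beta
rng = np.random.default_rng(0)
n, p, lam = 50, 3, 0.7
X = rng.normal(size=(n, p))
y = rng.normal(size=n)
beta = rng.normal(size=p)          # an arbitrary test point, not an estimator

XtX = X.T @ X
beta_ls = np.linalg.solve(XtX, X.T @ y)   # OLS estimator (X'X)^{-1}X'y

lhs = (beta - beta_ls) @ XtX @ (beta - beta_ls) + lam * beta @ beta
rhs = (beta @ XtX @ beta
       - 2 * beta @ X.T @ y
       + y @ X @ np.linalg.solve(XtX, X.T @ y)
       + lam * beta @ beta)
print(np.isclose(lhs, rhs))  # prints True
```

The two sides agree up to floating-point error, so at least the algebra to this point checks out.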
But here is where I get stuck: I want $y'X(X'X)^{-1}X'y$ to equal $y'y$, but I don't see how to get there.
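For what it's worth, a small numerical check on made-up data (again just a sketch, with $n > p$) suggests the two quantities are not equal in general, which is part of why I'm confused:

```python
import numpy as np

# Compare y'X(X'X)^{-1}X'y with y'y on random data.
rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# y'X(X'X)^{-1}X'y, computed via a linear solve rather than an explicit inverse
proj_term = y @ X @ np.linalg.solve(X.T @ X, X.T @ y)
print(proj_term, y @ y)  # these differ in general
```

So either the equality I'm hoping for only holds in some special case, or the term shouldn't need to reduce to $y'y$ at all.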
Please MathStack help me. You are my only hope.
P.S. $\lambda$ is a scalar.