I want to prove this formula:
$$ I(\theta) = E[s(X;\theta)^2] = -E\left[\frac{\partial^2}{\partial\theta^2} \log f(X;\theta)\right] $$
The score function is the derivative of the log-likelihood with respect to $\theta$, so to get the Fisher information I differentiate it once more and take the negative expectation:
$$ -E[∂/∂θ s(X;θ)] = -E[∂/∂θ (∂log f(X;θ) / ∂θ)] = -E[∂^2/∂θ^2 log f(X;θ)] $$
Now I did $$ -E[∂^2/∂θ^2 log f(X;θ)] = -E[(∂^2/∂θ^2 f(X;θ)) / f(X;θ)] + E[(∂/∂θ f(X;θ)/f(X;θ))^2] $$
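(That step is the quotient rule applied to $s = \frac{\partial f/\partial\theta}{f}$; writing primes for $\partial/\partial\theta$ and suppressing arguments:)
$$ \frac{\partial^2}{\partial\theta^2} \log f = \frac{\partial}{\partial\theta}\left(\frac{f'}{f}\right) = \frac{f''}{f} - \left(\frac{f'}{f}\right)^2 $$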
Then I computed the integral for $-E[(∂^2/∂θ^2 f(X;θ)) / f(X;θ)]$, which eventually equals zero, and that leaves us with
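Spelling out why that expectation vanishes (this assumes the regularity conditions that let us differentiate under the integral sign):
$$ E\left[\frac{\partial^2 f(X;\theta)/\partial\theta^2}{f(X;\theta)}\right] = \int \frac{\partial^2 f(x;\theta)/\partial\theta^2}{f(x;\theta)}\, f(x;\theta)\,dx = \frac{\partial^2}{\partial\theta^2} \int f(x;\theta)\,dx = \frac{\partial^2}{\partial\theta^2}\, 1 = 0 $$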
$$ -E[∂^2/∂θ^2 log f(X;θ)] = 0 + E[(∂/∂θ f(X;θ)/f(X;θ))^2] $$
And since $$ E[(∂/∂θ f(X;θ)/f(X;θ))^2] = E[s(X;θ)^2] $$
This completes the proof.
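As a numerical sanity check of the identity (not part of the proof), here is a Monte Carlo sketch for the model $X \sim N(\theta, 1)$, where $s(x;\theta) = x - \theta$, $\partial^2 \log f / \partial\theta^2 = -1$, and the Fisher information is exactly $1$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(loc=theta, scale=1.0, size=1_000_000)

score = x - theta              # s(x;θ) = ∂/∂θ log f(x;θ) for N(θ, 1)
second_deriv = -np.ones_like(x)  # ∂²/∂θ² log f(x;θ) = −1 for every x

lhs = -second_deriv.mean()     # −E[∂²/∂θ² log f] = 1 exactly here
rhs = (score ** 2).mean()      # E[s(X;θ)²] ≈ 1 by Monte Carlo

print(lhs, rhs)                # both should be close to 1
```

Both sides agree up to Monte Carlo error, as the identity predicts.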
What do you think?