$\begingroup$

I need to prove that, given a sample $(X_1,\dots,X_n)$ from the density $$\frac{1}{\theta}x^{\frac{1}{\theta}-1}\,1_{(0,1)}(x),$$ no efficient estimator exists for $g(\theta)=\frac{1}{\theta+1}$. I have shown that the MLE for $\theta$ is $-\frac{\sum\ln(X_i)}{n}$, which is Gamma distributed with mean $\theta$ and variance $\frac{\theta^2}{n}$, so it is unbiased and efficient for $\theta$. I have also computed the Cramér-Rao lower bound for the variance of any unbiased estimator of $g(\theta)$, namely $\frac{\theta^2}{n(\theta+1)^4}$. I don't know how to prove that this bound cannot be attained by any estimator of $g(\theta)$.
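For reference, here is a sketch of the score computation behind these claims, writing $\hat\theta_n=-\frac{1}{n}\sum_{i=1}^n\ln X_i$ for the MLE:

$$\ell_n(\theta)=-n\ln\theta+\Big(\frac{1}{\theta}-1\Big)\sum_{i=1}^n\ln X_i,\qquad
\ell_n'(\theta)=-\frac{n}{\theta}-\frac{1}{\theta^2}\sum_{i=1}^n\ln X_i=\frac{n}{\theta^2}\big(\hat\theta_n-\theta\big),$$

so the score factors as $I_n(\theta)\big(\hat\theta_n-\theta\big)$ with Fisher information $I_n(\theta)=\frac{n}{\theta^2}$, which is the Cramér-Rao equality condition for $\theta$ itself.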

Any hint would be much appreciated.

$\endgroup$
  • $\begingroup$ Recall the equality condition of the Cramér-Rao inequality. $\endgroup$ Commented Jul 3 at 7:02
  • $\begingroup$ @StubbornAtom that is when the score function decomposes as $(T_n-\theta)I_n(\theta)$. Does it hold for the parameter or also for functions of the parameter? In other words what is the score function associated with $g(\theta)$? $\endgroup$ Commented Jul 3 at 7:27
  •
    $\begingroup$ It tells you exactly which functions of $\theta$ admit estimators whose variances attain the Cramér-Rao bound. $\endgroup$ Commented Jul 3 at 7:38
  • $\begingroup$ @StubbornAtom Could you explain in more detail how I can obtain the form of the functions of the parameter that can be estimated efficiently? $\endgroup$ Commented Jul 3 at 9:34
