
Let $X$ be a random draw from an Exponential($\theta$) distribution, which has a density $p_\theta(x)=\theta^{-1}e^{-x/\theta}$ with an unknown $\theta>0.$

a) Compute the Cramér-Rao lower bound for unbiased estimation of $g(\theta)=\theta^k$, $k\in\mathbb{N}$.

b) Find the UMVUE of $g(\theta)=\theta^k$, $k\in\mathbb{N}$.

c) Compute the variance of the UMVUE from b). For which $k\in\mathbb{N}$, if any, does the UMVUE attain the CR bound from a)?

My idea: for a), the Cramér-Rao lower bound should be $\frac{g'(\theta)^2}{I(\theta)}$ with $I(\theta)=-E_{\theta}[\ell''_{\theta}(X)]=1/\theta^2$ (here $\ell_\theta$ is the log-likelihood), so one would get $\frac{g'(\theta)^2}{I(\theta)} = \frac{(k \theta^{k-1})^2}{1/\theta^2}=k^2\theta^{2k}$. I'm not sure, though, whether this is correct.
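For completeness, the value $I(\theta)=1/\theta^2$ can be verified directly from the log-likelihood:

$$\ell_\theta(x)=\log p_\theta(x)=-\log\theta-\frac{x}{\theta},\qquad \ell''_\theta(x)=\frac{1}{\theta^2}-\frac{2x}{\theta^3},$$

$$I(\theta)=-E_\theta[\ell''_\theta(X)]=-\frac{1}{\theta^2}+\frac{2E_\theta[X]}{\theta^3}=-\frac{1}{\theta^2}+\frac{2\theta}{\theta^3}=\frac{1}{\theta^2},$$

using $E_\theta[X]=\theta$.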

As for b), my first guess was to look at the $k$-th moment of $X$: $E[X^k]=k!\,\theta^k$. I wanted to apply the Lehmann-Scheffé theorem, but I had no idea how to proceed because of the $k!$ factor.


1 Answer


a)

I'm not sure though if this is correct.

If you have a single observation, yes it is! If you have a size-$n$ random sample you get

$$V(T)\geq \frac{k^2\theta^{2k}}{n}$$

b) Observe that $T=\overline{X}_n$ is an unbiased estimator of $\theta$ and it is a function of $S=\sum_i X_i$, a complete and sufficient statistic... thus $T$ is the UMVUE for $\theta$.

Given this, one may suspect that $[\overline{X}_n]^k$ is the UMVUE for $\theta^k$. To verify this, it is useful to calculate the expectation of

$$T_k=\Big[\sum_i X_i\Big]^k$$

This is easy because $\sum_i X_i$ has a known distribution (it is Gamma with shape $n$ and scale $\theta$) and $T_k$ is a function of $S$, the complete and sufficient statistic.
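Sketching the moment computation from the Gamma$(n,\theta)$ density:

$$E[T_k]=E[S^k]=\int_0^\infty s^k\,\frac{s^{n-1}e^{-s/\theta}}{\Gamma(n)\,\theta^n}\,ds=\theta^k\,\frac{\Gamma(n+k)}{\Gamma(n)}=\theta^k\,n(n+1)\cdots(n+k-1).$$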

Thus, once you have calculated the expectation of $T_k$, all you have to do is correct its bias, if necessary.
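The bias correction can be sanity-checked by simulation. A minimal sketch, with hypothetical values of $\theta$, $n$, and $k$, using the fact that $E[S^k]=\theta^k\,\Gamma(n+k)/\Gamma(n)$ for $S\sim\text{Gamma}(n,\theta)$, so that $S^k\,\Gamma(n)/\Gamma(n+k)$ is unbiased for $\theta^k$:

```python
import numpy as np
from math import gamma

# Monte Carlo sanity check of the bias correction (hypothetical values).
rng = np.random.default_rng(0)
theta, n, k = 2.0, 5, 3

# E[S^k] = theta^k * Gamma(n+k) / Gamma(n) for S ~ Gamma(n, scale=theta),
# so c * S^k with c = Gamma(n) / Gamma(n+k) is unbiased for theta^k.
c = gamma(n) / gamma(n + k)

samples = rng.exponential(scale=theta, size=(200_000, n))
S = samples.sum(axis=1)
estimates = c * S**k

print(estimates.mean())  # should be close to theta**k = 8.0
```

The empirical mean of the corrected estimator should match $\theta^k$ up to Monte Carlo error.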

  • So if I understood it correctly, since $T$ is the UMVUE for $\theta$ and $E[X^k]=k!\,\theta^k$, would it suffice to define $T_{new}= T^k/k!$ to remove the bias?
    – Alex
    Commented Apr 7, 2021 at 13:52
  • @AlexTubone: given that you have a single observation, yes. Anyway, I explained how to do it for an $n$-sized random sample too.
    – tommik
    Commented Apr 7, 2021 at 14:01
  • Adding to this, for c) one would have $$\operatorname{Var}(X^k/k!)=(1/k!)^2\,(E[X^{2k}]-E[X^k]^2)$$ $$= (2k)(2k-1)\cdots(k+1)\,\theta^{2k}/k!- \theta^{2k};$$ this is also just equal to the result in a) if $k=1$, right?
    – Alex
    Commented Apr 7, 2021 at 14:07
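The arithmetic in the last comment can be checked numerically. For a single observation ($n=1$) the variance above simplifies to $\big[\binom{2k}{k}-1\big]\theta^{2k}$, while the CR bound is $k^2\theta^{2k}$; a small sketch comparing the two coefficients of $\theta^{2k}$:

```python
from math import comb

# Coefficient of theta^(2k) in Var(UMVUE) for n = 1 vs. in the CR bound.
for k in range(1, 6):
    var_coef = comb(2 * k, k) - 1  # (2k)! / (k!)^2 - 1
    cr_coef = k * k
    print(k, var_coef, cr_coef, var_coef == cr_coef)
```

The coefficients agree at $k=1$ (both equal 1) and the variance coefficient strictly exceeds $k^2$ for every $k\geq 2$, so the bound is attained only at $k=1$.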
