$\begingroup$

I am studying UMVUEs, and I'm struggling to find a conditional expectation.

Let $X_1,\ldots,X_n$ be a random sample from $X\sim U[0,\theta]$. i) Show that $2X_1$ is an unbiased estimator of $\theta$ and use the Rao–Blackwell theorem to find the UMVUE of $\theta$.

ii) Calculate $E[X_{(n)}]$ and explicitly find the UMVUE of $\theta$.

I have already shown that $2X_1$ is unbiased, and also that $X_{(n)}=\max(X_1,\ldots,X_n)$ is a complete and sufficient statistic for $\theta$, but I am having trouble with the conditioning: how can I calculate $$E[2X_1\mid X_{(n)}]=2E[X_1\mid X_{(n)}]?$$ How do I find the conditional distribution and the expectation?

$\endgroup$
  • $\begingroup$ None of this is necessary to compute the UMVUE $\hat\theta_n$ since one knows that $\hat\theta_n=X_{(n)}/x_n$, with $E(X_{(n)})=x_n\theta$. Can you compute $x_n$? (The conditional distribution of $X_1$ conditionally on $X_{(n)}$ happens to be an interesting beast, being partially discrete and partially absolutely continuous, but it is not needed to compute $\hat\theta_n$.) $\endgroup$
    – Did
    Commented May 19, 2015 at 21:15
  • $\begingroup$ @Did I'm having difficulty understanding when I need to find the conditional and when it is not needed. I'll edit the exercise because it has two items $\endgroup$
    – Roland
    Commented May 19, 2015 at 21:24
  • $\begingroup$ I've posted an answer that does it in the pedestrian way. Now let me take a shot at understanding "Did"'s way. If you can show that $X_{(n)}$ is complete and sufficient, and that $\operatorname{E}(X_{(n)})=n\theta/(n+1)$, then you can find an estimator that depends on the data only through $X_{(n)}$ and that is unbiased. Since it's complete, sufficient, and unbiased, the Lehmann–Scheffé theorem tells you it's the best unbiased estimator. $\endgroup$ Commented May 19, 2015 at 21:32
  • $\begingroup$ math.stackexchange.com/questions/261530/finding-ex-1-max-x-i $\endgroup$ Commented May 6, 2019 at 10:33
  • $\begingroup$ math.stackexchange.com/q/60497/321264, math.stackexchange.com/q/2941489/321264 $\endgroup$ Commented Jun 12, 2020 at 19:05

1 Answer

$\begingroup$

$\newcommand{\E}{\operatorname{E}}$I've always found this particular problem mildly icky in the way in which it mixes the discrete and the continuous and apparently has to be done piecewise.

You need \begin{align} & \E(X_1\mid X_{(n)}) \\[10pt] = {} & \E(X_1 \mid X_{(n)},\, X_1=X_{(n)})\Pr(X_1=X_{(n)}\mid X_{(n)}) \\[2pt] & {} + \E(X_1 \mid X_{(n)},\, X_1\ne X_{(n)})\Pr(X_1\ne X_{(n)}\mid X_{(n)}) \end{align}

Observe that $\Pr(X_1=X_{(n)}\mid X_{(n)}) = 1/n$ because, by symmetry, each of the $n$ observations is equally likely to be the maximum, and that $\E(X_1\mid X_{(n)},\, X_1=X_{(n)}) = X_{(n)}$. So you get $$ \frac{X_{(n)}}n + \frac{n-1} n \E(X_1 \mid X_{(n)},\, X_1\ne X_{(n)}). $$ If you can show that $\E(X_1 \mid X_{(n)},\, X_1\ne X_{(n)})= X_{(n)}/2$ (given the maximum, an observation that is not the maximum is uniformly distributed on $[0,X_{(n)}]$), then you get $$ X_{(n)}\cdot\left( \frac 1 n + \frac{n-1}{2n} \right) = X_{(n)} \cdot\frac{n+1}{2n}, $$ and your best unbiased estimator is $2$ times that, i.e. $\dfrac{n+1}{n}\,X_{(n)}$.
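The conclusion can be sanity-checked numerically. Here is a minimal Monte Carlo sketch (the values of $\theta$, $n$, and the number of replications are arbitrary illustrative choices, not from the problem): it checks that $\E(X_1\mid X_{(n)}) = \frac{n+1}{2n}X_{(n)}$ implies $\E(X_1/X_{(n)}) = \frac{n+1}{2n}$, and that both $2X_1$ and $\frac{n+1}{n}X_{(n)}$ are unbiased for $\theta$, with the latter having much smaller variance.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 5, 200_000  # arbitrary illustrative choices

# reps independent samples of size n from U[0, theta]
X = rng.uniform(0.0, theta, size=(reps, n))
mx = X.max(axis=1)  # X_(n) for each sample

# E[X_1 | X_(n)] = (n+1)/(2n) * X_(n)  implies  E[X_1 / X_(n)] = (n+1)/(2n)
print(np.mean(X[:, 0] / mx))       # close to (n+1)/(2n) = 0.6

naive = 2 * X[:, 0]                # unbiased but crude
umvue = (n + 1) / n * mx           # Rao-Blackwellized: 2 * (n+1)/(2n) * X_(n)
print(naive.mean(), umvue.mean())  # both close to theta = 3.0
print(naive.var() > umvue.var())   # True: the UMVUE has far smaller variance
```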

$\endgroup$
  • $\begingroup$ When you have a complete and sufficient statistic, is it better to use the Lehmann–Scheffé theorem (compute the expectation of the statistic and rescale it to find the UMVUE) than to find the conditional expectation? $\endgroup$
    – Roland
    Commented May 19, 2015 at 21:50
  • $\begingroup$ It's simpler in cases where it can be done. $\endgroup$ Commented May 19, 2015 at 22:16
