
I am attempting to learn how to find a complete and sufficient statistic. So, I am working on this problem for class:

Let $X_1, \ldots, X_n$ be a random sample from the pdf $f(x\mid\mu)=e^{-(x-\mu)}$, where $-\infty < \mu < x <\infty$. Show that $X_{(1)} = \min_i\{X_i\}$ is a complete sufficient statistic.

Here is what I have done so far. First, I tried to show it is sufficient by the Factorization Theorem:

  • $f(x_1,\dots,x_n\mid\mu) = \prod_{i=1}^{n} f(x_i\mid\mu) = e^{-\sum_{i=1}^{n}x_i+n\mu}\, I\{\mu<x_{(1)}<\infty\}$.

  • $g(T(X)\mid\mu)=e^{-\sum_{i=1}^{n}x_i+n\mu}$

  • $h(X) = 1$.

First, why is $\min_i\{X_i\}$ sufficient in this case? I know that for a statistic $T$ to be complete, it must satisfy the condition that if $E_\mu[g(T)]=0$ for all $\mu$ and every function $g(\cdot)$, then $g(T)=0$ with probability 1. So I must take the expectation of a function of $T$ to show completeness, but I am not sure whether the way I proceeded is correct. Thanks in advance!


1 Answer


Sufficiency: The minimum is a sufficient statistic by the Neyman Factorization Theorem: in the factorization below, $\mu$ enters only through the minimum order statistic, since $\prod_i I[\mu<x_i<\infty]=I[\mu<x_{(1)}<\infty]$ (every observation exceeds $\mu$ exactly when the smallest one does).

$$ \begin{split} f_\mu(x) &= \prod_i e^{-(x_i-\mu)}\,I[\mu<x_i<\infty] \\ & = e^{n\mu-\sum_i x_i}\,I[\mu< x_{(1)}<\infty] \\ & = \underbrace{e^{-\sum_i x_i}}_{h(x)}\;\underbrace{e^{n\mu}\,I[\mu< x_{(1)}<\infty]}_{g_\mu(x_{(1)})} \end{split} $$

Completeness: As you've written, you need to show that if $\mathbb{E}_\mu[g(T)]=0$ for all $\mu$, then $\forall\mu:\mathbb{P}_\mu(g(T)=0)=1$. We will use the pdf of the minimum of $n$ iid random variables.

$$ p(t)=\frac{d}{dt}\left(1-(1-F(t))^n\right) = ne^{-n(t-\mu)}, \qquad t>\mu $$
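In case this step is the unfamiliar one (it is the standard pdf-of-the-minimum calculation), here is the derivation spelled out for this model; nothing beyond the cdf of the shifted exponential is used.

$$ \begin{split} F(t) &= \int_\mu^t e^{-(x-\mu)}\,dx = 1-e^{-(t-\mu)}, \qquad t>\mu, \\ \mathbb{P}_\mu(T\le t) &= 1-\mathbb{P}_\mu(\text{all } X_i>t) = 1-(1-F(t))^n = 1-e^{-n(t-\mu)}, \\ p(t) &= \frac{d}{dt}\,\mathbb{P}_\mu(T\le t) = n e^{-n(t-\mu)}, \qquad t>\mu. \end{split} $$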

$$ \begin{split} \mathbb{E}_\mu[g(T)] & = \int_\mu^\infty g(t)p(t)dt \\ & = \int_\mu^\infty g(t)ne^{-n(t-\mu)}dt = 0 \end{split} $$

Since $e^{-n(t-\mu)}=e^{n\mu}e^{-nt}$ and $n\,e^{n\mu}\neq 0$, the condition above holds for every $\mu$ if and only if

$$ \int_\mu^\infty g(t)e^{-nt}\,dt = 0 \quad \text{for all } \mu $$

Differentiating both sides with respect to the lower limit $\mu$ (Fundamental Theorem of Calculus):

$$ \frac{d}{d\mu}\int_\mu^\infty g(t)e^{-nt}\,dt =\frac{d}{d\mu}\, 0 \;\Longrightarrow\; -g(\mu)e^{-n\mu}=0 $$

This must hold for every value of $\mu$, and since $e^{-n\mu}>0$, we conclude that $g(\mu)=0$ for all $\mu$. Hence $\mathbb{P}_\mu(g(T)=0)=1$ for every $\mu$, and $T=X_{(1)}$ is complete.
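As a quick side check (not part of the proof), here is a small Monte Carlo sketch, assuming NumPy is available; the values of $\mu$, $n$, the number of replications, and the evaluation point are arbitrary choices for illustration. It compares the simulated minimum against the density $ne^{-n(t-\mu)}$ used above.

```python
import numpy as np

# Simulate minima of n iid shifted-exponential draws and compare to the
# theoretical density n * exp(-n * (t - mu)) for t > mu.
rng = np.random.default_rng(0)
mu, n, reps = 2.0, 5, 200_000          # illustrative choices

# X_i = mu + Exp(1), so the minimum is mu + (minimum of n standard exponentials)
samples = mu + rng.exponential(scale=1.0, size=(reps, n))
mins = samples.min(axis=1)

# The minimum of n iid Exp(1) variables is Exp(n), so E[T] = mu + 1/n.
print("mean:", mins.mean(), "theory:", mu + 1 / n)

# Empirical density near t versus n * exp(-n * (t - mu)).
t, h = mu + 0.1, 0.01
emp = np.mean((mins >= t) & (mins < t + h)) / h
theo = n * np.exp(-n * (t - mu))
print("density at t:", emp, "theory:", theo)
```

With this many replications, both printed pairs should agree to roughly two decimal places.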

  • Thanks, I was able to get this from my class book; I was missing how to obtain the pdf of order statistics. Thanks for your comment.
    – Harry Lofi
    Commented Mar 1 at 15:23
