
I know from this question that $\sum_{i=1}^{n}X_i$ is a sufficient estimator for $\lambda$ in the Poisson distribution. However, from looking at the proof I can see that $\frac{1}{n}\sum_{i=1}^{n}X_i$ is also sufficient. Is this always the case? I.e., is the sufficient estimator not unique?

  • $1/n$ is not a parameter of the distribution, so you're fine – Commented Jun 13 at 15:22

2 Answers


The linked question is talking about a sufficient statistic. That is something that gives you all the information you need to calculate an estimator; it is not necessarily an estimator itself.

In particular $\sum_{i=1}^n X_i$ and $\frac 1n \sum_{i=1}^n X_i$ give exactly the same information, since you can convert from one to the other by multiplying or dividing by $n$. So both are sufficient statistics, and for that matter so would be any non-constant linear function of $\sum_{i=1}^n X_i$.

But if you want an estimator for $\lambda$, you need to use $\frac 1n \sum_{i=1}^n X_i$.
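The distinction can be illustrated numerically. Here is a minimal sketch (using NumPy; the variable names are mine, not from the question) showing that the sum $T$ and the mean $T/n$ carry the same information, but only the mean is on the scale of $\lambda$ and so serves directly as the estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
lam_true = 3.0
n = 10_000

# Simulated Poisson sample with rate lam_true
x = rng.poisson(lam_true, size=n)

# Sufficient statistic: the total count
T = x.sum()

# T and T/n are interconvertible (multiply/divide by n),
# so both are sufficient. But only T/n estimates lambda:
lam_hat = T / n  # the MLE, recovered from the sufficient statistic alone
print(lam_hat)   # should be close to lam_true = 3.0
```

Note that `lam_hat` is computed purely from `T` and `n`, which is exactly what sufficiency promises: once you have $T$, the individual $X_i$ are no longer needed.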

  • So $\sum_{i=1}^n X_i$ is a sufficient statistic while $\frac 1n \sum_{i=1}^n X_i$ is a sufficient estimator? – gbd Commented Jun 13 at 17:32
  • @gbd there's no such thing as "sufficient estimator". It's a maximum likelihood estimator (MLE). The MLE can always be calculated purely from a sufficient statistic - that's what it's sufficient for. – Commented Jun 14 at 8:09

Yes, it is, because multiplying by a nonzero constant is a bijective transformation. Check the Fisher–Neyman factorization theorem; the proof follows almost directly: https://en.wikipedia.org/wiki/Sufficient_statistic#Fisher%E2%80%93Neyman_factorization_theorem
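For the Poisson sample the factorization can be written out explicitly (a short derivation, following the standard statement of the theorem):

$$
\prod_{i=1}^n \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}
\;=\;
\underbrace{e^{-n\lambda}\,\lambda^{\sum_{i=1}^n x_i}}_{g\left(\sum_i x_i;\,\lambda\right)}
\;\cdot\;
\underbrace{\frac{1}{\prod_{i=1}^n x_i!}}_{h(x)}.
$$

Since $g$ depends on the data only through $\sum_i x_i$, the sum is sufficient; writing $g$ in terms of $\bar x = \frac 1n \sum_i x_i$ instead is just a bijective change of argument, so $\bar x$ is sufficient too.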

