I know from this question that $\sum_{i=1}^{n}X_i$ is a sufficient estimator for $\lambda$ in the Poisson distribution. However, from looking at the proof I can see that $\frac{1}{n}\sum_{i=1}^{n}X_i$ can also be a sufficient estimator. Is this always the case, i.e. is the sufficient estimator not unique?
2 Answers
The linked question is talking about a sufficient statistic. That is something that contains all the information in the sample that you need to calculate an estimator - it is not necessarily an estimator itself.
In particular $\sum_{i=1}^n X_i$ and $\frac 1n \sum_{i=1}^n X_i$ give exactly the same information, since you can convert from one to the other by multiplying or dividing by $n$. So both are sufficient statistics, and for that matter so would be any non-constant linear function of $\sum_{i=1}^n X_i$.
But if you want an estimator of $\lambda$ itself, you use the sample mean $\frac 1n \sum_{i=1}^n X_i$, which is both the maximum likelihood estimator and unbiased.
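The point above can be checked numerically: the Poisson log-likelihood depends on the data only through $n$ and $\sum_i x_i$, so two samples with the same size and sum give log-likelihood curves that differ only by a constant, and hence the same MLE. A minimal sketch (the data values here are made up for illustration):

```python
import math

def poisson_log_likelihood(x, lam):
    """Log-likelihood of i.i.d. Poisson(lam) observations x.
    Each term is x_i*log(lam) - lam - log(x_i!)."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in x)

x1 = [2, 0, 4, 3]   # sum = 9, n = 4
x2 = [1, 3, 3, 2]   # same sum and same n

# The MLE is the sample mean, computable from the sufficient statistic alone:
mle = sum(x1) / len(x1)   # 9/4 = 2.25, identical for x1 and x2

# The two log-likelihoods differ only by the constant -sum(log(x_i!)) term,
# so their difference is the same at every value of lambda:
d1 = poisson_log_likelihood(x1, 1.0) - poisson_log_likelihood(x2, 1.0)
d2 = poisson_log_likelihood(x1, 2.5) - poisson_log_likelihood(x2, 2.5)
print(mle)                # 2.25
print(abs(d1 - d2) < 1e-9)  # True: difference is constant in lambda
```

Because the difference of the log-likelihoods is constant in $\lambda$, both samples lead to exactly the same inference about $\lambda$ - which is what sufficiency of $\sum_i x_i$ means in practice.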
$\begingroup$ So $ \sum_{i=1}^n X_i$ is a sufficient statistic while $\frac 1n \sum_{i=1}^n X_i$ is a sufficient estimator? $\endgroup$ – gbd, Commented Jun 13 at 17:32
$\begingroup$ @gbd there's no such thing as "sufficient estimator". It's a maximum likelihood estimator (MLE). The MLE can always be calculated purely from a sufficient statistic - that's what it's sufficient for. $\endgroup$ Commented Jun 14 at 8:09
Yes, it is, because multiplying by a nonzero constant is a bijective transformation, and any bijective function of a sufficient statistic is itself sufficient. Check the Fisher–Neyman factorization theorem; the proof follows almost directly: https://en.wikipedia.org/wiki/Sufficient_statistic#Fisher%E2%80%93Neyman_factorization_theorem
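To spell out the factorization for the Poisson case: writing the joint pmf of an i.i.d. sample $x = (x_1, \dots, x_n)$,

$$\prod_{i=1}^n \frac{e^{-\lambda}\lambda^{x_i}}{x_i!} \;=\; \underbrace{e^{-n\lambda}\,\lambda^{\sum_{i} x_i}}_{g\left(\sum_i x_i,\ \lambda\right)} \;\cdot\; \underbrace{\frac{1}{\prod_{i} x_i!}}_{h(x)},$$

so $T = \sum_i x_i$ is sufficient. For $T' = \frac 1n \sum_i x_i$, substitute $\sum_i x_i = nT'$ to get $g(nT', \lambda)\,h(x)$, which is again of the required form, so $T'$ is sufficient too; the same substitution works for any invertible function of $T$.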