
You have a sample of $n$ i.i.d. realizations of the random variable $X$ distributed as a Poisson with parameter $\lambda$. It is known that:

  • $n_1$ values are greater than or equal to $2$;
  • $n_2$ values are equal to $1$;
  • The remaining observations are equal to $0$.

a. Determine the maximum likelihood estimator for the mean of the Poisson distribution.

My attempt:

The Poisson distribution is defined by the probability mass function: $$ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} $$

The probabilities for each category are:

  1. Probability of observing $X = 0$: $$ P(X = 0) = e^{-\lambda} $$

  2. Probability of observing $ X = 1 $: $$ P(X = 1) = \lambda e^{-\lambda} $$

  3. Probability of observing $X \geq 2$: $$ P(X \geq 2) = 1 - P(X = 0) - P(X = 1) = 1 - e^{-\lambda} - \lambda e^{-\lambda} $$

Given $ n_1 $ values are greater than or equal to $ 2 $, $ n_2 $ values are $ 1 $, and the rest are $ 0 $, the likelihood function is: $$ L(\lambda) = (e^{-\lambda})^{n - n_1 - n_2} (\lambda e^{-\lambda})^{n_2} \left(1 - e^{-\lambda} - \lambda e^{-\lambda}\right)^{n_1} $$

The log-likelihood function $\ell(\lambda) $ simplifies to: $$ \ell(\lambda) = -(n - n_1 - n_2)\lambda + n_2 (\log \lambda - \lambda) + n_1 \log\left(1 - e^{-\lambda} - \lambda e^{-\lambda}\right) $$

To find the maximum likelihood estimator, differentiate $\ell(\lambda) $ with respect to $ \lambda $ and set the derivative equal to zero: $$ \frac{d\ell(\lambda)}{d\lambda} = -(n - n_1 - n_2) + \frac{n_2}{\lambda} - n_2 + n_1 \frac{\lambda e^{-\lambda}}{1 - e^{-\lambda} - \lambda e^{-\lambda}} $$
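To sanity-check the algebra up to this point, here is a quick numerical comparison of the analytic derivative against a central finite difference (just a sketch; the counts $n = 20$, $n_1 = 5$, $n_2 = 3$ are arbitrary placeholders):

```python
import math

# Arbitrary placeholder counts, only for the check.
n, n1, n2 = 20, 5, 3

def loglik(lam):
    """Log-likelihood l(lambda) as written above."""
    return (-(n - n1 - n2) * lam
            + n2 * (math.log(lam) - lam)
            + n1 * math.log(1 - math.exp(-lam) - lam * math.exp(-lam)))

def score(lam):
    """Analytic derivative dl/dlambda as written above."""
    return (-(n - n1 - n2) + n2 / lam - n2
            + n1 * lam * math.exp(-lam)
            / (1 - math.exp(-lam) - lam * math.exp(-lam)))

lam, h = 0.8, 1e-6
fd = (loglik(lam + h) - loglik(lam - h)) / (2 * h)  # central difference
print(score(lam), fd)  # the two values agree closely
```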

Solving this final equation for $\lambda$ is where I am stuck (assuming the preceding steps are correct); I am unsure how to complete the exercise.

  • Do you have a specific $n, n_1, n_2$?
    – Royi
    Commented Jun 28 at 18:17
  • @Royi The question in the text asks for the estimator, not the estimate. At this point I believe the text is "incorrect", because the other sub-questions also concern the definition of the estimator in general.
    – Emalas
    Commented Jun 28 at 20:43
  • The estimator cannot be found explicitly, but it can be proven to exist.
    – Amir
    Commented Jun 29 at 6:18

3 Answers


The critical point $\hat \lambda$ satisfies $$n - n_1 = \frac{n_2}{\lambda} + \frac{n_1 \lambda}{e^\lambda - (1 + \lambda)},$$ or equivalently, $$e^\lambda = \frac{n_2 - (n - n_1 - n_2)\lambda - n \lambda^2}{n_2 - (n - n_1)\lambda}. \tag{1}$$

For most permissible choices of $n, n_1, n_2$ and a suitably large initial guess $\lambda_0$, the recursion $$\lambda_{m+1} = \log \frac{n_2 - (n - n_1 - n_2)\lambda_m - n \lambda_m^2}{n_2 - (n - n_1)\lambda_m} \tag{2}$$ converges to $\hat \lambda$. For instance, the sample $(n, n_1, n_2) = (20, 5, 3)$ with the initial choice $\lambda_0 = 1$ yields $$\begin{array}{c|c} m & \lambda_m \\ \hline 0 & 1. \\ 1 & 0.882389 \\ 2 & 0.81657 \\ 3 & 0.777971 \\ 4 & 0.754727 \\ 5 & 0.74051 \\ 6 & 0.731734 \\ 7 & 0.726286 \\ 8 & 0.722891 \\ 9 & 0.720773 \\ 10 & 0.719448 \\ \vdots & \vdots \\ 25 & 0.717234 \\ \end{array}$$

Granted, the convergence is rather slow. We can accelerate it by instead applying Newton's method to the root of $(1)$: $$\lambda_{m+1} = \lambda_m + \frac{\left(\left(n_1 - n\right) \lambda_m + n_2\right) \left(n_2 \left(\lambda_m - e^{\lambda_m} + 1\right) - \lambda_m \left(n \lambda_m + \left(n_1 - n\right) e^{\lambda_m} + n - n_1\right)\right)}{-n^2 \lambda_m^2 + 2 n n_2 \lambda_m + n n_1 \lambda_m^2 + e^{\lambda_m} \left(\left(n_1 - n\right) \lambda_m + n_2\right)^2 - n_2^2}. \tag{3}$$ For the same sample and initial choice, we obtain $$\begin{array}{c|c} m & \lambda_m \\ \hline 0 & 1. \\ 1 & 0.785447 \\ 2 & 0.722392 \\ 3 & 0.717264 \\ 4 & 0.717232 \\ 5 & 0.717232 \end{array}$$
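A minimal numeric sketch of both recursions (Python; here the Newton step $(3)$ is realized by applying Newton's method to $f(\lambda) = e^{\lambda} - \mathrm{RHS}(\lambda)$, with $\mathrm{RHS}$ the right-hand side of $(1)$, which reproduces the iterates in the tables above):

```python
import math

n, n1, n2 = 20, 5, 3  # the sample used in the tables above

def rhs(lam):
    """Right-hand side of (1)."""
    return (n2 - (n - n1 - n2) * lam - n * lam**2) / (n2 - (n - n1) * lam)

def rhs_prime(lam):
    """Derivative of rhs(lam), by the quotient rule."""
    N = n2 - (n - n1 - n2) * lam - n * lam**2
    D = n2 - (n - n1) * lam
    Np = -(n - n1 - n2) - 2 * n * lam
    Dp = -(n - n1)
    return (Np * D - N * Dp) / D**2

# Fixed-point iteration (2): slow but simple.
lam = 1.0
for _ in range(25):
    lam = math.log(rhs(lam))
print(lam)  # ~0.717234

# Newton's method (3) on f(lam) = exp(lam) - rhs(lam): fast.
lam = 1.0
for _ in range(5):
    lam -= (math.exp(lam) - rhs(lam)) / (math.exp(lam) - rhs_prime(lam))
print(lam)  # ~0.717232
```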

---

The score function (the derivative of the log-likelihood) is a univariate function of $\lambda$ on the valid range $(0, \infty)$.

  • For $\lambda \to 0^+$ it goes to $+\infty$.
  • For $\lambda \to \infty$ it goes to $-n + n_1$, which is negative.

Show it is monotonically decreasing; then, given the parameters $n, n_1, n_2$, the problem is quite easy to solve numerically, e.g. by bisection, as in the sketch below.
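A minimal bisection sketch along these lines (Python; the bracket $[10^{-6}, 50]$ and the counts $(n, n_1, n_2) = (20, 5, 3)$ are illustrative choices, not part of the original problem):

```python
import math

n, n1, n2 = 20, 5, 3  # illustrative sample counts

def score(lam):
    """Derivative of the log-likelihood; strictly decreasing on (0, inf)."""
    denom = -math.expm1(-lam) - lam * math.exp(-lam)  # 1 - e^-lam - lam e^-lam
    return (-(n - n1 - n2) + n2 / lam - n2
            + n1 * lam * math.exp(-lam) / denom)

lo, hi = 1e-6, 50.0  # chosen so that score(lo) > 0 > score(hi)
for _ in range(60):  # each step halves the bracket
    mid = 0.5 * (lo + hi)
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid
print(0.5 * (lo + hi))  # ~0.717232 for this sample
```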

---

Defining $n_3=n - n_1 - n_2$, the log-likelihood function can be written as $$ \ell(\lambda) = -(n_2+n_3)\lambda + n_2 \log \lambda + n_1 \log\left(1 - e^{-\lambda} - \lambda e^{-\lambda}\right). $$

It goes to $-\infty$ as $\lambda \to 0$ or $\lambda \to +\infty$, so it has a global maximizer in $(0,\infty)$ (it can be shown that $\ell(\lambda)$ is a concave function). However, this maximizer cannot be found in closed form. Hence, you cannot write the maximum likelihood estimator of $\lambda$, which is a statistic, explicitly. However, for any given values of $n_1, n_2, n_3$ you can find the maximum likelihood estimate of $\lambda$, which is an observation of the maximum likelihood estimator, by numerically solving the equation you derived in the OP, as in the sketch below.
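For instance (a minimal sketch using SciPy's `brentq` root-finder on the score equation; the counts $n_1 = 5$, $n_2 = 3$, $n_3 = 12$ are placeholders):

```python
import math
from scipy.optimize import brentq

n1, n2, n3 = 5, 3, 12  # placeholder observed counts

def score(lam):
    """Derivative of the log-likelihood above; zero at the MLE."""
    tail = 1 - math.exp(-lam) - lam * math.exp(-lam)  # P(X >= 2)
    return -(n2 + n3) + n2 / lam + n1 * lam * math.exp(-lam) / tail

# Bracket chosen so the score changes sign: +inf near 0, negative for large lam.
lam_hat = brentq(score, 1e-6, 50.0)
print(lam_hat)  # ~0.717232 for these counts
```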
