
I am trying to derive the information matrix equality for the Poisson distribution, with log-likelihood:

$$\mathcal{L}(\lambda; x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} \left[-\lambda + x_i \log(\lambda) - \log(x_i!)\right]$$

The derivation via the Hessian matrix is clear to me and yields $I(\lambda) = \frac{n}{\lambda}$. So the goal is to show that

$$E\left[\sum_{i=1}^{n} \left(\frac{\partial \mathcal{L}_i(\lambda)}{\partial \lambda} \frac{\partial \mathcal{L}_i(\lambda)}{\partial \lambda^T}\right)\right] = E\{s(\lambda) \cdot s(\lambda)^T\} \overset{!}{=} \frac{n}{\lambda}$$

Now, while the derivation with the above method on the left-hand side of the equation yields the same outcome, I am struggling to show that $E\{s(\lambda) \cdot s(\lambda)^T\} = E\{s(\lambda)^2\}$ must also equal it, since:

$s(\lambda)^2 = \left(\left(\sum_{i=1}^{n} x_i\right) / \lambda - n\right)^2 = \left(\sum_{i=1}^{n} x_i\right)^2 / \lambda^2 - 2n \left(\sum_{i=1}^{n} x_i\right) / \lambda + n^2$

And this is where I am stuck, since $\left(\sum_{i=1}^{n} x_i\right)^2 \neq \sum_{i=1}^{n} x_i^2$, right? I think the rest would work out like the left-hand variant and I could easily derive it, so it is likely just this tiny bit that is missing.
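As a sanity check on the target identity, here is a minimal Monte Carlo sketch (assuming numpy; the values $\lambda = 3$ and $n = 10$ are arbitrary) comparing the sample average of $s(\lambda)^2$ against $n/\lambda$:

```python
import numpy as np

# Monte Carlo check of E[s(lambda)^2] = n / lambda for the Poisson score
# s(lambda) = (sum_i x_i)/lambda - n; lam and n are arbitrary choices.
rng = np.random.default_rng(0)
lam, n, reps = 3.0, 10, 200_000

x = rng.poisson(lam, size=(reps, n))   # reps samples of size n
score = x.sum(axis=1) / lam - n        # score evaluated at the true lambda

print(np.mean(score**2))   # approx. 3.33
print(n / lam)             # exactly n/lambda = 3.333...
```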

  • The Poisson distribution has a single parameter; why are you deriving the Hessian? – Ted Black, Mar 7 at 13:53

1 Answer


$\def\expect{\mathbb{E}}$ $\def\calL{\mathcal{L}}$ For a Poisson distribution the p.d.f. is, $$ f(y;\lambda)=e^{-\lambda} \frac{\lambda^y}{y!} $$ and $\expect[y]=\lambda$, $\expect[y^2]=\lambda(\lambda+1)$.
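These two moments can be checked quickly by simulation; a minimal sketch (assuming numpy, with $\lambda = 3$ an arbitrary choice):

```python
import numpy as np

# Empirical check of E[y] = lambda and E[y^2] = lambda*(lambda + 1).
lam = 3.0
y = np.random.default_rng(0).poisson(lam, size=10**6)
print(y.mean(), lam)                   # ~ 3.0
print((y**2).mean(), lam * (lam + 1))  # ~ 12.0
```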

The second-derivative estimate in this case is, $$ I_{2D} =-\frac{1}{N}\expect \left[ \frac{\partial^2 \calL(\lambda) }{\partial \lambda^2} \right] $$ where $N$ is the sample size and $$ \calL(\lambda) = - \lambda N + \left(\sum_{n=1}^N y_n \right) \log\lambda - \sum_{n=1}^N \log y_n ! $$

Since, \begin{align*} \frac{\partial\mathcal{L}}{\partial\lambda} & = - N + \frac{\sum_{n=1}^N y_n}{\lambda} \\ \frac{\partial^2\mathcal{L}}{\partial\lambda^2} & = - \frac{\sum_{n=1}^N y_n}{\lambda^2} \end{align*} we have, $$ I_{2D} = \frac{1}{N} \frac{\expect\left[\sum_{n=1}^N y_n \right]}{\lambda^2} =\frac{1}{\lambda} $$
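Numerically, the empirical version of $I_{2D}$ behaves as expected; a short sketch (assuming numpy, evaluating at the true $\lambda$):

```python
import numpy as np

# Empirical version of I_2D: -(1/N) * d^2 L / d lambda^2 = mean(y) / lambda^2,
# evaluated at the true lambda for simulated Poisson data.
lam, N = 3.0, 100_000
y = np.random.default_rng(1).poisson(lam, size=N)

I_2D = y.mean() / lam**2
print(I_2D, 1 / lam)   # both close to 1/lambda = 0.333...
```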

The outer product estimate in this case is, $$ I_{OP} = \frac{1}{N} \expect\left[ \sum_{n=1}^N \left( \frac{\partial\log f(y_n;\lambda)}{\partial \lambda} \right)^2 \right] = \frac{1}{N} \expect\left[ \sum_{n=1}^N \left( 1 - \frac{y_n}{\lambda} \right)^2 \right] $$ Expanding the square and using $\expect[y_n] = \lambda$, so that the cross term contributes $-2$, this simplifies to, $$ I_{OP} = - 1 +\lambda^{-2} \frac{1}{N} \expect\left[\sum_{n=1}^N y^2_n \right] $$ Since, $$ \frac{1}{N} \expect\left[ \sum_{n=1}^N y^2_n \right]= \lambda (\lambda + 1) $$ we have, $$ I_{OP} = - 1 +\lambda^{-2} \lambda(\lambda+1 ) = \frac{1}{\lambda} $$ Hence $I_{2D}=I_{OP}$.
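And the corresponding empirical outer-product estimate on simulated data (again a sketch assuming numpy; note $(1 - y_n/\lambda)^2 = (y_n/\lambda - 1)^2$, so the sign convention of the score does not matter):

```python
import numpy as np

# Empirical version of I_OP: average of the squared per-observation scores
# d log f(y_n; lambda) / d lambda = y_n/lambda - 1, at the true lambda.
lam, N = 3.0, 100_000
y = np.random.default_rng(2).poisson(lam, size=N)

I_OP = np.mean((y / lam - 1.0)**2)
print(I_OP, 1 / lam)   # both close to 1/lambda, matching I_2D
```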

