
If $X$ is $U[0,\theta]$, then the likelihood is given by $f(x,\theta) = \dfrac{1}{\theta}\mathbb{1}\{0\leq x \leq \theta\}$. The definition of Fisher information is $I(\theta) = \mathbb{E} \left[ \left(\dfrac{d \log f(X,\theta)}{d\theta} \right)^2 \right]$. How can this be calculated when $\log f(X,\theta)$ is not defined for $\theta < X$? I understand that we also have $f(X,\theta) = 0$ for $\theta < X$, but can we ignore this when taking the expectation? If so, why?
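For concreteness, here is the formal computation if one simply restricts to the support $0 \le x \le \theta$ (a sketch; whether the result deserves to be called Fisher information is discussed in the comments below):

$$\log f(x,\theta) = -\log\theta \quad (0 \le x \le \theta), \qquad \frac{d}{d\theta}\log f(x,\theta) = -\frac{1}{\theta}, \qquad \mathbb{E}\left[\left(-\frac{1}{\theta}\right)^2\right] = \frac{1}{\theta^2}.$$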

  • I'm not sure, but I think one chooses to define the log of the density only on the support of the density.
    – Shashi
    Commented Nov 13, 2017 at 22:41
  • I think that makes sense. I suppose we can view the random variable $X$ as a function $X: \Omega \rightarrow [0,\theta]$, in which case $\log f(X,\theta)$ is well defined. Does that work?
    – bri
    Commented Nov 15, 2017 at 10:48
  • Yes, that is one thing you can do.
    – Shashi
    Commented Nov 15, 2017 at 13:44
  • See this answer for why Fisher information is not defined here in the usual sense. Commented May 15, 2020 at 16:17
  • Thanks for the pointer. So, as I've defined it, $I(\theta)$ does exist, but it is not an 'interesting' quantity, for the reasons outlined in that answer.
    – bri
    Commented May 18, 2020 at 8:31

1 Answer


It is $n^2/\theta^2$.

We get this by first computing the log-likelihood for a sample of size $n$, which is $-n \log(\theta)$ on the event $\max_i x_i \leq \theta$; taking its derivative with respect to $\theta$ gives $-\frac{n}{\theta}$. Squaring this and taking its expectation, we get $\frac{n^2}{\theta^2}$.
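As a quick numerical sanity check, a minimal Python sketch (the sample size $n$ and true $\theta$ below are arbitrary illustrative choices):

```python
import numpy as np

# For x_1, ..., x_n iid U[0, theta], the log-likelihood on its support is
# -n*log(theta), so the score d/dtheta log L is the constant -n/theta.
n, theta = 5, 2.0                               # arbitrary illustrative values
rng = np.random.default_rng(0)
x = rng.uniform(0.0, theta, size=(100_000, n))  # every draw lies in the support

score = np.full(x.shape[0], -n / theta)  # the score does not depend on the data
print(score.mean())       # -2.5  = -n/theta: E[score] != 0, a regularity failure
print((score**2).mean())  #  6.25 = n^2/theta^2, the value claimed above
```

Note that the score is constant in the data, so its mean is $-n/\theta \neq 0$; the failure of $\mathbb{E}[\text{score}] = 0$ is exactly the regularity problem raised in the comments below.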

  • Fisher information does not exist for distributions with parameter-dependent supports. Using different formulae for the information function, you arrive at different answers (see the worked comparison after these comments). Commented Mar 21, 2019 at 8:30
  • @StubbornAtom Can you give an example of the argument you just gave? Commented Apr 16, 2019 at 18:24
  • @DanielOrdoñez Fisher information is defined for distributions under some 'regularity conditions'. One of the conditions is that the support of the distribution should be independent of the parameter. That is the main argument... Commented Apr 16, 2019 at 18:50
  • Yes, clearly it happens because you can't interchange the integral and the derivative in this case. But isn't the Fisher information defined as the first expression you gave (the second one being a corollary when you can interchange differentiation and integration)? Commented Apr 16, 2019 at 18:58
  • @DanielOrdoñez That is correct. The problem is that the information is defined under assumptions, not all of which hold here. (Use @ while replying so that we get pinged.) Commented Apr 16, 2019 at 21:22
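To spell out the comparison alluded to above (a sketch, for a single observation, restricting to the support $0 \le x \le \theta$ where $\log f(x,\theta) = -\log\theta$):

$$\mathbb{E}\left[\left(\frac{d}{d\theta}\log f(X,\theta)\right)^2\right] = \frac{1}{\theta^2}, \qquad -\mathbb{E}\left[\frac{d^2}{d\theta^2}\log f(X,\theta)\right] = -\frac{1}{\theta^2}.$$

The two formulae, which agree under the usual regularity conditions, here give different answers (the second is even negative), because the interchange of differentiation and integration fails when the support depends on $\theta$.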
