Questions tagged [fisher-information]

For questions about Fisher information as it appears in mathematical statistics.

0 votes
0 answers
8 views

Parametric and non-parametric probability distributions

Non-parametric statistical models on finite and infinite measure spaces. Consider the following sets of probability densities: in Amari it says, Consider a family $S$ of probability distributions on $X$...
Andyale
  • 181
1 vote
0 answers
41 views

Fisher information for a function that consists of many indicator functions

I have the following pdf: $$ f(x) = \theta I_{(-\frac{1}{2},0]}+ I_{(0,\frac{1}{2}]}+(1-\theta) I_{(\frac{1}{2},1]} $$ I've tried the following \begin{align} I(\theta) &=-E[\frac{d^2}{d\...
DEMB
  • 79
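One route here, sketched under the assumption that $\theta \in (0,1)$ and noting that the support $(-\tfrac{1}{2},1]$ does not depend on $\theta$, is the squared-score form: the score equals $1/\theta$ on $(-\tfrac{1}{2},0]$, $0$ on $(0,\tfrac{1}{2}]$, and $-1/(1-\theta)$ on $(\tfrac{1}{2},1]$, so
$$ I(\theta) = E\!\left[\left(\tfrac{d}{d\theta}\log f(X;\theta)\right)^{2}\right] = \frac{\theta}{2}\cdot\frac{1}{\theta^{2}} + \frac{1-\theta}{2}\cdot\frac{1}{(1-\theta)^{2}} = \frac{1}{2\theta(1-\theta)}. $$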
0 votes
0 answers
34 views

Trigamma-free Negative Binomial regression: doubts on Hessian and Fisher Information Matrix in the dispersion parameter

I have been looking at alternative versions of the Hessian and Fisher (expected) Information Matrix for the Negative Binomial regression specification, which are given by widely-cited academic sources ...
DrEti
  • 63
0 votes
0 answers
19 views

Calculation of the projection $w$ in Linear Discriminant Analysis

In an assignment focused on Linear Discriminant Analysis (LDA) there is this theoretical exercise: A dataset has been derived from two classes $\omega_A$ and $\omega_B$, the distributions of which are ...
Constantinos Pisimisis
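A sketch of the standard closed form, assuming the exercise asks for Fisher's linear discriminant with within-class scatter matrix $S_W$, between-class scatter $S_B$, and class means $\mu_A$, $\mu_B$: the projection maximises $J(w) = \frac{w^{T}S_B w}{w^{T}S_W w}$ and, up to scale,
$$ w \propto S_W^{-1}(\mu_A - \mu_B). $$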
1 vote
1 answer
39 views

Question about the Fisher information metric

Suppose we have two one-dimensional Gaussian probability distribution functions $f(x)$ and $g(x)$ with parameters $P=(\mu_1,\sigma_1)$ and $Q=(\mu_2,\sigma_2)$, so we know that we can give the ...
Andyale
  • 181
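For reference, a standard result stated here only as a sketch: in the coordinates $(\mu,\sigma)$, the Fisher information metric of the univariate Gaussian family is
$$ ds^{2} = \frac{d\mu^{2} + 2\,d\sigma^{2}}{\sigma^{2}}, $$
i.e. a rescaled hyperbolic metric on the upper half-plane, from which the geodesic distance between $P$ and $Q$ can be computed.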
0 votes
0 answers
28 views

Reference: Hellinger distance as a geodesic distance

I consider a statistical manifold equipped with the Fisher Information Metric. I want to show that for the exponential family (with no additional constraint), the Hellinger distance coincides with the ...
Ramufasa
1 vote
0 answers
45 views

What am I doing wrong when finding the Fisher information of a binomial distribution? [closed]

I am trying to find the Fisher information of a binomial distribution where $n=2$ and $p=\theta$. I have the log-likelihood function as $$n\ln 2 + \sum^{n}_{i=1}x_i\ln \theta + (2n-\sum^{n}_{i=1}x_i)(...
Peverel Shipley
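Assuming the intended model is $X_1,\ldots,X_n$ iid $\mathrm{Bin}(2,\theta)$, a benchmark to check the algebra against: with $\ell(\theta)=\sum_i \left[x_i\ln\theta + (2-x_i)\ln(1-\theta)\right] + \text{const}$ and $E[X_i]=2\theta$,
$$ I_n(\theta) = -E\!\left[\ell''(\theta)\right] = \sum_{i=1}^{n}\left(\frac{E[X_i]}{\theta^{2}} + \frac{2-E[X_i]}{(1-\theta)^{2}}\right) = \frac{2n}{\theta(1-\theta)}. $$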
0 votes
0 answers
22 views

Fisher Information and Parameter Space

I am reviewing Fisher information and saw that one of the requirements is that the distribution of the data, say $f(x|\theta)$, involves a parameter $\theta$ that is unknown but lies within a given ...
kpr62
  • 571
0 votes
2 answers
179 views

Deriving the Fisher information matrix for a reparameterised gamma distribution

Let $X \sim \mathrm{Gamma}(\alpha, \theta),$ where $$f(x) = \frac {x^{\alpha - 1} e^{-\frac x \theta}} {\theta^{\alpha}\Gamma(\alpha)}.$$ The log-likelihood function can be shown to be $$l(\alpha, \...
Ethan Mark
  • 2,187
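For the shape-scale parameterisation written above, the expected information matrix is a standard result, sketched here as a check (with $\psi'$ the trigamma function) and obtained from $-E[\partial^{2} l]$ using $E[X]=\alpha\theta$:
$$ I(\alpha,\theta) = \begin{pmatrix} \psi'(\alpha) & 1/\theta \\ 1/\theta & \alpha/\theta^{2} \end{pmatrix}. $$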
0 votes
0 answers
102 views

Intuition for vector calculus

In my statistics class, I was introduced to Fisher information. As it comes from the Taylor expansion in vector form, I wanted to know why the terms were ordered in a certain way - whether it was just to make ...
Jackanap3s
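A sketch of the expansion in question, under the usual regularity assumptions: for a log-likelihood $\ell(\theta)$ with vector parameter $\theta$, the second-order Taylor expansion around the MLE $\hat{\theta}$ is
$$ \ell(\theta) \approx \ell(\hat{\theta}) + (\theta-\hat{\theta})^{T}\nabla\ell(\hat{\theta}) + \tfrac{1}{2}(\theta-\hat{\theta})^{T}\nabla^{2}\ell(\hat{\theta})(\theta-\hat{\theta}), $$
where the gradient term vanishes at $\hat{\theta}$ and $-\nabla^{2}\ell(\hat{\theta})$ is the observed information; the row-vector, matrix, column-vector ordering is simply what makes each term a scalar.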
0 votes
0 answers
34 views

Differential inequality with KL-divergence and covariance

Let $p_t$ and $q_t$ be two families of probability densities on $\mathbb{R}^d$ indexed by time $t\geq 0$. Does the following differential inequality imply that the KL-divergence is identically zero? $$...
Vasily Ilin
2 votes
0 answers
41 views

Fisher information with known moments

I have a sequence $X^n$ of length $n$, where each $X_i$ takes a value from a finite set with probability vector $\mathbf{p} = [p_1, \ldots, p_K]^T$, i.e., $X_i \in [K]$, where $p_{X_i}(k) = p_k, k = 1,...
Abas
  • 355
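The excerpt is truncated, but a fact that is often the starting point for this kind of setup, taking $p_1,\ldots,p_{K-1}$ as the free parameters with $p_K = 1-\sum_{j<K} p_j$: the Fisher information matrix of a single categorical observation is
$$ [I(\mathbf{p})]_{jk} = \frac{\delta_{jk}}{p_j} + \frac{1}{p_K}, \qquad j,k = 1,\ldots,K-1. $$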
0 votes
0 answers
12 views

Difference between Likelihood Estimation and CRLB Estimation for Cooperative Radar

I do not know if this question fits this site, but I do not know of another place where I can ask. The question is about the difference between cooperative/collaborative radar systems when ...
Loco Citato
1 vote
0 answers
44 views

Calculating the Fisher information of a normal distribution [closed]

Suppose $X_1, \ldots, X_n$ are iid $\mathrm{N}(0, \exp (2 \gamma))$; that is, the density of $X_i$ is $$ (2 \pi)^{-1 / 2} e^{-\gamma} \exp \left(-x^2 e^{-2 \gamma} / 2\right) . $$ I want to calculate ...
trivial_fish
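A quick check of the target value, per observation and using the density as written: with $\log f = -\tfrac{1}{2}\log(2\pi) - \gamma - x^{2}e^{-2\gamma}/2$ and $E[X^{2}] = e^{2\gamma}$,
$$ I(\gamma) = -E\!\left[\frac{\partial^{2}}{\partial\gamma^{2}}\log f(X;\gamma)\right] = E\!\left[2X^{2}e^{-2\gamma}\right] = 2, $$
so a sample of size $n$ carries information $2n$.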
0 votes
0 answers
32 views

Derive the Cramér-Rao lower bound for $Var(\hat{\theta})$ given that $\mathbb{E}[\hat{\theta}U]=1$

I am trying to derive the Cramér-Rao lower bound for $Var(\hat{\theta})$ given that we already know $\mathbb{E}[U]=0$, $Var(U)=I(\theta)$ and $\mathbb{E}[\hat{\theta}U]=1$. I am struggling with using ...
Lucas
  • 1
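A sketch of the key step, using only the stated facts: by the Cauchy-Schwarz inequality, $\mathrm{Cov}(\hat{\theta},U)^{2} \le \mathrm{Var}(\hat{\theta})\,\mathrm{Var}(U)$, and since $\mathbb{E}[U]=0$ gives $\mathrm{Cov}(\hat{\theta},U)=\mathbb{E}[\hat{\theta}U]=1$,
$$ \mathrm{Var}(\hat{\theta}) \ge \frac{\mathrm{Cov}(\hat{\theta},U)^{2}}{\mathrm{Var}(U)} = \frac{1}{I(\theta)}. $$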
