All Questions
Tagged with fisher-information · estimators
12 questions
1 vote · 1 answer · 34 views
Why can we get better asymptotic global estimators even for IID random variables?
Let $X_1,...,X_N$ be IID random variables sampled from a parametrised distribution $p_\theta$, and suppose my goal is to retrieve $\theta$ from these samples.
We know that the MLE provides an ...
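The tension this question points at can be seen numerically with Hodges' classic superefficient estimator. A minimal simulation sketch, assuming a $N(\theta, 1)$ model with the illustrative values $\theta = 0$, $n = 400$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.0, 400, 20_000

# MLE for the mean of N(theta, 1) is the sample mean; its variance
# attains the Cramer-Rao bound 1/n for every n.
xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)

# Hodges' estimator: shrink to 0 when the sample mean is small.
hodges = np.where(np.abs(xbar) >= n ** -0.25, xbar, 0.0)

print(n * xbar.var())    # ~ 1.0, i.e. the CRLB scaled by n
print(n * hodges.var())  # much smaller at theta = 0 (superefficiency)
```

At the special point $\theta = 0$ the shrunken estimator beats the bound asymptotically, which is exactly why "better than the CRLB" is possible pointwise even for IID data.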
4 votes · 1 answer · 296 views
Cramér–Rao lower bound for the variance of unbiased estimators of $\theta = \frac{\mu}{\sigma}$
Let $X_1, \cdots, X_n$ be a sample from the $N(\mu, \sigma^2)$ density, where $\mu, \sigma^2$ are unknown.
I want to find a lower bound $L_n$ which is valid for all sample-sizes $n$ for the variance ...
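For reference, the standard multiparameter CRLB computation for this target (a sketch; the per-observation Fisher information for $N(\mu, \sigma^2)$ is diagonal):

```latex
I(\mu, \sigma^2) =
\begin{pmatrix} 1/\sigma^2 & 0 \\ 0 & 1/(2\sigma^4) \end{pmatrix},
\qquad
g(\mu, \sigma^2) = \frac{\mu}{\sigma}, \quad
\nabla g = \begin{pmatrix} 1/\sigma \\ -\mu/(2\sigma^3) \end{pmatrix},
\qquad
L_n = \frac{1}{n}\,\nabla g^{T} I^{-1} \nabla g
    = \frac{1}{n}\left(1 + \frac{\mu^2}{2\sigma^2}\right)
    = \frac{1}{n}\left(1 + \frac{\theta^2}{2}\right).
```

This bound is valid for every sample size $n$, not just asymptotically.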
1 vote · 1 answer · 137 views
Fisher Information for $\bar{X}^2 - \frac{\sigma^2}{n}$ with $X_1, \dots, X_n$ normally distributed
I need to find the Fisher information for $T = \bar{X}^2 - \frac{\sigma^2}{n}$, where $X_1, \dots, X_n$ is a normally distributed sample with unknown mean $\mu$ and known variance $\sigma^2$. For this I'm ...
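As a sanity check, $E[\bar{X}^2] = \mu^2 + \sigma^2/n$, so $T$ is unbiased for $\mu^2$. A quick simulation sketch with assumed illustrative values $\mu = 2$, $\sigma = 3$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 2.0, 3.0, 50, 100_000

xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
# E[xbar^2] = mu^2 + sigma^2/n, so subtracting sigma^2/n debiases it.
T = xbar ** 2 - sigma ** 2 / n

print(T.mean())  # close to mu^2 = 4
```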
0 votes · 0 answers · 284 views
Fisher matrix for a discrete distribution
Let $\mathbf{X} = \{X_1, \ldots, X_n\}$ be a sample of i.i.d. variables following a discrete distribution with parameters $\mathbf{p}^T = (p_1, p_2, p_3)$. How can I find the Fisher information matrix ...
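If (as is usual) the probabilities are constrained by $p_1 + p_2 + p_3 = 1$, only $(p_1, p_2)$ are free, and the per-observation Fisher matrix has the closed form $I_{jk} = \delta_{jk}/p_j + 1/p_3$. A small numerical check of that formula, with illustrative probability values assumed:

```python
import numpy as np

# One categorical observation with P(X=k) = p_k, k = 1, 2, 3, and
# p3 = 1 - p1 - p2, so the free parameters are (p1, p2).
p1, p2 = 0.2, 0.5
p3 = 1.0 - p1 - p2

# Score vectors d/d(p1,p2) of log P(X=k) for each outcome k.
scores = np.array([
    [1 / p1, 0.0],        # k = 1: log p1
    [0.0, 1 / p2],        # k = 2: log p2
    [-1 / p3, -1 / p3],   # k = 3: log(1 - p1 - p2)
])
probs = np.array([p1, p2, p3])

# Fisher matrix = E[score score^T]; closed form: delta_jk/p_j + 1/p3.
I = np.einsum("k,ki,kj->ij", probs, scores, scores)
closed = np.diag([1 / p1, 1 / p2]) + 1 / p3
print(np.allclose(I, closed))  # True
```

For $n$ IID observations the matrix is simply multiplied by $n$.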
2 votes · 1 answer · 673 views
Cramér–Rao Lower Bound and UMVUE for $\frac1{\theta}$
Problem: Find the UMVUE of $\frac1\theta$ for a random sample from the population distribution with density $$f(x;\theta)=\theta x^{\theta-1}$$ and show that its variance reaches the Cramér–Rao lower ...
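The natural candidate here is $T = -\frac{1}{n}\sum \ln X_i$, since $-\ln X_i \sim \mathrm{Exp}(\theta)$ under this density. A simulation sketch (illustrative values $\theta = 3$, $n = 40$ assumed) checking that its variance matches the CRLB $1/(n\theta^2)$:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 40, 100_000

# If U ~ Uniform(0,1), then U**(1/theta) has density theta * x**(theta-1)
# on (0,1); equivalently -log X ~ Exponential(rate = theta).
x = rng.uniform(size=(reps, n)) ** (1 / theta)
T = -np.log(x).mean(axis=1)          # candidate estimator of 1/theta

print(T.mean())                      # ~ 1/theta
print(T.var() * n * theta ** 2)      # ~ 1: variance equals the CRLB 1/(n theta^2)
```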
0 votes · 0 answers · 1k views
Proof Sample Variance is Minimum Variance Unbiased Estimator for Unknown Mean
I am trying to prove that the unbiased sample variance is a minimum variance estimator. In this problem I have a Normal distribution with unknown mean (and the variance is the parameter to estimate so ...
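One subtlety worth recording (standard facts, stated here without the full Lehmann–Scheffé argument): the sample variance $S^2$ is UMVUE yet does not attain the CRLB,

```latex
\operatorname{Var}(S^2) = \frac{2\sigma^4}{n-1}
\;>\; \frac{2\sigma^4}{n} = \text{CRLB for } \sigma^2,
```

so the CRLB route alone cannot prove minimality. Instead, $(\bar{X}, S^2)$ is complete sufficient for $(\mu, \sigma^2)$ and $S^2$ is unbiased, hence UMVUE by Lehmann–Scheffé.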
1 vote · 0 answers · 157 views
Is this the only way to determine if a parameter can be estimated efficiently?
I am tasked with determining if a particular parameter can be estimated efficiently.
Given that an efficient estimator is an unbiased estimator which achieves the Cramér–Rao lower bound, is the only ...
6 votes · 1 answer · 592 views
Does a quadratic log-likelihood mean the MLE is (approximately) normally distributed?
So, in the usual case, one can prove from the asymptotic normality of a maximum likelihood estimator that the corresponding log-likelihood surface is quadratic near the MLE (e.g. in the proof of the ...
2 votes · 1 answer · 1k views
Variance of estimator seemingly lower than CRLB?
While practicing for a mid-term, I came across a question where I was asked to investigate the variance of $\frac{(n+1)Y_{n}}{n}$ where $Y_{n}$ is the largest observation of a random sample of size $n$...
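The usual resolution: for $\mathrm{Uniform}(0, \theta)$ the support depends on $\theta$, so the CRLB's regularity conditions fail and the bound need not hold. A simulation sketch with assumed illustrative values $\theta = 1$, $n = 10$:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 1.0, 10, 200_000

# Uniform(0, theta): the support depends on theta, so the CRLB's
# regularity conditions fail and the "bound" theta^2/n need not apply.
y_max = rng.uniform(0, theta, size=(reps, n)).max(axis=1)
est = (n + 1) * y_max / n            # unbiased for theta

print(est.mean())                    # ~ theta
print(est.var())                     # ~ theta^2/(n(n+2)), far below theta^2/n
```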
5 votes · 1 answer · 3k views
Standard error of the estimate in logistic regression
We usually get an estimate of $\beta$ in logistic regression by finding the MLE from the observed random samples $X_1, X_2, \ldots, X_N$. Then we use Wald's test, i.e. $[\hat \beta / S.E.(\hat \beta)]$ ...
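The standard error here comes from the inverse Fisher information $(X^T W X)^{-1}$ evaluated at $\hat\beta$, with $W = \mathrm{diag}(\hat p_i (1 - \hat p_i))$. A self-contained Newton–Raphson sketch on simulated data (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + 1 covariate
beta_true = np.array([-0.5, 1.0])
y = rng.uniform(size=n) < 1 / (1 + np.exp(-X @ beta_true))

# Newton-Raphson for the logistic MLE (the log-likelihood is concave).
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T * W @ X, X.T @ (y - p))

# Standard errors: sqrt of the diagonal of the inverse Fisher information.
p = 1 / (1 + np.exp(-X @ beta))
cov = np.linalg.inv(X.T * (p * (1 - p)) @ X)
se = np.sqrt(np.diag(cov))
print(beta, se)  # the Wald statistic is beta / se
```

This is the same quantity packaged statistics software reports as the coefficient standard error.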
1 vote · 1 answer · 340 views
Confidence interval for a function of the MLE
I am studying an old assignment in which I have calculated the MLE for a sample from an exponential distribution. It then gives the formula for the median of an exponential distribution $\ln(2)/\...
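One common route (a sketch, not necessarily the assignment's intended one): by invariance, the MLE of the median is $\ln(2)/\hat\lambda$, and the delta method gives its standard error. Illustrative parameter values assumed:

```python
import numpy as np

rng = np.random.default_rng(5)
lam, n = 2.0, 200
x = rng.exponential(1 / lam, size=n)   # numpy parametrises by scale = 1/rate

lam_hat = 1 / x.mean()                 # MLE of the rate
med_hat = np.log(2) / lam_hat          # invariance: MLE of the median ln(2)/lambda

# Delta method: Var(med_hat) ~ (ln 2)^2 / (n lambda^2), estimated with lam_hat.
se = np.log(2) / (lam_hat * np.sqrt(n))
ci = (med_hat - 1.96 * se, med_hat + 1.96 * se)
print(med_hat, ci)
```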
1 vote · 0 answers · 105 views
Fisher information $J_y(\theta)$ for transformation $y=F(x)$
Consider a multivariate random variable $x$ with density function $P_x(\theta)$ for a scalar parameter $\theta$. Assume the Fisher information $J_x(\theta)$ is known.
Now, for a transformation (...
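A standard fact that may be what is needed here (stated as a sketch): Fisher information can only shrink under a transformation of the data, and is preserved exactly when no information about $\theta$ is lost,

```latex
J_y(\theta) \;\le\; J_x(\theta),
\qquad
J_y(\theta) = J_x(\theta)
\ \text{ if } y = F(x) \text{ is a bijection not depending on } \theta
\ \text{ (more generally, iff } y \text{ is sufficient for } \theta).
```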