All Questions

5 votes
2 answers
523 views

Asymptotic unbiasedness + asymptotic zero variance = consistency?

Here, Ben shows that an unbiased estimator $\hat\theta$ of a parameter $\theta$ that has an asymptotic variance of zero converges in probability to $\theta$. That is, $\hat\theta$ is a consistent ...
Dave • 65k
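For reference, the standard argument is a Chebyshev/Markov bound (a sketch of the usual textbook step, not necessarily the derivation in the linked answer): if $\hat\theta_n$ is unbiased and $\operatorname{var}(\hat\theta_n)\to 0$, then for every $\epsilon>0$
$$P\big(|\hat\theta_n-\theta|\ge\epsilon\big)\le\frac{E\big[(\hat\theta_n-\theta)^2\big]}{\epsilon^2}=\frac{\operatorname{var}(\hat\theta_n)}{\epsilon^2}\longrightarrow 0,$$
so $\hat\theta_n\xrightarrow{p}\theta$. With only asymptotic unbiasedness the numerator is $\operatorname{bias}(\hat\theta_n)^2+\operatorname{var}(\hat\theta_n)$, and consistency follows as long as both terms vanish.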
1 vote
0 answers
23 views

Degrees of freedom for estimation

In the context of estimators, why is it that, in general, dividing by the degrees of freedom (instead of the sample size) leads to unbiasedness? I see the value in substituting degrees of freedom for ...
secretrevaler
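A minimal worked illustration of the degrees-of-freedom correction (the textbook variance example, offered here as an assumption about what the question is after): for an i.i.d. sample with mean $\mu$ and variance $\sigma^2$,
$$E\Big[\sum_{i=1}^n\big(X_i-\bar X\big)^2\Big]=(n-1)\,\sigma^2,$$
because replacing $\mu$ by $\bar X$ removes one degree of freedom. Dividing the sum by $n-1$ rather than $n$ therefore yields $E[S^2]=\sigma^2$, i.e. an unbiased estimator of the variance.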
1 vote
0 answers
58 views

Is there a good review on complete class theorems?

I'm trying to get an overview of the various results called "complete class theorems" and their relatives, especially the ones that say things along the lines of "every admissible ...
N. Virgo • 425
1 vote
0 answers
120 views

Distribution of $F_n^{-1}(3/4)-F_n^{-1}(1/4)$ [closed]

Given $X_1,X_2,...X_n\overset{\text{iid}}{\sim}F$, find the distribution of the sample interquartile range, $F_n^{-1}(3/4)-F_n^{-1}(1/4)$, in terms of $F$, where $F_n$ is the empirical distribution ...
reyna • 385
4 votes
1 answer
109 views

Probability mass function of sample median (Bootstrap)

Consider a sample $X_1,X_2,...X_n\overset{\text{iid}}{\sim}F$. Let $T_n=F_n^{-1}(1/2)$ be the sample median where, $F^{-1}(x)=\inf\{t:F(t)\ge x\}$ and $F_n(y)=\frac{1}{n}\sum_{i=1}^n\mathbb{I}(X_i\le ...
reyna • 385
0 votes
0 answers
14 views

Optimality criterion for mean estimators

Assume a sample size of $n>5$, a given variance $\sigma^2 > 0$, and a $\delta \in (2e^{-n/4}, 1/2)$. Prove that there exists a distribution with variance $\sigma^2$ such that for any mean ...
MrLCh • 1
1 vote
1 answer
119 views

Difference between consistent and unbiased estimator [duplicate]

I have to come up with a practical example illustrating consistency and unbiasedness. The example I thought of is the sample mean. Consistency is when the estimator (sample ...
stats_noob
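A compact pair of counterexamples that separates the two notions (standard examples, not taken from the linked thread): for i.i.d. data with mean $\theta$ and finite variance, $T_n=X_1$ is unbiased for every $n$ but not consistent, since its distribution never concentrates around $\theta$; conversely, $T_n=\bar X_n+1/n$ is biased for every finite $n$ yet consistent, because
$$E\big[(T_n-\theta)^2\big]=\frac{1}{n^2}+\frac{\operatorname{var}(X_1)}{n}\longrightarrow 0.$$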
3 votes
1 answer
52 views

Properties of statistical estimators when data is a collection of estimates

Assume I have a statistical estimator $\theta$ that has nice properties (say, unbiased and consistent) when the data $Y=\{y_1,y_2,\dots,y_n\}$ is i.i.d. (possibly with additional assumptions). But now,...
12345 • 213
3 votes
1 answer
166 views

How does Huber compute the $\operatorname{var}(s_n)/E[s_n]^2$ and $\operatorname{var}(d_n)/E[d_n]^2$?

(N.B. I am cross-posting this question from Math Stack Exchange since after x days I have still not received any responses.) How does Huber, in chapter 1 of the book 'Robust Statistical Procedures', compute ...
peter • 31
4 votes
1 answer
300 views

Cramer-Rao lower bound for the variance of unbiased estimators of $\theta = \frac{\mu}{\sigma}$

Let $X_1, \cdots, X_n$ be a sample from the $N(\mu, \sigma^2)$ density, where $\mu, \sigma^2$ are unknown. I want to find a lower bound $L_n$, valid for all sample sizes $n$, for the variance ...
Oscar24680
1 vote
1 answer
137 views

Fisher Information for $\bar{X}^2 - \frac{\sigma^2}{n}$ with $X_1, \dots, X_n$ normally distributed

I need to find the Fisher information for $T = \bar{X}^2 - \frac{\sigma^2}{n}$, where $X_1, \dots, X_n$ is a normally distributed sample with unknown mean $\mu$ and known variance $\sigma^2$. For this I'm ...
Peter Languilla
2 votes
0 answers
61 views

Is there a theory of M-Estimation for non-unique argmins?

Given some i.i.d. random variables $x_1,\ldots,x_n\in\mathbb R^d$, an M-estimator $\hat\theta_n\in\mathbb R^p$ is a parameter which minimizes $$\hat\theta_n=\arg\min_{\theta\in\Theta} \sum_{i=1}^n\...
Stratos supports the strike
6 votes
3 answers
241 views

What's the $(\Omega,\mathcal{F},P_{\theta})$ that those $T_{n}$ are defined on?

Definition (Consistency) Let $T_1,T_2,\cdots,T_{n},\cdots$ be a sequence of estimators for the parameter $g(\theta)$ where $T_{n}=T_{n}(X_1,X_2,\cdots,X_{n})$ is a function of $X_{1},X_{2},\cdots,X_{n}...
Elisa • 330
2 votes
1 answer
167 views

I need to prove that $\hat\theta=\max\{X_1,...,X_n\}$ is a mean square consistent estimator for $\theta$

Let $X_1,...,X_n$ be an i.i.d. sample from a population with distribution $U[0,\theta]$, i.e., $f_{X_i}(x)=\frac{1}{\theta}g_{[0,\theta]}(x)$ for $i=1, \ldots, n$, where \begin{align} g_{[0,\theta]}(x) = \begin{...
Willow Douglas
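A sketch of the usual computation (assuming the standard order-statistic density, not necessarily the intended solution path): $\hat\theta_n=\max_i X_i$ has density $f_{\hat\theta_n}(t)=n\,t^{n-1}/\theta^n$ on $[0,\theta]$, so
$$E[\hat\theta_n]=\frac{n}{n+1}\,\theta,\qquad E[\hat\theta_n^2]=\frac{n}{n+2}\,\theta^2,$$
and hence
$$E\big[(\hat\theta_n-\theta)^2\big]=\frac{n}{n+2}\theta^2-\frac{2n}{n+1}\theta^2+\theta^2=\frac{2\,\theta^2}{(n+1)(n+2)}\longrightarrow 0,$$
which is exactly mean-square consistency.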
1 vote
1 answer
59 views

How to find asymptotically normal estimator if I know probability density function [closed]

I have $X_1, X_2,\ldots,X_n$, a random sample of size $n$ from a distribution with probability density function $$p(x) = \theta^2xe^{-\theta x}I(x > 0).$$ How can I find an asymptotically normal ...
Karlos Margaritos
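One natural route for this density (a sketch, assuming the moment/maximum-likelihood estimator $\hat\theta_n=2/\bar X_n$, which the question itself does not fix): the density is Gamma with shape $2$ and rate $\theta$, so $E[X]=2/\theta$ and $\operatorname{var}(X)=2/\theta^2$. By the CLT, $\sqrt n\,(\bar X_n-2/\theta)\xrightarrow{d}N(0,2/\theta^2)$, and the delta method with $g(x)=2/x$, $g'(2/\theta)=-\theta^2/2$, gives
$$\sqrt n\,\big(\hat\theta_n-\theta\big)\xrightarrow{d}N\!\Big(0,\ \frac{\theta^4}{4}\cdot\frac{2}{\theta^2}\Big)=N\!\Big(0,\ \frac{\theta^2}{2}\Big),$$
so $\hat\theta_n$ is asymptotically normal with asymptotic variance $\theta^2/(2n)$.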
