
All Questions

Tagged with
0 votes · 1 answer · 23 views

Unbiased and consistent estimator with positive sampling variance as n approaches infinity? (Aronow & Miller) [duplicate]

In Aronow & Miller, "Foundations of Agnostic Statistics", the authors write on p105: [A]lthough unbiased estimators are not necessarily consistent, any unbiased estimator $\widehat{\...
user24465
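
A standard illustration of the first half of that sentence (an unbiased estimator that is not consistent), using nothing beyond an i.i.d. sample with mean $\mu$ and variance $\sigma^2 > 0$:

$$
\hat\mu_n = X_1, \qquad \mathbb{E}[\hat\mu_n] = \mu \ \text{ for every } n, \qquad \operatorname{Var}(\hat\mu_n) = \sigma^2 \not\to 0,
$$

so the estimator is unbiased at every sample size, yet its distribution never concentrates around $\mu$ and it is therefore not consistent.
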
5 votes · 2 answers · 523 views

Asymptotic unbiasedness + asymptotic zero variance = consistency?

Here, Ben shows that an unbiased estimator $\hat\theta$ of a parameter $\theta$ that has an asymptotic variance of zero converges in probability to $\theta$. That is, $\hat\theta$ is a consistent ...
Dave · 65k
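
For reference, a sketch of the standard argument behind that result, via the bias-variance decomposition and Markov/Chebyshev:

$$
P\big(|\hat\theta_n - \theta| \ge \varepsilon\big) \le \frac{\mathbb{E}\big[(\hat\theta_n - \theta)^2\big]}{\varepsilon^2} = \frac{\operatorname{Var}(\hat\theta_n) + \operatorname{Bias}(\hat\theta_n)^2}{\varepsilon^2} \longrightarrow 0,
$$

so vanishing (asymptotic) bias together with vanishing variance yields convergence in probability, i.e., consistency.
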
0 votes · 0 answers · 18 views

What is the difference between unbiasedness, consistency and efficiency of estimators? How are these interrelated among themselves? [duplicate]

![Efficiency](https://stackoverflow.com/20240427_193105.jpg) The given snapshot of the book states that among the class of consistent estimators, in general, more than one consistent estimator of a ...
Parth · 1
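
As a hypothetical illustration of "more than one consistent estimator" differing in efficiency, the sketch below compares the sample mean and the sample median as estimators of the center of a normal distribution; both are consistent, but the median has a larger sampling variance. All parameter values and names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 1_000, 5_000

# reps independent samples of size n from N(mu, sigma^2)
samples = rng.normal(mu, sigma, size=(reps, n))

mean_est = samples.mean(axis=1)          # sample mean of each sample
median_est = np.median(samples, axis=1)  # sample median of each sample

# Both estimators are centered on mu, but the median's sampling variance is
# larger (asymptotically pi/2 times that of the mean under normality).
print("var(sample mean):  ", mean_est.var())
print("var(sample median):", median_est.var())
print("ratio:             ", median_est.var() / mean_est.var())  # roughly 1.5-1.6
```
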
0 votes · 1 answer · 36 views

Assumptions needed for consistency of plug-in estimator

Assume $X,Z$ are random variables and let $x_0$ be a fixed number. I want to estimate $A =\mathbb{E}_{X,Z}[\frac{X}{P(X=x_0|Z)}]$. If $P(X=x_0|Z=z)$ is known for all $z$ we can apply the LLN and ...
James · 1
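
A minimal simulation sketch of the plug-in idea, under assumptions that are not in the excerpt: $X$ is binary with $x_0 = 1$ and $Z$ takes two values, so $A = \mathbb{E}[X / P(X=1\mid Z)] = 1$ by iterated expectations, and $P(X=1\mid Z=z)$ is replaced by the within-group frequency. Function names and parameter values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def plug_in_A(n):
    # Assumed setup (not from the question): Z is binary, and X | Z = z is
    # Bernoulli with success probability 0.3 or 0.7; we take x0 = 1.
    z = rng.integers(0, 2, size=n)
    p_true = np.where(z == 0, 0.3, 0.7)
    x = rng.binomial(1, p_true)

    # Plug-in step: replace P(X = 1 | Z = z) by the within-group frequency.
    p_hat = np.empty(n)
    for zval in (0, 1):
        mask = z == zval
        p_hat[mask] = x[mask].mean()

    # Plug-in estimate of A = E[X / P(X = 1 | Z)]; its true value here is 1.
    return np.mean(x / p_hat)

for n in (100, 1_000, 10_000, 100_000):
    print(n, plug_in_A(n))
```

The sketch also hints at the assumptions the question is after: the estimated probabilities must be consistent and bounded away from zero (overlap) for the plug-in average to behave the way the LLN argument suggests.
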
4 votes · 1 answer · 96 views

Mathematical Step for consistency

Let me state my problem from the beginning: let $i$ be an index representing countries ($i = \{1,2,\ldots,N\}$), and let $t$ represent time, i.e., the available data for country $i$ ($t = \{1,2,\ldots,T_i\}$...
Maximilian
1 vote · 1 answer · 119 views

Difference between consistent and unbiased estimator [duplicate]

I have a problem where I have to think of a practical example to explain consistency and unbiasedness. The example I thought of is the sample mean. Consistency is when the estimator (sample ...
stats_noob
1 vote · 1 answer · 218 views

Is convergence in probability implied by consistency of an estimator?

Every definition of consistency I see mentions something like convergence in probability in its explanation. From Wikipedia's definition of consistent estimators: having the property that as the ...
Estimate the estimators
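
For reference, weak consistency is usually *defined* as convergence in probability, so the implication is definitional rather than a separate theorem:

$$
\hat\theta_n \xrightarrow{\ p\ } \theta \quad\Longleftrightarrow\quad \lim_{n\to\infty} P\big(|\hat\theta_n - \theta| > \varepsilon\big) = 0 \quad \text{for every } \varepsilon > 0 .
$$
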
1 vote · 1 answer · 69 views

Does increasing the number of observations lead to a decrease in the mean squared error of consistent estimators?

I know that not all weakly consistent estimators exhibit MSE-consistency: https://stats.stackexchange.com/a/610835/397467. Anyway, does increasing the sample size lead to a reduction in their mean ...
whn · 11
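
One standard, schematic counterexample (not necessarily the one in the linked answer): an estimator can be weakly consistent while its MSE grows with $n$. Take

$$
\hat\theta_n = \theta + n\,B_n, \quad B_n \sim \mathrm{Bernoulli}(1/n): \qquad P\big(|\hat\theta_n - \theta| > \varepsilon\big) \le \tfrac{1}{n} \to 0, \qquad \mathbb{E}\big[(\hat\theta_n - \theta)^2\big] = n^2 \cdot \tfrac{1}{n} = n \to \infty,
$$

so a larger sample need not reduce the MSE of a weakly consistent estimator; MSE-consistency is the strictly stronger property.
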
2 votes · 2 answers · 682 views

What does the likelihood function converge to when sample size is infinite?

Let $\mathcal{L}(\theta\mid x_1,\ldots,x_n)$ be the likelihood function of parameters $\theta$ given i.i.d. samples $x_i$ with $i=1,\ldots,n$. I know that under some regularity conditions the $\theta$ ...
Tendero · 956
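
The usual way to make this precise is to scale by $1/n$ and work with the log-likelihood (the likelihood itself typically degenerates). A sketch under i.i.d. sampling from $f(\cdot\,;\theta_0)$:

$$
\frac{1}{n}\log \mathcal{L}(\theta \mid x_1,\ldots,x_n) = \frac{1}{n}\sum_{i=1}^n \log f(x_i;\theta) \ \xrightarrow{\ p\ }\ \mathbb{E}_{\theta_0}\big[\log f(X;\theta)\big],
$$

and $\mathbb{E}_{\theta_0}[\log f(X;\theta_0)] - \mathbb{E}_{\theta_0}[\log f(X;\theta)] = \mathrm{KL}\big(f_{\theta_0}\,\|\,f_{\theta}\big) \ge 0$, so the limiting criterion is maximized at the true $\theta_0$.
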
6 votes · 3 answers · 241 views

What is the $(\Omega,\mathcal{F},P_{\theta})$ that those $T_{n}$ are defined on?

Definition (Consistency) Let $T_1,T_2,\cdots,T_{n},\cdots$ be a sequence of estimators for the parameter $g(\theta)$ where $T_{n}=T_{n}(X_1,X_2,\cdots,X_{n})$ is a function of $X_{1},X_{2},\cdots,X_{n}...
Elisa · 330
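
One common formalization, assuming i.i.d. sampling: put every $T_n$ on the same infinite product space, so that all consistency statements live on a single $(\Omega,\mathcal{F},P_{\theta})$:

$$
(\Omega, \mathcal{F}, P_{\theta}) = \big(\mathcal{X}^{\infty},\ \mathcal{B}(\mathcal{X})^{\otimes\infty},\ P_{\theta}^{\otimes\infty}\big), \qquad T_n(\omega) = T_n(\omega_1,\ldots,\omega_n),
$$

where each $T_n$ depends only on the first $n$ coordinates and the product measure $P_{\theta}^{\otimes\infty}$ exists by Kolmogorov's extension theorem.
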
2 votes · 1 answer · 167 views

I need to prove that $\hat\theta=\max\{X_1,...,X_n\}$ is a mean square consistent estimator for $\theta$

Let $X_1,...,X_n$ be an i.i.d. sample from a population with distribution $U[0,\theta]$, i.e., $f_{X_i}(x)=\frac{1}{\theta}g_{[0,\theta]}(x)$ for $i=1, \ldots, n$, where \begin{align} g_{[0,\theta]}(x) = \begin{...
Willow Douglas
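
For reference, the standard computation: for an i.i.d. $U[0,\theta]$ sample, $\hat\theta = \max\{X_1,\ldots,X_n\}$ has density $n t^{n-1}/\theta^{n}$ on $[0,\theta]$, so

$$
\mathbb{E}[\hat\theta] = \frac{n}{n+1}\,\theta, \qquad \mathbb{E}[\hat\theta^{2}] = \frac{n}{n+2}\,\theta^{2}, \qquad \mathbb{E}\big[(\hat\theta - \theta)^{2}\big] = \frac{2\theta^{2}}{(n+1)(n+2)} \longrightarrow 0,
$$

which is exactly mean-square consistency.
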
0 votes · 1 answer · 142 views

Consistent or inconsistent estimator

If $\hat{\theta}_n$ is an estimator for the parameter $\theta$, then the two sufficient conditions to ensure consistency of $\hat{\theta}_n$ are: $\operatorname{Bias}(\hat{\theta}_n)\to 0$ and $\operatorname{Var}(\hat{\theta}_n)\...
user380598
3 votes · 1 answer · 88 views

Consistency of a simple estimator for $y_i = \beta_1 x_i + u_i$

Let $y_i = \beta_1 x_i + u_i$ for $i=1,2,\ldots,n$. If I define $$\hat \beta_1 = \frac{y_1 + y_n}{x_1 + x_n}$$ will my $\hat \beta_1$ be consistent in this setup? For my estimator to ...
Ujjwal · 43
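
A minimal simulation under an assumed data-generating process (the excerpt does not specify one): $x_i$ i.i.d. uniform on $[1,2]$ and $u_i$ i.i.d. standard normal, independent of the $x_i$. Because $\hat\beta_1$ uses only the first and last observations, its sampling spread does not shrink as $n$ grows, which is the informal reason it fails to be consistent. All names and values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
beta1, reps = 2.0, 5_000

def two_obs_estimator(n):
    # Assumed DGP: x_i ~ Uniform(1, 2) (bounded away from 0), u_i ~ N(0, 1).
    x = rng.uniform(1.0, 2.0, size=(reps, n))
    u = rng.normal(0.0, 1.0, size=(reps, n))
    y = beta1 * x + u
    # The proposed estimator uses only the first and last observations.
    return (y[:, 0] + y[:, -1]) / (x[:, 0] + x[:, -1])

for n in (10, 100, 1_000):
    bhat = two_obs_estimator(n)
    # The spread around beta1 stays roughly constant in n: no concentration,
    # hence no consistency, even though the estimator is unbiased here.
    print(n, bhat.std())
```
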
1 vote · 0 answers · 18 views

Are the classical moments consistently estimated from a single realization drawn from a given PSD?

Given a sequence $\{x_k\}_{k=-N}^{N}$ having power spectral density $S(f)$, we know that the "single realization PSD" $$ \frac{\Delta t^2}{T} \left| \sum_{k=-N}^{N} x_k \exp(-2\pi i f k \...
user14717 · 215
1 vote · 0 answers · 34 views

Does a linear regression assume that the (unconditional) predictor data is i.i.d?

Say I have a linear, cross-sectional relationship $y_{i}=x_{i}b+e_{i}$, where $E(e_{i}|X_{j})=0$ for all relevant $i,j$. Given this, one can prove that the OLS estimator is unbiased. However, ...
user121416
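
The unbiasedness step alluded to in the excerpt, written out for the no-intercept model there; it uses only the conditional-mean assumption, not any i.i.d. assumption on the $x_i$:

$$
\hat b = \frac{\sum_i x_i y_i}{\sum_i x_i^{2}} = b + \frac{\sum_i x_i e_i}{\sum_i x_i^{2}}, \qquad \mathbb{E}\big[\hat b \mid X\big] = b + \frac{\sum_i x_i\,\mathbb{E}[e_i \mid X]}{\sum_i x_i^{2}} = b .
$$

Sampling assumptions such as i.i.d. predictors typically enter when one turns to consistency and inference, not in this conditional unbiasedness calculation.
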
