
Questions tagged [estimators]

An estimator is a rule for calculating an estimate of a given quantity based on observed data [Wikipedia].

2 votes
0 answers
26 views

Model has higher (and closer to 1) $\beta$, but similar $R^2$ and correlation

I have model one, which produces prediction $\hat{y}_1$; later I came up with a new model which produces prediction $\hat{y}_2$. I have the ground truth $y$. The models are not regression-based, but they ...
zvi • 21
0 votes
0 answers
13 views

Combining information from different quantiles

I have a number of "mostly" Gaussian distributions (in truth a Gaussian core with longer tails). I am interested in the width of these distributions. Given that I do not know the amount of tails ...
nyw • 21
2 votes
1 answer
61 views

Variance of the estimator of the unconditional mean of an AR(1) process

An AR(1) process is defined as $y_t=c+\phi y_{t-1}+\varepsilon_t$, where $\varepsilon_t$ is IID with mean zero and variance $\sigma^2<\infty$. For a stationary process, i.e. $|\phi|<1$, the ...
Aksakal • 61.8k
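A standard asymptotic result behind the AR(1) question above, assuming stationarity with $|\phi|<1$: the sample mean $\bar{y}$ estimates the unconditional mean $\mu=c/(1-\phi)$ with
$$\operatorname{Var}(\bar{y})\approx\frac{1}{n}\cdot\frac{\sigma^2}{(1-\phi)^2},$$
i.e. the long-run variance $\sigma^2/(1-\phi)^2$ divided by the sample size, which collapses to the familiar $\sigma^2/n$ when $\phi=0$.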
0 votes
1 answer
48 views

Are all random variables estimators? [duplicate]

My hand-wavy understanding is that a random variable is a function from a domain of possible outcomes in a sample space to a measurable space valued in the real numbers. We might denote a random variable from ...
Estimate the estimators
5 votes
3 answers
682 views

Confidence intervals for proportions containing a theoretically impossible value (zero)

This is really a hypothetical question not related to an actual issue I have, so this question is just out of curiosity. I'm aware of this other related question What should I do when a confidence ...
Coris • 53
2 votes
0 answers
46 views

Non-existence of an efficient estimator

I need to prove that, given $(X_1,\dots,X_n)$ from the density $$\frac{1}{\theta}x^{\frac{1}{\theta}-1}\mathbb{1}_{(0,1)}(x),$$ no efficient estimator exists for $g(\theta)=\frac{1}{\theta+1}$. I have shown that ...
Onofrio Olivieri
0 votes
0 answers
41 views

Unbiased Estimator of Nugget Effect

Question: I am trying to measure the nugget effect, which is parameterized by $(1-\lambda)$ in the following variance-covariance matrix used to describe the multivariate normal distribution of my n-...
A Friendly Fish
4 votes
1 answer
49 views

How to accurately estimate the probability of a rare event in a large dataset?

I have a dataset of 30,155 names, and out of curiosity I verified that the longest name has 68 characters, which is quite long considering that the mean and SD were 24.78 and 5.64, respectively. Based on ...
WordP • 141
1 vote
0 answers
39 views

Calculating the mean and error for correlated measurements involving different estimators and quantiles

My goal is to find a way to report a mean $\pm$ error for different estimators and quantiles of the same distribution (same measurement). I am measuring the width of a distribution (Gaussian core and ...
nyw • 21
0 votes
1 answer
23 views

Unbiased and consistent estimator with positive sampling variance as n approaches infinity? (Aronow & Miller) [duplicate]

In Aronow & Miller, "Foundations of Agnostic Statistics", the authors write on p105: [A]lthough unbiased estimators are not necessarily consistent, any unbiased estimator $\widehat{\...
user24465
3 votes
2 answers
76 views

Maximum likelihood estimators for simple linear regression with $\sigma^2$ unknown

Suppose that we have the simple linear regression model of the form $$Y_i = \beta X_i +\varepsilon_i$$ with the following set of 'classical assumptions' holding: $E(\varepsilon_i)=0$, $Var(\...
hmmmm • 539
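For the regression-through-the-origin setup in the question above, assuming normal errors, a standard derivation gives
$$\hat\beta=\frac{\sum_i X_iY_i}{\sum_i X_i^2},\qquad \hat\sigma^2_{\mathrm{ML}}=\frac{1}{n}\sum_i\bigl(Y_i-\hat\beta X_i\bigr)^2,$$
where $\hat\sigma^2_{\mathrm{ML}}$ is biased downward; dividing by $n-1$ rather than $n$ gives the usual unbiased estimator.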
2 votes
1 answer
120 views

Sum of asymptotically independent random variables - Convergence

Let $\theta_N=\frac{1}{N}\sum_{i=1}^N \pi_i\cdot g_i$ where $0<\pi_i<1$ and $0<g_i<1/\pi_i$ such that $\theta_N\overset{N\rightarrow \infty}{\rightarrow}\theta$. If $X_i\sim Ber(\pi_i)$, I ...
Pierfrancesco Alaimo Di Loro
4 votes
2 answers
124 views

Must the maximum likelihood method be applied to a simple random sample or to a realisation?

I guess my trouble is not a big one, but here it is: when one applies maximum likelihood, one considers the realization $(x_1, \dots, x_n)$ of a simple random sample (SRS), leading to ML estimates. But ...
MysteryGuy
5 votes
2 answers
523 views

Asymptotic unbiasedness + asymptotic zero variance = consistency?

Here, Ben shows that an unbiased estimator $\hat\theta$ of a parameter $\theta$ that has an asymptotic variance of zero converges in probability to $\theta$. That is, $\hat\theta$ is a consistent ...
Dave • 65k
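One standard route to the result referenced above is Markov's inequality applied to the squared error: for any $\varepsilon>0$,
$$P\bigl(|\hat\theta_n-\theta|>\varepsilon\bigr)\le\frac{E\bigl[(\hat\theta_n-\theta)^2\bigr]}{\varepsilon^2}=\frac{\operatorname{Var}(\hat\theta_n)+\operatorname{bias}(\hat\theta_n)^2}{\varepsilon^2},$$
so if both the bias and the variance vanish as $n\to\infty$, then $\hat\theta_n\overset{p}{\to}\theta$.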
0 votes
0 answers
22 views

Is Coefficient of Variation a valid measure of relative efficiency?

I'm wondering if it is always valid to use Coefficient of Variation (CV) to determine relative efficiency of parameter estimators, and to compute statistically equivalent sample sizes based on that ...
feetwet • 1,162
1 vote
1 answer
72 views

Using Rao-Blackwell to improve the estimator of P(X/Y < t)

$X$ and $Y$ are independent $N(0, 1)$ random variables, and we want to approximate $P(X/Y \le t)$ for a fixed number $t$. The first part of the problem was to describe a naive Monte Carlo estimate. I described ...
stat_student123
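A minimal sketch of the comparison this question describes, assuming the usual Rao-Blackwellization that conditions on $Y$, so that $E[\mathbb{1}\{X/Y\le t\}\mid Y]=\Phi(t|Y|)$; the threshold $t$ and sample size below are illustrative only.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
t, n = 1.5, 100_000          # threshold and Monte Carlo sample size (illustrative)

x = rng.standard_normal(n)   # X ~ N(0, 1)
y = rng.standard_normal(n)   # Y ~ N(0, 1), independent of X

# Naive Monte Carlo estimate: average the indicator 1{X/Y <= t}.
naive = np.mean(x / y <= t)

# Rao-Blackwellized estimate: replace the indicator by its conditional
# expectation given Y, which equals Phi(t * |Y|).
rao_blackwell = np.mean(norm.cdf(t * np.abs(y)))

# X/Y is standard Cauchy, so the exact value is available for comparison.
exact = 0.5 + np.arctan(t) / np.pi
print(naive, rao_blackwell, exact)
```

The conditioned average has the same expectation as the naive one but a variance that is no larger (here strictly smaller), which is the point of the exercise.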
0 votes
0 answers
18 views

What is the difference between unbiasedness, consistency and efficiency of estimators? How are they interrelated? [duplicate]

[Image: Efficiency, https://stackoverflow.com/20240427_193105.jpg] The given snapshot of the book states that among the class of consistent estimators, in general, more than one consistent estimator of a ...
Parth • 1
6 votes
1 answer
165 views

Terminology clarification about sample moments

According to MathWorld (link), "The sample raw moments are unbiased estimators of the population raw moments", while Wikipedia (link) says: ...the $k$-th raw moment of a population ...
user1420303
1 vote
1 answer
34 views

Why can we get better asymptotic global estimators even for IID random variables?

Let $X_1,...,X_N$ be IID random variables sampled from a parametrised distribution $p_\theta$, and suppose my goal is to retrieve $\theta$ from these samples. We know that the MLE provides an ...
glS • 383
1 vote
0 answers
49 views

Standard practice to show Biased CRBs

I have a four-parameter estimation problem. I have derived the variances of the estimated parameters numerically using Monte Carlo simulations, and theoretically using the inverse of the ...
CfourPiO • 235
1 vote
1 answer
54 views

What is the distribution of the unbiased estimator of variance for normally distributed variables?

I must be making some mistake in my derivation of the distribution of the unbiased variance estimator for i.i.d. $X_{i} \sim \mathcal{N}\left(\mu, \sigma^{2}\right)$. We have $\bar{X} =\frac{1}{n}\sum\...
YEp d • 11
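For reference, the standard fact behind the question above: with i.i.d. $X_i\sim\mathcal{N}(\mu,\sigma^2)$ and $S^2=\frac{1}{n-1}\sum_i(X_i-\bar X)^2$,
$$\frac{(n-1)S^2}{\sigma^2}\sim\chi^2_{n-1},\qquad\text{equivalently}\qquad S^2\sim\operatorname{Gamma}\!\Bigl(\tfrac{n-1}{2},\ \text{scale}=\tfrac{2\sigma^2}{n-1}\Bigr),$$
so any correct derivation should reduce to a scaled chi-squared distribution with $n-1$ degrees of freedom.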
0 votes
0 answers
18 views

Demonstrating $SU=U(\sigma^2 I+D^2)$ as a Sufficient Condition in Maximum Likelihood Estimation

I am working on an exercise related to maximum likelihood estimation (in the context of principal component analysis) for the distribution $$p(x) = Gauss(b, WW^T+\sigma^2I)$$ In particular, I want to ...
Andrea • 153
1 vote
0 answers
23 views

Degrees of freedom for estimation

In the context of estimators, why is it that, in general, dividing by the degrees of freedom (instead of the sample size) leads to unbiasedness? I see the value in substituting degrees of freedom for ...
secretrevaler
0 votes
1 answer
36 views

Assumptions needed for consistency of plug-in estimator

Assume $X,Z$ are random variables and let $x_0$ be a fixed number. I want to estimate $A =\mathbb{E}_{X,Z}[\frac{X}{P(X=x_0|Z)}]$. If $P(X=x_0|Z=z)$ is known for all $z$ we can apply the LLN and ...
James • 1
2 votes
0 answers
21 views

When are mean and variance estimates uncorrelated or independent?

I know that in the case of the normal distribution, the MLE estimates of the mean and the variance are independent. My impression is that this is a rare property for a distribution to have. Are there ...
Snildt • 121
5 votes
2 answers
128 views

Sufficient conditions for asymptotic efficiency of MLE

Maximum likelihood estimators are, according to Wikipedia, asymptotically efficient; that is, they achieve the Cramér-Rao bound as the sample size tends to infinity. But this seems to require some ...
Luis Mendo • 1,099
1 vote
0 answers
58 views

Is there a good review on complete class theorems?

I'm trying to get an overview of the various results called "complete class theorems" and their relatives, especially the ones that say things along the lines of "every admissible ...
N. Virgo • 425
1 vote
2 answers
94 views

Covariance of Best Linear Unbiased Estimators and arbitrary LUE

I'm working on a problem involving two linear unbiased estimators $T$ and $T'$ of a parameter $\theta$, defined from a sample $\{X_1, \dots, X_n\}$ with mean $\theta$ and finite variance. I aim to ...
Taha Rhaouti
1 vote
0 answers
120 views

Distribution of $F_n^{-1}(3/4)-F_n^{-1}(1/4)$ [closed]

Given $X_1,X_2,\dots,X_n\overset{\text{iid}}{\sim}F$, find the distribution of the sample interquartile range, $F_n^{-1}(3/4)-F_n^{-1}(1/4)$, in terms of $F$, where $F_n$ is the empirical distribution ...
reyna • 385
4 votes
1 answer
109 views

Probability mass function of sample median (Bootstrap)

Consider a sample $X_1,X_2,\dots,X_n\overset{\text{iid}}{\sim}F$. Let $T_n=F_n^{-1}(1/2)$ be the sample median, where $F^{-1}(x)=\inf\{t:F(t)\ge x\}$ and $F_n(y)=\frac{1}{n}\sum_{i=1}^n\mathbb{I}(X_i\le ...
reyna • 385
