
All Questions

0 votes
1 answer
48 views

Are all random variables estimators? [duplicate]

My hand-wavy understanding is that a random variable is a function from a sample space of possible outcomes to a real-valued measurable space. We might denote a random variable from ...
Estimate the estimators
2 votes
0 answers
21 views

When are mean and variance estimates uncorrelated or independent

I know that in the case of the normal distribution, the MLE estimates of the mean and the variance are independent. My impression is that this is a rare property for a distribution to have. Are there ...
Snildt • 121
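Not from the thread, but a quick simulation sketch of the property being asked about: for normal samples the sample mean and sample variance are independent (so uncorrelated), while for a skewed distribution such as the exponential they are noticeably correlated. The `mean_var_corr` helper is a name introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_var_corr(sampler, n=20, reps=20000):
    """Correlation between the sample mean and sample variance
    across many simulated samples of size n."""
    x = sampler((reps, n))
    means = x.mean(axis=1)
    variances = x.var(axis=1, ddof=1)
    return np.corrcoef(means, variances)[0, 1]

# Normal: mean and variance estimates are independent, so corr ≈ 0.
corr_normal = mean_var_corr(lambda s: rng.normal(size=s))
# Exponential (skewed): the two estimates are clearly correlated.
corr_expon = mean_var_corr(lambda s: rng.exponential(size=s))
print(corr_normal, corr_expon)
```

The contrast reflects that $\operatorname{Cov}(\bar X, S^2) = \mu_3/n$ depends on the third central moment, which vanishes for symmetric distributions such as the normal.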
1 vote
1 answer
218 views

Is convergence in probability implied by consistency of an estimator?

Every definition of consistency I see mentions something like convergence in probability in its explanation. From Wikipedia's definition of consistent estimators: having the property that as the ...
Estimate the estimators
3 votes
1 answer
78 views

Best estimate of conditional probability P(C|A and B) from P(C|A) and P(C|B)?

Assume I have three events A, B, and C, and I know the following probabilities: Scenario 1: $P(A)$ and $P(B)$ $P(C|A)$ and $P(C|B)$ Scenario 2: I additionally know $P(C)$. I am looking for $P(C|A\...
Remirror • 131
6 votes
3 answers
241 views

What's the $(\Omega,\mathcal{F},P_{\theta})$ that the $T_{n}$ are defined on?

Definition (Consistency) Let $T_1,T_2,\cdots,T_{n},\cdots$ be a sequence of estimators for the parameter $g(\theta)$ where $T_{n}=T_{n}(X_1,X_2,\cdots,X_{n})$ is a function of $X_{1},X_{2},\cdots,X_{n}...
Elisa • 330
2 votes
1 answer
167 views

I need to prove that $\hat\theta=\max\{X_1,...,X_n\}$ is a mean square consistent estimator for $\theta$

Let $X_1,\ldots,X_n$ be an i.i.d. sample from a population with distribution $U[0,\theta]$, i.e., $f_{X_i}(x)=\frac{1}{\theta}g_{[0,\theta]}(x)$ for $i=1, \ldots, n$, where \begin{align} g_{[0,\theta]}(x) = \begin{...
Willow Douglas
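As a quick check (not part of the thread): for $\hat\theta=\max\{X_1,\ldots,X_n\}$ with $X_i \sim U[0,\theta]$, the MSE has the closed form $\frac{2\theta^2}{(n+1)(n+2)} \to 0$, which is exactly mean-square consistency. A simulation sketch comparing Monte Carlo MSE against the closed form:

```python
import numpy as np

rng = np.random.default_rng(1)

def mse_max_estimator(theta, n, reps=100_000):
    """Monte Carlo MSE of theta_hat = max(X_1..X_n), X_i ~ U[0, theta]."""
    est = rng.uniform(0, theta, size=(reps, n)).max(axis=1)
    return np.mean((est - theta) ** 2)

theta = 3.0
for n in (10, 100, 1000):
    exact = 2 * theta**2 / ((n + 1) * (n + 2))  # closed-form MSE
    print(n, mse_max_estimator(theta, n), exact)
```

The closed form follows from $E[\hat\theta]=\frac{n\theta}{n+1}$ and $E[\hat\theta^2]=\frac{n\theta^2}{n+2}$, which the beta-like density of the maximum gives directly.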
1 vote
0 answers
156 views

Influence Function of an M-Estimator

I know the following influence function for an M-estimator: $IF(x_0,T,F_0)=\frac{\psi(x_0)}{\mathbb{E}_{F_0}[\psi'(X)]}$, where $F_0$ is the centered model ($F_{\theta}(x)=F_0(x-\theta)$). I am ...
Jonathan Baram
1 vote
1 answer
3k views

Sufficient statistics for the Bernoulli distribution

Let $Y_1, \ldots, Y_n $ be a random sample of size $n$ where each $Y_i \sim \textrm{Bernoulli}(p), $ and let $Y = \sum Y_i $ for $i = 1, \ldots, n.$ The estimator is $W= (Y+1)/(n+2). $ Is the ...
asjndna999
1 vote
1 answer
2k views

Maximum likelihood estimator for power law with negative exponent

Background I have data that roughly follows a power law with a negative exponent (up to a point; also, the parameters of the "fit" were just guesstimated by eye as a demonstration): Now I ...
mapf • 105
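Not from the thread, but relevant background for this question: for a continuous power law $p(x) \propto x^{-\alpha}$ above a cutoff $x_{\min}$, the exponent has a closed-form MLE, $\hat\alpha = 1 + n\left(\sum_i \ln \frac{x_i}{x_{\min}}\right)^{-1}$, so no by-eye fitting is needed. A sketch with simulated data (the function name is mine, and $x_{\min}$ is assumed known):

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_power_law_exponent(x, x_min):
    """Closed-form MLE of alpha for p(x) ∝ x^(-alpha), x >= x_min."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + len(x) / np.sum(np.log(x / x_min))

# Simulate from a power law with alpha = 2.5 by inverse-transform
# sampling: F(x) = 1 - (x/x_min)^(-(alpha-1)).
alpha, x_min, n = 2.5, 1.0, 50_000
u = rng.uniform(size=n)
samples = x_min * (1 - u) ** (-1.0 / (alpha - 1))
print(fit_power_law_exponent(samples, x_min))
```

Note this assumes the power law holds all the way down to $x_{\min}$; if it only holds "up to a point", as in the question, the data should be truncated to the power-law regime first.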
0 votes
0 answers
139 views

Subscripts for expectations and variances of estimators [duplicate]

Is there any significance to the subscripts on E and Var? For example, the risk function of an estimator $\delta(\mathbf x)$ of $\theta$ in my book is: $$ R(\theta,\delta)=E_\theta[L(\theta,\delta(\...
Zachary Peskin
1 vote
0 answers
82 views

Estimating conditional probability when events are sampled

Suppose I have many people who eat different fruits (apples, oranges, bananas &c): ...
sds • 2,246
2 votes
0 answers
73 views

Different regularity conditions for finite population CLT

I am having trouble understanding the different regularity conditions for different versions of the finite population central limit theorem. I would greatly appreciate any help or insight anyone has. ...
Student_718
1 vote
0 answers
77 views

Rewriting an integral/summation as a weighting estimator

I recently read a biostats paper which featured the following identity: $$ \sum_{y, l, m} y P(y, l, m \mid c, a) \frac{P(l \mid a, c) P\left(m \mid a^{*}, c\right)}{P(l, m \mid c, a)}=E\left(Y \frac{P\...
Alex • 11
1 vote
1 answer
78 views

Did I correctly apply the factorisation theorem in this example?

Suppose that we have a density $f(x,\theta)=c(\theta)\psi(x)\unicode{x1D7D9}(x \in]\theta,\theta+1[)$ and the random variable $\mathbf{X}=(X_1,\ldots,X_n)$ are independently identically distributed ...
Hijaw • 155
1 vote
1 answer
3k views

Sufficient estimator for Bernoulli distribution using the likelihood function theorem for sufficiency

Let $(X_1,X_2)$ be a random sample of two iid random variables, $X_1\sim Ber(\theta),\theta\in (0,1)$. Use the following theorem to show that $\hat{\theta}=X_1+2X_2$ is sufficient. Likelihood theorem ...
stats19 • 61
