All Questions tagged with estimators and probability
35 questions
0 votes · 1 answer · 48 views
Are all random variables estimators? [duplicate]
My hand-wavy understanding is that a random variable is a function from a sample space of possible outcomes to a measurable space, typically valued in the real numbers.
We might denote a random variable from ...
2 votes · 0 answers · 21 views
When are mean and variance estimates uncorrelated or independent?
I know that in the case of the normal distribution, the MLE estimates of the mean and the variance are independent. My impression is that this is a rare property for a distribution to have. Are there ...
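The independence claim for the normal case can be checked numerically. A minimal sketch, assuming NumPy is available: it estimates the correlation between the sample mean and the (MLE) sample variance across many normal samples, and contrasts this with exponential data, where the two estimates are strongly correlated; all parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 20000

# Normal data: sample mean and MLE variance are independent,
# so their empirical correlation should be near zero.
x = rng.normal(loc=1.0, scale=2.0, size=(reps, n))
r = np.corrcoef(x.mean(axis=1), x.var(axis=1))[0, 1]

# Contrast: for skewed (exponential) data the two estimates
# are clearly positively correlated.
y = rng.exponential(size=(reps, n))
r_exp = np.corrcoef(y.mean(axis=1), y.var(axis=1))[0, 1]

print(r, r_exp)
```

This is only a sanity check of the normal case, not evidence about how rare the property is in general.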
1 vote · 1 answer · 218 views
Is convergence in probability implied by consistency of an estimator?
Every definition of consistency I see mentions something like convergence in probability in its explanation.
From Wikipedia's definition of consistent estimators:
having the property that as the ...
3 votes · 1 answer · 78 views
Best estimate of conditional probability P(C|A and B) from P(C|A) and P(C|B)?
Assume I have three events A, B, and C, and I know the following probabilities:
Scenario 1:
$P(A)$ and $P(B)$
$P(C|A)$ and $P(C|B)$
Scenario 2:
I additionally know $P(C)$.
I am looking for $P(C|A\...
6 votes · 3 answers · 241 views
What's the $(\Omega,\mathcal{F},P_{\theta})$ that those $T_{n}$ are defined on?
Definition (Consistency)
Let $T_1,T_2,\cdots,T_{n},\cdots$ be a sequence of estimators for the parameter $g(\theta)$ where $T_{n}=T_{n}(X_1,X_2,\cdots,X_{n})$ is a function of $X_{1},X_{2},\cdots,X_{n}...
2 votes · 1 answer · 167 views
I need to prove that $\hat\theta=\max\{X_1,...,X_n\}$ is a mean square consistent estimator for $\theta$
Let $X_1,\ldots,X_n$ be an i.i.d. sample from a population with distribution $U[0,\theta]$, i.e.,
$f_{X_i}(x)=\frac{1}{\theta}g_{[0,\theta]}(x)$, for $i=1, \ldots, n$
where
\begin{align}
g_{[0,\theta]}(x) =
\begin{...
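The mean-square consistency claimed in the title can be checked by simulation. A minimal sketch, assuming NumPy; the closed-form MSE $2\theta^2/((n+1)(n+2))$ used as the theoretical benchmark is a standard result for the maximum of a $U[0,\theta]$ sample, and the values of $\theta$ and $n$ are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, reps = 3.0, 5000

def mse_of_max(n):
    """Monte Carlo estimate of E[(max(X_1..X_n) - theta)^2]."""
    samples = rng.uniform(0, theta, size=(reps, n))
    est = samples.max(axis=1)
    return np.mean((est - theta) ** 2)

mses = {n: mse_of_max(n) for n in (10, 100, 1000)}
for n, mse in mses.items():
    # Theory: MSE = 2 * theta^2 / ((n+1)(n+2)), which -> 0 as n grows.
    print(n, mse, 2 * theta**2 / ((n + 1) * (n + 2)))
```

The estimated MSE should shrink roughly like $1/n^2$, matching the theoretical column.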
1 vote · 0 answers · 156 views
Influence Function of M-Estimator
I know the following influence function for an M-estimator:
$IF(x_0,T,F_0)= $ $\frac{\psi(x_0)}{\mathbb{E}_{F_0}[\psi'(X)]}$
where $F_0$ is the centered model ($F_{\theta}(x)=F_0(x-\theta)$)
I am ...
1 vote · 1 answer · 3k views
Sufficient statistics for the Bernoulli distribution
Let $Y_1, \ldots, Y_n $ be a random sample of size $n$ where each $Y_i \sim \textrm{Bernoulli}(p), $ and
let $Y = \sum Y_i $ for $i = 1, \ldots, n.$
The estimator is $W=(Y+1)/(n+2)$.
Is the ...
1 vote · 1 answer · 2k views
Maximum likelihood estimator for power law with negative exponent
Background
I have data that roughly follows a power law with a negative exponent (up to a point; also, the parameters of the "fit" were just guesstimated by eye as a demonstration):
Now I ...
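For reference, the standard maximum-likelihood estimator for the exponent of a continuous power law with lower cutoff $x_{\min}$ is $\hat\alpha = 1 + n\big/\sum_i \ln(x_i/x_{\min})$. A minimal sketch on synthetic data, assuming NumPy; the true exponent, cutoff, and sample size are illustrative assumptions, not taken from the asker's data:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha_true, x_min, n = 2.5, 1.0, 100_000

# Sample from p(x) proportional to x^{-alpha} for x >= x_min
# via inverse-transform sampling: F(x) = 1 - (x/x_min)^{-(alpha-1)}.
u = rng.uniform(size=n)
x = x_min * (1 - u) ** (-1 / (alpha_true - 1))

# Continuous-case MLE for the exponent.
alpha_hat = 1 + n / np.log(x / x_min).sum()
print(alpha_hat)
```

In practice $x_{\min}$ is usually unknown and must itself be chosen or estimated before applying this formula, which is often the harder part of the fit.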
0 votes · 0 answers · 139 views
Subscripts for expectations and variances of estimators [duplicate]
Is there any significance to the subscripts on E and Var?
For example, the risk function of an estimator $\delta(\mathbf x)$ of $\theta$ in my book is:
$$
R(\theta,\delta)=E_\theta[L(\theta,\delta(\...
1 vote · 0 answers · 82 views
Estimating conditional probability when events are sampled
Suppose I have many people who eat different fruits (apples, oranges, bananas, etc.):
...
2 votes · 0 answers · 73 views
Different regularity conditions for finite population CLT
I am having trouble understanding the different regularity conditions for different versions of the finite population central limit theorem. I would greatly appreciate any help or insight anyone has.
...
1 vote · 0 answers · 77 views
Rewriting integral/summation as weighting estimator
I recently read a biostats paper which featured the following identity:
$$
\sum_{y, l, m} y P(y, l, m \mid c, a) \frac{P(l \mid a, c) P\left(m \mid a^{*}, c\right)}{P(l, m \mid c, a)}=E\left(Y \frac{P\...
1 vote · 1 answer · 78 views
Did I correctly apply the factorisation theorem in this example?
Suppose that we have a density $f(x,\theta)=c(\theta)\psi(x)\unicode{x1D7D9}(x \in]\theta,\theta+1[)$ and the random variables $\mathbf{X}=(X_1,\ldots,X_n)$ are independent and identically distributed ...
1 vote · 1 answer · 3k views
Sufficient estimator for Bernoulli distribution using the likelihood function theorem for sufficiency
Let $(X_1,X_2)$ be a random sample of two iid random variables, $X_1\sim Ber(\theta),\theta\in (0,1)$.
Use the following theorem to show that $\hat{\theta}=X_1+2X_2$ is sufficient.
Likelihood theorem ...