
All Questions

1 vote
1 answer
102 views

Formal definition of sufficient statistic

Let $(\Omega_X,\mathcal{F}_X)$ and $(\Omega _T,\mathcal{F}_T)$ be measurable spaces. Let $\mathfrak{M}$ be a family of probability measures on $(\Omega_X,\mathcal{F}_X)$. Let $X:\Omega\to \Omega _X$ ...
rfloc • 133
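One standard formulation of the definition being asked about, sketched here on the assumption that the statistic is a measurable map $T:\Omega_X\to\Omega_T$ (the excerpt is cut off before $T$ is introduced):

$$T \text{ is sufficient for } \mathfrak{M} \iff \forall A\in\mathcal{F}_X\ \exists\, g_A:\Omega_T\to[0,1] \text{ measurable with } P\big(A\mid \sigma(T)\big)=g_A\circ T\ \ P\text{-a.s. for every } P\in\mathfrak{M}.$$

The point of the definition is that a single version of the conditional probability works simultaneously for every measure in the family.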
2 votes
1 answer
115 views

Borel-Cantelli lemma on conditional probabilities

In a probability space $\big( \Omega, \mathcal{F}, P \big)$, suppose $\{E_n\}_{n\in \mathbb{N}} \subseteq \mathcal{F}$ is a sequence of mutually independent events. By the Borel–Cantelli lemma, the ...
Sanae Kochiya
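For reference, the two halves of the Borel–Cantelli lemma the excerpt appeals to (independence is only needed for the second half):

$$\sum_n P(E_n)<\infty \;\Rightarrow\; P\big(\limsup_n E_n\big)=0, \qquad \sum_n P(E_n)=\infty \text{ and } \{E_n\} \text{ independent} \;\Rightarrow\; P\big(\limsup_n E_n\big)=1.$$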
0 votes
0 answers
24 views

Concepts and notation for MLE (likelihood) and MAP

In general, we say that $X_1, X_2, \ldots, X_i$ are from a certain distribution, which can be represented by $f(x;\theta)$, where $\theta$ is an unknown parameter. When I read content related to MLE or the likelihood ...
Hou ZeYu • 101
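A short sketch of the standard notation the excerpt asks about, assuming an i.i.d. sample $x_1,\ldots,x_n$ and, for the MAP case, a prior $\pi(\theta)$ (both assumptions, since the post is truncated):

$$L(\theta)=\prod_{i=1}^n f(x_i;\theta),\qquad \hat\theta_{\text{MLE}}=\arg\max_\theta L(\theta),\qquad \hat\theta_{\text{MAP}}=\arg\max_\theta L(\theta)\,\pi(\theta).$$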
0 votes
0 answers
58 views

Expected value of a variable that depends on two other random variables, one of which itself depends on another random variable

I am trying to solve the problem using conditional expectations. The expected value $H$ depends on the waiting time $T$ and a set threshold $X$ (a real number that is a constant random variable during ...
ton_K • 1
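A minimal sketch of the iterated-conditioning approach the excerpt mentions, stated in its notation $H$, $T$, $X$; the exact dependence structure is cut off, so only the general identity is shown:

$$E[H]=E\big[E[H\mid T,X]\big]=E\Big[\,E\big[\,E[H\mid T,X]\,\big|\,X\,\big]\Big].$$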
1 vote
1 answer
62 views

If $P(A|D) > P(A)$ and $P(B|D) > P(B)$, then is $P(A \cap B|D) > P(A \cap B)$?

There are 3 events $A, B, D$ such that $D$ makes $A$ more likely and $D$ makes $B$ more likely. Does this mean that $D$ makes it more likely that both $A$ and $B$ occur? How can you prove this using ...
katln • 11
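The implication in the title fails in general. A minimal counterexample (constructed here, not taken from the original post): take $\Omega=\{1,\dots,8\}$ with the uniform measure and

$$D=\{1,2,3,4\},\qquad A=\{1,2,8\},\qquad B=\{3,4,8\}.$$

Then $P(A\mid D)=\tfrac12>\tfrac38=P(A)$ and $P(B\mid D)=\tfrac12>\tfrac38=P(B)$, yet $A\cap B=\{8\}$ is disjoint from $D$, so $P(A\cap B\mid D)=0<\tfrac18=P(A\cap B)$.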
0 votes
1 answer
46 views

How to mathematically express the fact that the conditional probability $P(Y|X)$ can be independent of $P(X)$?

Mathematically, $P(Y|X) = \frac{P(X,Y)}{P(X)}$, and so $P(Y|X)$ must depend on $P(X)$, since $P(Y|X)$ will change when $P(X)$ changes. However, consider this scenario: X = amount of red meat consumed ...
Legendre • 217
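One way to phrase the point in the question: the conditional law is a property of the mechanism relating $Y$ to $X$, so changing the marginal of $X$ while keeping that mechanism fixed leaves every conditional untouched. A hedged sketch, writing the fixed mechanism as $q(y\mid x)$ (notation introduced here):

$$P(X=x,Y=y)=P(X=x)\,q(y\mid x)\;\Longrightarrow\;P(Y=y\mid X=x)=\frac{P(X=x)\,q(y\mid x)}{P(X=x)}=q(y\mid x),$$

which is the same for any choice of the marginal $P(X)$ with $P(X=x)>0$.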
1 vote
1 answer
69 views

What is going on? Contradictory results on the variance of a random vector with random mean and covariance

Suppose $f\mid\mu, F\sim N(\mu, F) \in \mathbb{R}^n$, where $\mu, F$ are both random (a random vector and a random matrix, respectively). What is the correct way to derive $Var(f)$? First, let $\tilde{f} ...
K C • 51
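For reference, the decomposition that usually resolves this kind of apparent contradiction, applied to the excerpt's model $f\mid \mu,F \sim N(\mu,F)$ (law of total variance, stated for random vectors with covariance matrices):

$$\operatorname{Var}(f)=E\big[\operatorname{Var}(f\mid \mu,F)\big]+\operatorname{Var}\big(E[f\mid \mu,F]\big)=E[F]+\operatorname{Var}(\mu).$$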
1 vote
0 answers
58 views

Binomial distribution conditional on the weighted sum?

Suppose $\mathbf{X}$ is a vector of $n$ i.i.d. Bernoulli variables with fixed success probability $p$, so the variance of their sum is $np(1-p)$. Now suppose I am interested in the conditional probability of $...
entropy • 19
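The excerpt is cut off before the conditioning variable is defined. As background only: for the unweighted sum, exchangeability of the i.i.d. Bernoullis gives

$$P\Big(X_i=1 \,\Big|\, \sum_{j=1}^n X_j = s\Big)=\frac{s}{n},$$

with the conditional law of $\mathbf{X}$ uniform over the $\binom{n}{s}$ arrangements; a weighted sum generally breaks this symmetry, which is presumably what the question is about.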
1 vote
0 answers
16 views

Expression for a Markov kernel sampling indices in $\{0, \ldots, T\}$ according to weights depending on another variable

I have a vector $x = (x_0, \ldots, x_T)$ and given this vector, I would like to sample an index $k$ between $0$ and $T$. The probability of sampling index $k$ is given by a weight $w_k$ that is a ...
Euler_Salter • 2,236
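A hedged sketch of one natural way to write the kernel described in the excerpt, assuming the weights $w_k(x)\ge 0$ are measurable in $x$ with $\sum_j w_j(x)>0$:

$$K\big(x,\{k\}\big)=\frac{w_k(x)}{\sum_{j=0}^{T} w_j(x)},\qquad k\in\{0,\ldots,T\},$$

i.e. a kernel from the space of $x$ to the finite index set, normalized pointwise in $x$.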
1 vote
0 answers
42 views

Formula of $\text{Var}[X|Y,Z]$ for $X\sim \mathcal N(\mu_X,\sigma_X^2)$, $Y\sim \mathcal N(\mu_Y,\sigma_Y^2)$, $Z\sim \mathcal N(\mu_Z,\sigma_Z^2)$? [duplicate]

How do I condition the variance of a normally distributed random variable on two other normally distributed random variables? How do I condition the expectation of a normally distributed random ...
anonymous
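Marginal normality of $X$, $Y$, $Z$ alone does not determine these quantities. Assuming the three are jointly Gaussian (an added assumption, not stated in the excerpt) and writing $W=(Y,Z)^\top$, the standard conditioning formulas are

$$E[X\mid Y,Z]=\mu_X+\Sigma_{XW}\,\Sigma_{WW}^{-1}\,(W-\mu_W),\qquad \operatorname{Var}(X\mid Y,Z)=\sigma_X^2-\Sigma_{XW}\,\Sigma_{WW}^{-1}\,\Sigma_{WX}.$$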
3 votes
1 answer
447 views

Prove that the sum is sufficient using the definition of sufficiency

If $X_1,\ldots,X_n$ is an IID random sample, with $X_i\sim\,\text{Ber}(\theta)$, prove that $Y = \sum_i X_i$ is sufficient using the definition of sufficiency (not the factorization criterion). Now ...
laurab • 145
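A sketch of the computation the definition-based proof comes down to (the original post is cut off before its own attempt): for any binary vector $(x_1,\ldots,x_n)$ with $\sum_i x_i=y$,

$$P\big(X_1=x_1,\ldots,X_n=x_n \,\big|\, Y=y\big)=\frac{\theta^{y}(1-\theta)^{n-y}}{\binom{n}{y}\theta^{y}(1-\theta)^{n-y}}=\binom{n}{y}^{-1},$$

which does not involve $\theta$, so the conditional distribution of the sample given $Y$ is the same for every $\theta$ and $Y$ is sufficient.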
0 votes
1 answer
644 views

Ising model and conditional probabilities

I am trying to understand how the conditional probabilities of an Ising model with the following joint probability are given by a logistic regression model, as shown in the attached image. I'm more than ...
Booba • 3
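The image referenced in the excerpt is not reproduced here. As background, under an assumed $\{0,1\}$ spin coding with joint $P(x)\propto\exp\big(\sum_i \alpha_i x_i+\sum_{i<j}\beta_{ij}x_i x_j\big)$, the full conditional of one spin given the rest is logistic:

$$P\big(x_i=1\mid x_{-i}\big)=\sigma\Big(\alpha_i+\sum_{j\neq i}\beta_{ij}x_j\Big),\qquad \sigma(t)=\frac{1}{1+e^{-t}},$$

since the log-odds of $x_i$ is linear in the neighbouring spins; with $\{-1,+1\}$ coding the constants change but the logistic form remains.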
2 votes
2 answers
203 views

How to find the probability of an unobserved binary variable from repeated noisy observations?

Let $Y \in \{0,1\}; P(Y=1)=\beta$. We have no observations of $Y$. Instead, we observe a sample of $A$,$B$. We can assume that $P(A,B|Y)=P(A|Y)P(B|Y)$; $P(A=Y)=P(B=Y)=\alpha$; and that $P(A=Y|Y)=P(A=Y)...
groceryheist
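A sketch of the Bayes step this setup typically reduces to, using only quantities named in the excerpt (the prior $\beta$, the accuracy $\alpha$, and conditional independence of $A,B$ given $Y$), with $P(A=a\mid Y=y)=\alpha$ when $a=y$ and $1-\alpha$ otherwise, and likewise for $B$:

$$P(Y=1\mid A=a,B=b)=\frac{P(A=a\mid Y=1)\,P(B=b\mid Y=1)\,\beta}{\sum_{y\in\{0,1\}}P(A=a\mid Y=y)\,P(B=b\mid Y=y)\,P(Y=y)}.$$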
0 votes
0 answers
32 views

Estimating probability of an event at multiple points in time

I have a machine learning model which estimates the churn probability in the next 6 months for a given customer. Is there any mathematical way to estimate the ...
function • 141
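One common back-of-envelope conversion, valid only under an added constant-hazard assumption that is not stated in the question: if $p_6$ is the 6-month churn probability, then for a horizon of $t$ months

$$P(\text{churn by } t)=1-(1-p_6)^{t/6},$$

which reduces to $p_6$ at $t=6$ and should be treated as a rough approximation rather than a property of the model.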
0 votes
1 answer
93 views

hypothetical statistical test - type I and type II errors

A hypothetical statistical hypothesis test that can be used for any type of hypothesis is conducted by drawing a random number between 0 and 1 and rejecting the null hypothesis if it is less than 0.05,...
Cabbage Roll
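For the test described (reject when a Uniform(0,1) draw falls below 0.05, independently of the data), the error rates follow directly:

$$\alpha=P(\text{reject}\mid H_0)=0.05,\qquad \text{power}=P(\text{reject}\mid H_1)=0.05,\qquad \beta=P(\text{fail to reject}\mid H_1)=0.95.$$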
