Suppose $\mathbf{X}$ is a vector of $n$ iid Bernoulli variables with fixed success probability $p$. The variance of the sum $1^T\mathbf{X}$ is $np(1-p)$.

Now suppose I am interested in the conditional probability of $s$ successes given the weighted sum of the Bernoulli RVs; formally, $P(1^TX=s \mid w^TX = w^Tx)$. What would that pmf look like?

In particular, how could I prove that $Var(1^TX) \geq Var(1^TX \mid w^TX = w^Tx)$?
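For the pmf question, counting outcomes gives one explicit form, and the law of total variance gives the on-average version of the variance inequality. A sketch, writing $S = 1^TX$ and $W = w^TX$ (the pointwise inequality for a fixed value $c = w^Tx$ can fail or hold depending on the weights, as the comments below discuss):

```latex
% Every x with 1^T x = s has probability p^s (1-p)^{n-s}, so the
% conditional pmf is a count of binary vectors hitting both constraints:
P(S = s \mid W = c)
  = \frac{\#\{x \in \{0,1\}^n : 1^T x = s,\ w^T x = c\}\;
          p^s (1-p)^{n-s}}{P(W = c)} .

% On average, conditioning cannot increase variance, by the law of
% total variance (both terms on the right are nonnegative):
\operatorname{Var}(S)
  = \operatorname{E}\bigl[\operatorname{Var}(S \mid W)\bigr]
  + \operatorname{Var}\bigl(\operatorname{E}[S \mid W]\bigr)
  \;\ge\; \operatorname{E}\bigl[\operatorname{Var}(S \mid W)\bigr].
```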
  • The conditional probability distribution of the unweighted sum will depend on the actual weights and their relationships: there are weights for which the weighted sum fully determines the unweighted sum, giving $0$ conditional variance, for example when $n=2$ and the weights are unequal.
    – Henry
    Commented May 29, 2023 at 9:09
  • Variance of the LHS is just $np(1-p)$, because it is simply a binomial distribution. In the limiting case the two variances are equal. I do not see how you concluded that the RHS will be zero; I do not think so.
    – entropy
    Commented May 29, 2023 at 19:41
  • Suppose $n=2$ and the weights are $w_1=1$ and $w_2=10$: the possible weighted sums $w^Tx$ are $0$, $1$, $10$, or $11$, which force the unweighted sum to be $0$, $1$, $1$, or $2$ respectively, so the conditional variance is $0$ in every case.
    – Henry
    Commented May 29, 2023 at 20:29
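Henry's enumeration, and the averaged inequality $\operatorname{E}[Var(S \mid W)] \le Var(S)$, can be checked by brute force over all $2^n$ outcomes. A minimal sketch; the weights $(1,10)$ are from the comment above, while $n=3$, $p=0.3$, and the colliding weights $(1,1,2)$ are illustrative choices:

```python
from itertools import product

def cond_var_profile(n, p, w):
    """For S = 1^T x and W = w^T x over all 2^n Bernoulli(p) outcomes,
    return (Var(S), E[Var(S | W)])."""
    groups = {}
    for x in product([0, 1], repeat=n):
        prob = 1.0
        for xi in x:
            prob *= p if xi else 1 - p
        W = sum(wi * xi for wi, xi in zip(w, x))
        groups.setdefault(W, []).append((sum(x), prob))
    var_S = n * p * (1 - p)  # S is Binomial(n, p)
    e_cond_var = 0.0
    for pairs in groups.values():
        pW = sum(pr for _, pr in pairs)
        mean = sum(s * pr for s, pr in pairs) / pW
        # sum((s - mean)^2 * pr) equals P(W) * Var(S | W)
        e_cond_var += sum((s - mean) ** 2 * pr for s, pr in pairs)
    return var_S, e_cond_var

# Henry's example: W determines S exactly, so E[Var(S | W)] is 0.
print(cond_var_profile(2, 0.3, [1, 10]))

# Colliding weights (illustrative): W = 2 is reached by both x = (1,1,0)
# and x = (0,0,1), so the conditional variance is strictly positive,
# but E[Var(S | W)] <= Var(S) still holds.
print(cond_var_profile(3, 0.3, [1, 1, 2]))
```

When $W$ determines $S$ (distinct power-like weights, or equal weights) the conditional variance collapses to $0$; only weights that send different values of $S$ to the same $W$ leave residual variance.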
