
This just summarizes my comments, which led to the answer that fennel gives (also in the comment section). It generalizes to the case when the random variables $(X_1, ..., X_n)$ are exchangeable, meaning the joint distribution is invariant under permutations of the ordering; that is, for all permutations $(\sigma(1), ..., \sigma(n))$ of $\{1, ..., n\}$ and all $(x_1, \ldots, x_n)\in\mathbb{R}^n$ we have $$P[X_1\leq x_1, ..., X_n\leq x_n]=P[X_{\sigma(1)}\leq x_1, ..., X_{\sigma(n)}\leq x_n] $$ Note that i.i.d. implies exchangeable (so exchangeable is more general). Exchangeability implies all $X_i$ have the same distribution as $X_1$. In the following, the distribution of $X_1$ is arbitrary and does not need to have finite moments.


Fix $n$ as a positive integer. Write $X=(X_1, ..., X_n)$ and let $\overline{X}=\frac{1}{n}\sum_{i=1}^n X_i$ be the sample mean. For $i \in \{1, ..., n\}$ define $$ Y_i = \left\{\begin{array}{cc} \frac{X_i-\overline{X}}{||X-e_n\overline{X}||_2} & \mbox{if $X-e_n\overline{X}\neq 0$} \\ 0 & \mbox{else} \end{array}\right.$$ where $e_n=(1, 1, ..., 1)$ is the all-1 vector in $\mathbb{R}^n$. Define event $A$ by $$ A = \{X - e_n\overline{X}\neq 0\}$$

Observe that $\{Y_i\}_{i=1}^n$ are random variables that surely satisfy
$$-1 \leq Y_i\leq 1 \quad \forall i \in \{1, ..., n\}$$ $$ \sum_{i=1}^n Y_i=0 \quad, \quad \sum_{i=1}^n Y_i^2=1_A$$ Boundedness of the $Y_i$ variables ensures the first and second moments are finite. The exchangeable property of $(X_1, ..., X_n)$ means that each $Y_i$ has the same probability distribution and so $$E[Y_i]=E[Y_1], \quad E[Y_i^2]=E[Y_1^2] \quad \forall i\in\{1, ..., n\}$$ Therefore, taking expectations of the two sum identities and using $E[1_A]=P[A]$, $$ \sum_{i=1}^nY_i=0 \implies nE[Y_1]=0 \implies E[Y_1]=0$$ $$\sum_{i=1}^nY_i^2 = 1_A \implies nE[Y_1^2]=P[A]\implies E[Y_1^2]=P[A]/n$$ and since $Var(Y_i)=E[Y_i^2]-0^2 = E[Y_1^2]$ we have $$\boxed{Var(Y_i) = P[A]/n \quad \forall i \in \{1, ..., n\}}$$
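The boxed identity is easy to check numerically. Here is a minimal Monte Carlo sketch, with the choice of standard normal $X_i$, $n=5$, and the trial count as illustrative assumptions; since the normal CDF is continuous, $P[A]=1$ and the formula predicts $Var(Y_i)=1/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000                       # illustrative choices

X = rng.standard_normal((trials, n))         # i.i.d. with continuous CDF
centered = X - X.mean(axis=1, keepdims=True) # X - e_n * Xbar, row by row
norm = np.linalg.norm(centered, axis=1, keepdims=True)

# Event A = {X - e_n*Xbar != 0}; with a continuous CDF it occurs a.s.
A = norm.squeeze() > 0
# Y_i = centered/||centered|| on A, and 0 on A^c (safe denominator off A)
Y = np.where(A[:, None], centered / np.where(norm == 0, 1.0, norm), 0.0)

print(Y[:, 0].mean())  # ≈ 0, matching E[Y_1] = 0
print(Y[:, 0].var())   # ≈ 1/n = 0.2, matching Var(Y_i) = P[A]/n
```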


Observe that $$ A^c = \{X_1=X_2=\cdots=X_n\}$$ In the special case when $n\geq 2$ and the $\{X_i\}$ are i.i.d. with a continuous CDF, it can be shown that $P[A]=1$ (any two of the $X_i$ coincide with probability zero), so $Var(Y_i)=1/n$ for all $i \in \{1, ..., n\}$.
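When the CDF is not continuous, $P[A]$ can be strictly less than $1$ and the general formula still holds. A sketch under the illustrative assumption of i.i.d. Bernoulli(1/2) variables with $n=3$: here $P[A^c]=P[X_1=X_2=X_3]=2\cdot(1/2)^3=1/4$, so the formula predicts $Var(Y_i)=P[A]/n=0.75/3=0.25$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 3, 400_000                       # illustrative choices

X = rng.integers(0, 2, size=(trials, n)).astype(float)  # Bernoulli(1/2)
centered = X - X.mean(axis=1, keepdims=True)
norm = np.linalg.norm(centered, axis=1, keepdims=True)

A = norm.squeeze() > 0                       # A^c = {all X_i equal}
Y = np.where(A[:, None], centered / np.where(norm == 0, 1.0, norm), 0.0)

print(A.mean())       # ≈ P[A] = 0.75
print(Y[:, 0].var())  # ≈ P[A]/n = 0.25
```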
