
Let $X_1,\dots,X_n$ be $n$ random variables on the same probability space $(\Omega, \mathcal{F}, P)$, all with expectation $0$. Define $Y_i = X_i - \mathbb{E}[X_i \mid X_1, \dots, X_{i-1}]$ (so in particular $Y_1 = X_1 - \mathbb{E}[X_1] = X_1$). Is it true that for any $X = a_1 X_1 + \cdots + a_n X_n$, there exist $b_1,\dots,b_n$ such that $X = b_1 Y_1 + \cdots + b_n Y_n$, and vice versa?

If it is true, is there a textbook reference that I can cite? If it is not true, is there something similar that is true? I feel that $X_i$ and $Y_i$ must have some strong connections.

EDIT: The above statement is false. For example, let $X_2 = X_1^3$ (with $X_1$ symmetric, so that $X_2$ also has mean $0$); then $Y_1 = X_1$ and $Y_2 = X_1^3 - \mathbb{E}[X_1^3 \mid X_1] = 0$, yet $X_2 = X_1^3$ is in general not a linear combination of $Y_1$ and $Y_2$. In terms of the relationship between $(X_i)$ and $(Y_i)$, I only know that they generate the same $\sigma$-algebra. If someone can point out some other relationships, for example from an information-theoretic perspective, that would be really helpful.
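To make the counterexample concrete, here is a quick numerical sketch (my own illustration, not from the thread), taking $X_1$ standard normal so that $X_2 = X_1^3$ is also mean zero. The best linear approximation of $X_2$ by $Y_1 = X_1$ has coefficient $\mathbb{E}[X_1^4]/\mathbb{E}[X_1^2] = 3$, and the residual $X_1^3 - 3X_1$ is a nonzero random variable, so $X_2$ lies outside the linear span of $(Y_1, Y_2)$:

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.standard_normal(100_000)   # X1 ~ N(0,1), so E[X1] = 0
x2 = x1 ** 3                        # X2 = X1^3, also mean zero by symmetry

# Y1 = X1 - E[X1] = X1;  Y2 = X2 - E[X2 | X1] = X1^3 - X1^3 = 0
y1 = x1
y2 = np.zeros_like(x1)

# The best linear fit of X2 on (Y1, Y2) reduces to regressing X1^3 on X1.
b1 = np.dot(x2, y1) / np.dot(y1, y1)    # sample analogue of E[X1^4]/E[X1^2]
residual = x2 - b1 * y1                 # ≈ X1^3 - 3*X1, not the zero variable

print(abs(b1 - 3.0) < 0.1)                     # True: coefficient is ≈ 3
print(np.sqrt(np.mean(residual ** 2)) > 1.0)   # True: residual has nonzero L2 norm
```

So no choice of $b_1, b_2$ recovers $X_2$ exactly, even though $(X_1, X_2)$ and $(Y_1, Y_2)$ generate the same $\sigma$-algebra.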

  • Do not make open-ended questions.
    – William M.
    Commented Mar 14 at 18:21

1 Answer


Take $X_1=\cdots=X_n\neq0$. Then $Y_i=0$. So if $a_1+\cdots+a_n\neq0$ then $X=(a_1+\cdots+a_n)X_1\neq0=b_1Y_1+\cdots+b_nY_n$.

EDIT: as was pointed out in the comments, this is not a counterexample, since we would have $Y_1 = X_1$. I leave it here for the moment until we get a proper answer...

  • I think you mean to take $a_1+\dots+a_n \ne 0$.
    – Alex Ortiz
    Commented Mar 13 at 19:27
  • But in this case $Y_1 = X_1$.
    – ryanstar
    Commented Mar 13 at 19:40
  • @AlexOrtiz yes, I meant $\neq 0$, thank you.
    – Will
    Commented Mar 13 at 19:50
  • @ryanstar ah yes, indeed.
    – Will
    Commented Mar 13 at 19:51
