Let $X_1,\dots,X_n$ be $n$ random variables on the same probability space $(\Omega, \mathcal{F}, P)$, each with expectation $0$. Define $Y_i = X_i - \mathbb{E}[X_i \mid X_1, \dots, X_{i-1}]$. Is it true that $(X_i)_{i=1}^n$ and $(Y_i)_{i=1}^n$ have the same linear span, i.e., that for any $X = a_1 X_1 + \cdots + a_n X_n$ there exist $b_1,\dots,b_n$ such that $X = b_1 Y_1 + \cdots + b_n Y_n$, and vice versa?
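For concreteness, the first few terms of the definition read (the conditioning is vacuous for $i = 1$, so the zero-mean assumption gives $Y_1 = X_1$):

$$Y_1 = X_1 - \mathbb{E}[X_1] = X_1, \qquad Y_2 = X_2 - \mathbb{E}[X_2 \mid X_1], \qquad Y_3 = X_3 - \mathbb{E}[X_3 \mid X_1, X_2].$$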
If it is true, is there a textbook reference that I can cite? If it is not true, is there a similar statement that does hold? I feel that the $X_i$ and the $Y_i$ must have some strong connection.
EDIT: The above statement is false. For example, let $X_2 = X_1^3$ with $X_1$ non-degenerate; then $Y_1 = X_1$ and $Y_2 = 0$, so $X_2$ does not lie in the linear span of $Y_1, Y_2$ (see the computation below). As for the relationship between $(X_i)$ and $(Y_i)$, I only know that they generate the same $\sigma$-algebra. If someone can point out some other relationship, for example from an information-theoretic perspective, that would be really helpful.
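Spelling out the counterexample, with the concrete (assumed, for illustration) choice $X_1 \sim N(0,1)$ and $X_2 = X_1^3$ (note $\mathbb{E}[X_1^3] = 0$, so both variables are centered):

$$Y_1 = X_1 - \mathbb{E}[X_1] = X_1, \qquad Y_2 = X_2 - \mathbb{E}[X_1^3 \mid X_1] = X_1^3 - X_1^3 = 0.$$

Hence every combination $b_1 Y_1 + b_2 Y_2$ equals $b_1 X_1$, while $X_2 = X_1^3$ agrees with $b_1 X_1$ only on the event $\{X_1 (X_1^2 - b_1) = 0\}$, i.e., on at most three values of $X_1$, which has probability $0$ for a standard normal $X_1$. So no choice of $b_1, b_2$ represents $X = X_2$.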