$\begingroup$

This question is purely out of curiosity and mainly to question my intuitions about independence of random variables.

Q: Take two nontrivial random variables with non-disjoint support (see edit below), $X,Y \in \mathbb{L}_2(\Omega, \mathcal{F}, \mathbb{P})$, so that projections and covariance formulas are well defined. If $X$ is uncorrelated with every measurable function $g$ of $Y$, i.e. $\operatorname{Corr}(X,g(Y)) = 0$ for all measurable $g$, then $X$ and $Y$ are independent.

Is the above statement true? I could not find any post on Math Stack Exchange about this.

One way I tried to prove the above is by proving the following:

Assume that $X$ and $Y$ are dependent, then there exists a function $f$ such that the correlation between $X$ and $f(Y)$ is nonzero.

Reason why $\mathbb{L}_2$ is important:

This is also why we have to take the random variables in $\mathbb{L}_2$: otherwise one could find counterexamples to the second statement by taking $X$ with undefined expectation or variance, in which case the covariance is not well defined and in particular can never be nonzero.

Thoughts:

Any ideas or references? Maybe something additional must be assumed about the functions $g$? Or maybe, instead of this, one should assume that the correlation is zero with any $\sigma(X)$-measurable random variable $Z$?

Thank you very much for your help and time.

Reason why non-disjoint support is important:

EDIT. Here I post a counterexample that contradicts the second statement if we do not assume that the random variables have non-disjoint support, i.e.:

Assume that $X$ and $Y$ are dependent, then there exists a function $f$ such that the correlation between $X$ and $f(Y)$ is nonzero.

Take $([0,1], \mathcal{B}([0,1]), \lambda)$, where $\lambda$ is the Lebesgue measure. The key idea is that if they have disjoint support we can find a counterexample. Take: $$ X(x) = \left(x - \frac{1}{4} \right) \mathbb{1}_{[0,1/2]}(x)$$ (centered so that $\int X \, d\lambda = \int_0^{1/2} \left(x - \frac{1}{4}\right) dx = 0$) and: $$ Y(x) = \left(x - \frac{3}{2} \right) \mathbb{1}_{[1/2,1]}(x)$$ Take any function $f$. Then $f(Y(x)) = f(0)$ is constant for $x \in [0,1/2)$, and $X(x) = 0$ on $(1/2,1]$, so $\lambda$-a.e.: $$ X(x)f(Y(x)) = f(0)X(x) \mathbb{1}_{[0,1/2]}(x)$$ which, as $\int X \, d\lambda = 0$, implies: $$\operatorname{Cov}(X, f(Y)) = \int X f(Y) \, d\lambda - \left(\int X \, d\lambda\right) \int f(Y) \, d\lambda = f(0) \int X \, d\lambda - 0 = 0$$ for any $f$. So $X$ and $f(Y)$ are uncorrelated for every $f$, even though $X$ and $Y$ are dependent, and this provides a counterexample. (Note that the centering of $X$ matters: with $X(x) = \left(x - \frac{1}{2}\right)\mathbb{1}_{[0,1/2]}(x)$ one has $\int X \, d\lambda = -\frac{1}{8} \neq 0$, and the covariance with $f(Y)$ need not vanish.)
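The computation above can be checked numerically. A minimal sketch (it uses the centered variable $X(x) = (x - \tfrac{1}{4})\mathbb{1}_{[0,1/2]}(x)$, for which $\int X \, d\lambda = 0$, and a few arbitrary test functions $f$):

```python
import numpy as np

# Numerical sanity check of the disjoint-support counterexample on
# ([0,1], Lebesgue). X is the centered version X(x) = (x - 1/4)*1_[0,1/2](x),
# whose integral over [0,1] is 0; Y(x) = (x - 3/2)*1_[1/2,1](x).
x = np.linspace(0.0, 1.0, 1_000_001)
X = (x - 0.25) * (x <= 0.5)
Y = (x - 1.5) * (x >= 0.5)

covs = []
for f in (np.sin, np.exp, np.abs, lambda t: t**3):
    fY = f(Y)
    # Cov(X, f(Y)) = ∫ X f(Y) dλ − (∫ X dλ)(∫ f(Y) dλ), via trapezoid rule
    cov = np.trapz(X * fY, x) - np.trapz(X, x) * np.trapz(fY, x)
    covs.append(cov)

print(covs)  # all numerically close to 0
```

The assertion is only up to discretization error; analytically every covariance is exactly zero.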

$\endgroup$
  • $\begingroup$ independence is $E[f(X)g(Y)] = E[f(X)]E[g(Y)]$ for all $f,g$ so you're doing just all $g$ and identity $f$ but as a substitute for all $f$ you're doing non-disjoint support and $\mathbb L_2$ ? $\endgroup$
    – BCLC
    Commented Dec 10, 2022 at 19:42
  • $\begingroup$ I guess your condition can be restated as saying that $E[X \mid Y]$ is constant. $\endgroup$ Commented Dec 10, 2022 at 19:44

1 Answer

$\begingroup$

If $Y$ takes only two values (for instance, if $Y$ is a Bernoulli random variable), then whenever $X,Y$ are uncorrelated, we also have $X,g(Y)$ uncorrelated for every $g$. Indeed, when $Y$ takes only two values $y_1 \neq y_2$, we can replace $g$ with an affine function: find constants $a,b$ with $g(y_1) = a y_1 + b$ and $g(y_2) = a y_2 + b$, so that $g(Y) = aY+b$ almost surely. Then $\operatorname{Cov}(X,g(Y)) = a \operatorname{Cov}(X,Y)=0$.

For an explicit example, let $Y$ be $\mathrm{Bernoulli}(p)$ for any $p \in (0,1)$, let $\xi$ be $\mathrm{Rademacher}(1/2)$ independent of $Y$, and set $X=\xi Y$. Then $X$ and $Y$ are dependent (for instance, $X^2 = Y$), yet $\operatorname{Cov}(X,Y) = \mathbb{E}[\xi]\,\mathbb{E}[Y^2] = 0$.
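A quick simulation of this example (a sketch; the value $p = 0.3$ and the test functions are arbitrary choices): every sample correlation of $X$ with $g(Y)$ is near zero, yet $X^2 = Y$ pointwise, so $X$ and $Y$ are dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
p = 0.3                                # any p in (0,1) works
Y = rng.binomial(1, p, size=n)         # Bernoulli(p)
xi = rng.choice([-1.0, 1.0], size=n)   # Rademacher(1/2), independent of Y
X = xi * Y

# X is uncorrelated with g(Y) for every g (since g(Y) = a*Y + b a.s.) ...
corrs = [np.corrcoef(X, g(Y))[0, 1] for g in (np.exp, lambda t: (t - 2.0) ** 2)]

# ... yet X and Y are far from independent: X**2 equals Y exactly.
dep = np.corrcoef(X ** 2, Y)[0, 1]
print(corrs, dep)
```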

If you assume instead that $g(Y)$ and $Z$ are uncorrelated for every $Z \in \mathbb{L}_2$ that is $\sigma(X)$-measurable, or equivalently that $f(X)$ and $g(Y)$ are uncorrelated for all measurable functions $f,g$, then $X,Y$ are independent. For if not, there exist $A_1 \in \sigma(X)$ and $A_2 \in \sigma(Y)$ which are not independent, which is to say that $\mathbb{1}_{A_1}$ and $\mathbb{1}_{A_2}$ are correlated. But by definition of $\sigma(X)$, there exists a Borel set $B_1$ with $\mathbb{1}_{A_1} = \mathbb{1}_{B_1}(X)$, and likewise a $B_2$ with $\mathbb{1}_{A_2} = \mathbb{1}_{B_2}(Y)$. So taking $f = \mathbb{1}_{B_1}$ and $g = \mathbb{1}_{B_2}$, the pair $f(X), g(Y)$ is correlated.
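The indicator argument can be illustrated on the Bernoulli/Rademacher pair $X = \xi Y$ (a sketch; the value $p = 0.3$ and the Borel sets $B_1 = B_2 = \{1\}$ are my choices, not part of the answer): the indicators $\mathbb{1}_{B_1}(X)$ and $\mathbb{1}_{B_2}(Y)$ exhibit the dependence that no single $g(Y)$ alone could.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
p = 0.3
Y = rng.binomial(1, p, size=n)
xi = rng.choice([-1.0, 1.0], size=n)
X = xi * Y   # dependent on Y, yet uncorrelated with every g(Y)

# Indicators of the Borel sets B1 = B2 = {1}: f = 1_{B1}, g = 1_{B2}.
fX = (X == 1).astype(float)
gY = (Y == 1).astype(float)
c = np.corrcoef(fX, gY)[0, 1]
# Theory: Cov(f(X), g(Y)) = p/2 - (p/2)*p = (p/2)(1-p) > 0,
# so the sample correlation should be clearly nonzero.
print(c)
```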

$\endgroup$
  • $\begingroup$ Great answer! Thank you! I agree with your proofs. Do you by any chance know if some of the above statements hold if we assume that $X$ and $Y$ are absolutely continuous wrt the Lebesgue measure? I believe this will still not improve the situation. $\endgroup$ Commented Dec 10, 2022 at 20:21
  • $\begingroup$ @GrandesJorasses: I don't think so. For instance, $Y \sim U(0,1)$, $Z \sim U(-1,1)$ independent of $Y$, $X=YZ$. $\endgroup$ Commented Dec 10, 2022 at 20:26
  • $\begingroup$ Thank you a lot. This really helps. I find the concept of independence of random variables fascinating, because it is mathematically tricky even though heuristically clear. $\endgroup$ Commented Dec 10, 2022 at 21:00
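The absolutely continuous example from the comment above ($Y \sim U(0,1)$, $Z \sim U(-1,1)$ independent, $X = YZ$) can also be checked by simulation (a sketch; the test functions and the choice $f(X) = |X|$ are arbitrary): $X$ is uncorrelated with every $g(Y)$ since $\mathbb{E}[X g(Y)] = \mathbb{E}[Z]\,\mathbb{E}[Y g(Y)] = 0$, but $|X|$ is visibly correlated with $Y$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
Y = rng.uniform(0.0, 1.0, size=n)
Z = rng.uniform(-1.0, 1.0, size=n)   # independent of Y
X = Y * Z

# X is uncorrelated with g(Y): E[X g(Y)] = E[Z] E[Y g(Y)] = 0 = E[X] E[g(Y)].
corrs = [np.corrcoef(X, g(Y))[0, 1] for g in (np.exp, np.sqrt, lambda t: t**3)]

# But f(X) = |X| is correlated with Y, exposing the dependence:
# Cov(|X|, Y) = E[|Z|] Var(Y) = (1/2)(1/12) > 0.
dep = np.corrcoef(np.abs(X), Y)[0, 1]
print(corrs, dep)
```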

