
Let ${X_1,X_2,\dots}$ be a sequence of scalar random variables converging in probability to another random variable ${X}$. Suppose that there is a random variable ${Y}$ which is independent of ${X_i}$ for each individual ${i}$. Show that ${Y}$ is also independent of ${X}$.

By splitting into real and imaginary parts, we may assume without loss of generality that the $X_i$ and $X$ are real-valued. As

$$ {\bf P}(X \leq t \wedge Y \in S) = {\bf P}(|X-X_i|>\varepsilon \wedge X \leq t \wedge Y \in S) + {\bf P}(|X-X_i| \leq \varepsilon \wedge X \leq t \wedge Y \in S) $$

and the first term on the RHS tends to zero for large $i$, my crude first impression was to approximate the second term on the RHS by ${\bf P}(X_i \leq t \wedge Y \in S) = {\bf P}(X_i \leq t){\bf P}(Y \in S)$, and then the claim follows if one can show that ${\bf P}(X_i \leq t)$ approximates ${\bf P}(X \leq t)$.
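For that last step, the bound I would try (only a sketch, and it only yields ${\bf P}(X_i \leq t)\to{\bf P}(X \leq t)$ at points $t$ where the distribution function of $X$ is continuous) is $$ {\bf P}(X \leq t-\varepsilon) - {\bf P}(|X-X_i|>\varepsilon) \;\leq\; {\bf P}(X_i \leq t) \;\leq\; {\bf P}(X \leq t+\varepsilon) + {\bf P}(|X-X_i|>\varepsilon), $$ which follows from the inclusions $\{X \leq t-\varepsilon\}\subseteq\{X_i \leq t\}\cup\{|X-X_i|>\varepsilon\}$ and $\{X_i \leq t\}\subseteq\{X \leq t+\varepsilon\}\cup\{|X-X_i|>\varepsilon\}$.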

Yet I struggle a bit to put all of this on a rigorous footing; any help, or even a completely different approach, would be appreciated.


2 Answers


To establish the desired independence, it suffices to show that $$ E[f(X)g(Y)] = E[f(X)]\cdot E[g(Y)] $$ for all bounded continuous functions $f$ and $g$ mapping $\Bbb R$ to itself. Because $(X_n)$ converges in probability to $X$, there is a subsequence $1\le n(1)<n(2)<\cdots$ such that $\lim_k X_{n(k)}=X$ a.s. Then, by dominated convergence, $$ E[f(X)g(Y)]=\lim_k E[f(X_{n(k)})g(Y)]=\lim_k E[f(X_{n(k)})]\cdot E[g(Y)]=E[f(X)]\cdot E[g(Y)]. $$ The assumed independence was used for the second equality above.
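To make the dominated convergence step explicit (a brief note on what the dominating function is): since $f$ and $g$ are bounded, say $|f|\le M_f$ and $|g|\le M_g$ pointwise, we have $$ |f(X_{n(k)})\,g(Y)| \;\le\; M_f\,M_g \qquad \text{for all } k, $$ and a constant is integrable on a probability space; together with the a.s. convergence $f(X_{n(k)})g(Y)\to f(X)g(Y)$ along the subsequence (using continuity of $f$), this is exactly what the theorem requires.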

  • This is a nice proof, and I think we don't even have to pass to a subsequence converging a.s. for the argument to work, since dominated convergence in probability works just fine.
    – shark
    Commented Feb 6 at 17:52
  • Hi: I didn't contribute to any of this, but could one of you (John, Yu, or jwhite) define the statements that come with dominated convergence? The one I am reading says that $|X_{n}| \le Y$ for all $n$ and $E(Y) \lt \infty$. If that's the right one, then what is the "Y" that allows the use of dominated convergence? There seem to be different variants of the DCT which I wasn't aware of until now. Thanks.
    – mark leeds
    Commented Feb 7 at 7:23
  • You are correct; one can use the "convergence in probability" version of dominated convergence rather than the a.s. version.
    Commented Feb 7 at 19:06
  • @John Dawkins: Hi. In the above, I think you were referring to Yu Lynn's comment. But if you were referring to my question, I still don't understand what the $Y$ is. Maybe the "convergence in probability" version of the DCT does not have a $|X_{n}| \le Y$ condition? I'll look for it and see.
    – mark leeds
    Commented Feb 8 at 6:06
  • An easier way to state my question: what is dominating the sequence that allows you to use dominated convergence in your proof? I looked at different versions of the DCT, but they all have this domination condition, which makes sense given the name of the theorem. Thanks a lot for your help.
    – mark leeds
    Commented Feb 8 at 9:22

Since $X_i\overset{p}{\to}X$, there is a subsequence $(X_{i_n})_{n=1}^\infty$ of $(X_i)_{i=1}^\infty$ which converges almost surely to $X$. We can relabel to assume that the original sequence $(X_i)_{i=1}^\infty$ converges a.s. to $X$. Then for any $t\in\mathbb{R}$, on the almost-sure event where $X_n\to X$ we have $$(X\leqslant t)=\bigcap_{l=1}^\infty \bigcup_{m=1}^\infty \bigcap_{n=m}^\infty (X_n\leqslant t+1/l).$$ Since each $X_i$ is independent of $Y$, every member of the sigma algebra $\sigma(X_1,X_2,\ldots)$ generated by $X_1,X_2,\ldots$ is also independent of $Y$. We just showed that $(X\leqslant t)$ differs from an event of $\sigma(X_1,X_2,\ldots)$ by at most a null set, so it too is independent of $Y$.
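For completeness, here is a sketch of why the displayed identity holds at every $\omega$ with $X_n(\omega)\to X(\omega)$: if $X(\omega)\leqslant t$, then for each $l$ we eventually have $X_n(\omega) < X(\omega)+1/l \leqslant t+1/l$, so $\omega$ belongs to the right-hand side; conversely, if $\omega$ belongs to the right-hand side, then $\limsup_n X_n(\omega)\leqslant t+1/l$ for every $l$, and letting $l\to\infty$ gives $X(\omega)=\lim_n X_n(\omega)\leqslant t$.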


