
$$ \newcommand{\N}{\mathbb N} $$

I am paraphrasing this textbook question slightly.

Question:

Let $(X_n)_{n \in \N}$ and $(Y_n)_{n \in \N}$ be two sequences of real random variables, and let $X$ and $Y$ be two real random variables.

  1. Does $X_n \to X$ and $Y_n \to Y$ (both in distribution) imply $(X_n, Y_n) \to (X, Y)$ in distribution?
  2. Show that 1. holds when $Y$ is constant almost surely.
  3. Show that 1. holds when $X_n$ and $Y_n$ are independent, and $X$ and $Y$ are independent.

My attempt:

I did 3., which is fairly easy if you use characteristic functions. However, I am stuck on 1 and 2.
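
For concreteness, the computation I have in mind for 3. is roughly the following sketch: by the independence assumptions, the joint characteristic functions factor, so for all $(s,t)\in\mathbb R^2$,

$$\varphi_{(X_n,Y_n)}(s,t)=\mathbb E\left[e^{i(sX_n+tY_n)}\right]=\varphi_{X_n}(s)\,\varphi_{Y_n}(t)\longrightarrow\varphi_X(s)\,\varphi_Y(t)=\varphi_{(X,Y)}(s,t),$$

and Lévy's continuity theorem then gives $(X_n,Y_n)\to(X,Y)$ in distribution.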

For 1, I am pretty sure the answer is no, but I don't know what kind of counterexample to come up with. I figure I need to introduce some kind of dependence between $X_n$ and $Y_n$, and/or between $X$ and $Y$.

For 2, I think this can be done in some way via the portmanteau theorem. There is a related proof on Wikipedia (link), but it assumes $Y_n \to c$ in probability, not almost surely, and it seems kind of messy. (I realize assuming only convergence in probability makes it stronger, but it seems like it might not be what the author of this exercise had in mind.)

I appreciate any help.

  • Comment (Jan 20): The key thing to note is that convergence in distribution to a constant implies convergence in probability to the same constant. After that, you can use Slutsky's theorem as stated in your link. The answer by maxjw91 gives the explicit argument that you need.

2 Answers

Answer 1:
  1. Let $N$ be a standard Gaussian and set $X_n=Y_n=N$ for all $n$. Then $X_n$ converges in distribution to $N$, and since $N$ is symmetric, $Y_n$ converges in distribution to $-N$. If $(X_n,Y_n)$ converged in distribution to $(N,-N)$, then by continuity of $(x,y)\mapsto x+y$, $X_n+Y_n$ would converge in distribution to $N+(-N)=0$; but $X_n+Y_n=2N$ for every $n$, so $2N$ would be equal to $0$ in distribution, which is absurd. (A numerical illustration appears at the end of this answer.)

  2. By the portmanteau theorem, it suffices to prove that $\mathbb E[f(X_n,Y_n)]\rightarrow\mathbb E[f(X,Y)]$ for every bounded, uniformly continuous $f$. Let $y\in\mathbb R$ be such that $Y=y$ a.s. The c.d.f. of $Y$ is $\mathbb 1_{[y,\infty)}$, which is continuous on $\mathbb R\setminus\{y\}$, and convergence in law implies pointwise convergence of the c.d.f.s at every continuity point of the limiting c.d.f.; hence, for all $\varepsilon>0$, $$\mathbb P(|Y_n-y|>\varepsilon)\le F_{Y_n}(y-\varepsilon)+1-F_{Y_n}(y+\varepsilon)\longrightarrow 0+1-1=0,$$ so $Y_n\overset{\mathbb P}{\rightarrow}y$. By the triangle inequality,

$$|\mathbb E[f(X_n,Y_n)]-\mathbb E[f(X,y)]|\le|\mathbb E[f(X_n,Y_n)]-\mathbb E[f(X_n,y)]|+|\mathbb E[f(X_n,y)]-\mathbb E[f(X,y)]|.$$ The second term converges to $0$ because $X_n\to X$ in distribution and $x\mapsto f(x,y)$ is continuous and bounded. For the first term, let $\varepsilon>0$ and choose $\delta_\varepsilon>0$, a modulus of $\varepsilon$-uniform continuity of $f$ in the second variable, so that $|y'-y|\le\delta_\varepsilon$ implies $|f(x,y')-f(x,y)|\le\varepsilon$ for all $x$. This yields \begin{eqnarray*} |\mathbb E[f(X_n,Y_n)]-\mathbb E[f(X_n,y)]|&\le&\mathbb E[|f(X_n,Y_n)-f(X_n,y)|\mathbb 1_{\{|Y_n-y|\le\delta_\varepsilon\}}]+\mathbb E[|f(X_n,Y_n)-f(X_n,y)|\mathbb 1_{\{|Y_n-y|>\delta_\varepsilon\}}] \\ &\le&\varepsilon+2K\,\mathbb P(|Y_n-y|>\delta_\varepsilon), \end{eqnarray*} where $K$ is a bound for $|f|$. Letting $n\to\infty$ gives $\overline{\lim}_{n\rightarrow\infty}|\mathbb E[f(X_n,Y_n)]-\mathbb E[f(X_n,y)]|\le\varepsilon$ for every $\varepsilon>0$, so the first term also converges to $0$, which finishes the proof.
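
A quick numerical illustration of the counterexample in 1. might look as follows (a minimal sketch, assuming NumPy; `n_samples` and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# One draw of N per sample path; X_n = Y_n = N for every n.
N = rng.standard_normal(n_samples)
X_n, Y_n = N, N

# X_n + Y_n = 2N for every n, so its variance stays near 4 ...
print(np.var(X_n + Y_n))  # ~ 4.0

# ... whereas convergence of (X_n, Y_n) to (N, -N) would force
# X_n + Y_n toward N + (-N) = 0 by the continuous mapping theorem.
```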

Answer 2:
  1. Let $X\sim N(0,1)$ and set $X_n=X$, $Y_n=-X$. Then $X_n\to X$ and $Y_n\to X$ in distribution (since $-X\overset{d}{=}X$), but $(X_n,Y_n)=(X,-X)$ does not converge to $(X,X)$ in distribution: if it did, then $X_n+Y_n=X+(-X)=0$ would have the same distribution as $X+X=2X$, which is false.

  2. Here $Y_n$ tends to the constant $c$ in probability, since convergence in distribution to a constant implies convergence in probability. By passing to a subsequence we can reduce the proof to the case where $Y_n\to c$ a.s. (a.s. convergence is not essential: we could instead use the form of the DCT that only requires convergence in measure in the argument that follows). Now consider $\mathbb Ee^{itX_n+is(Y_n-c)}$ and write it as $\mathbb E[e^{itX_n+is(Y_n-c)}-e^{itX_n}]+\mathbb Ee^{itX_n}$. Note that $\mathbb E|e^{itX_n+is(Y_n-c)}-e^{itX_n}|=\mathbb E|e^{is(Y_n-c)}-1|\to0$ by the DCT, so $\mathbb Ee^{itX_n+is(Y_n-c)}\to\mathbb Ee^{itX}$. Multiplying both sides by $e^{isc}$ gives $\mathbb Ee^{itX_n+isY_n}\to e^{isc}\,\mathbb Ee^{itX}$, the characteristic function of $(X,c)$, which finishes the proof.
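
For 2., here is a quick empirical sanity check (a minimal sketch, assuming NumPy and SciPy; the particular instance $X_n\sim N(0,1+1/n)$, $Y_n=c+Z/n$ is an illustrative choice, not part of the answer):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
m, n, c = 200_000, 50, 2.0

# One illustrative instance: X_n -> X ~ N(0,1) in distribution
# and Y_n -> c in probability (hence in distribution).
X_n = np.sqrt(1 + 1 / n) * rng.standard_normal(m)
Y_n = c + rng.standard_normal(m) / n

# The joint c.d.f. of (X_n, Y_n) at (a, b) with b > c should be close
# to P(X <= a) * 1_{b >= c} = Phi(a), the c.d.f. of the limit (X, c).
a, b = 0.5, c + 0.5
empirical = np.mean((X_n <= a) & (Y_n <= b))
print(empirical, norm.cdf(a))  # both ~ 0.69
```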
