
I observe a sequence of random variables $X_1, X_2, \dots$, where each $X_i$ is a function of the sample size $n$.

As $n \rightarrow \infty$ I have the following result: $X_1 \rightarrow^d E_1, X_2 \rightarrow^d E_2, \dots$, where the $E_i$ are i.i.d. random variables with mean $\mu$, variance $\sigma^2 < \infty$, and common density $f_E$. Moreover,

  • $$\lim_{n \rightarrow \infty}E(X_i) = \mu$$
  • $$\lim_{n \rightarrow \infty}V(X_i) = \sigma^2$$
  • $$\textrm{Cov}(X_i, X_j) = O(n^{-1})$$

Denote $\bar X_n = n^{-1}\sum_{i = 1}^n X_i$. Can I claim that (a Lindeberg–Lévy-type CLT)

$$ \sqrt{n}\left(\bar{X}_{n}-\mu\right) \stackrel{d}{\rightarrow} \mathcal{N}\left(0, \sigma^{2}\right) ? $$

In other words: if the dependence between the elements of the sum fades only asymptotically, does the CLT still hold?

  • "$X_1$ converges in distribution to $E_1$" doesn't make sense, as $X_1$ isn't a sequence; it's a single random variable. The same goes for $X_2$, $X_3$, etc. The question doesn't make sense as it stands.
    – Paul
    Commented Aug 11, 2022 at 19:20
  • @Paul Strictly speaking you are correct, but let's help the new contributor: they obviously imply that each $X_i$ is a function of $n$, hence the convergence result. Standard example: the $X$'s are in reality residuals from a regression with i.i.d. errors. Then each residual converges to the corresponding true error, and the question acquires meaning and substance: if the dependence between the elements of the sum vanishes only asymptotically, does the CLT still hold? Under perhaps additional conditions?
    – Alecos Papadopoulos
    Commented Aug 13, 2022 at 13:58
  • @AlecosPapadopoulos thank you! It is indeed my question.
    – Eryna
    Commented Aug 13, 2022 at 14:15
  • I don't find the missing information obvious. The question cannot accept answers until it is edited to make sense. Now it seems to be OK, so I voted to reopen.
    – Paul
    Commented Aug 13, 2022 at 14:20
  • Now that the question makes sense, the problem here is that convergence in distribution is too weak and provides no insurance against correlations between the $X_i$. The condition that the $E_i$ are independent does not help. In fact, the $X_i$ could all be the exact same variable and still converge in distribution to "iid $E_i$". You will need a stronger form of convergence to get the $X_i$ to become "more independent" as $n$ increases.
    – Paul
    Commented Aug 13, 2022 at 14:27

1 Answer


The problem here is that convergence in distribution is too weak and provides no insurance against correlations between the $X_i$. The condition that the $E_i$ are independent does not help. In fact, the $X_i$ could all be the exact same variable and still converge in distribution to the iid $E_i$. You will need a stronger form of convergence to make the $X_i$ less dependent as $n$ increases: convergence in probability at a minimum, or perhaps something stronger.
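To spell out that "exact same variable" remark with a minimal example (my own illustration): take $X_i = X_1$ for every $i$ and every $n$, where $X_1$ has density $f_E$, mean $\mu$, and variance $\sigma^2$. Each $X_i$ then trivially converges in distribution to a variable with density $f_E$, so the marginal convergence assumption holds, yet $\bar X_n = X_1$ and

$$\sqrt{n}\left(\bar X_n - \mu\right) = \sqrt{n}\left(X_1 - \mu\right),$$

which diverges whenever $X_1 \neq \mu$, so no CLT is possible. (This particular example does violate the covariance condition, since here $\textrm{Cov}(X_i, X_j) = \sigma^2$ rather than $O(n^{-1})$.)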

The $O(n^{-1})$ decay of the covariances unfortunately doesn't help either, as the central limit theorem does not hold for merely uncorrelated random variables.
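A small simulation (my own sketch, not from the original discussion) makes this concrete: the $X_i$ below are *exactly* uncorrelated, which is even stronger than the $O(n^{-1})$ bound, but they share a common random scale $Z$, and $\sqrt{n}\,\bar X_n$ converges to a scale mixture of normals rather than a single normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Common-factor construction: X_i = Z * eps_i, where Z in {1, 3} is one
# draw shared by ALL i in a realisation, and the eps_i are iid Rademacher
# signs independent of Z. Pairwise covariances vanish exactly,
#   Cov(X_i, X_j) = E[Z^2] E[eps_i] E[eps_j] = 0   (i != j),
# yet the X_i are strongly dependent through Z. Here mu = 0.
n, reps = 500, 40_000
Z = rng.choice([1.0, 3.0], size=(reps, 1))       # E[Z^2] = 5
eps = rng.choice([-1.0, 1.0], size=(reps, n))
stat = np.sqrt(n) * (Z * eps).mean(axis=1)       # sqrt(n) * (Xbar_n - mu)

# If a CLT held, stat would be approximately N(0, 5), with kurtosis 3.
# Instead its limit is the mixture Z * N(0, 1), whose kurtosis is
# 3 * E[Z^4] / E[Z^2]^2 = 3 * 41 / 25 = 4.92.
var = stat.var()
kurt = ((stat - stat.mean()) ** 4).mean() / stat.var() ** 2
print(f"variance ~ {var:.2f}, kurtosis ~ {kurt:.2f}")
```

The variance comes out near $5$ as the CLT would predict, but the kurtosis stays well above $3$, so the limit is not normal: zero correlation alone cannot rescue the result.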

For what it's worth, some central limit theorems do hold under different assumptions than Lindeberg–Lévy, such as the martingale CLT.
