
Suppose $X_1,\ldots,X_n,Y_1,\ldots,Y_n$ are i.i.d $\mathcal N(0,1)$ random variables.

I am interested in the distribution of $$U=\frac{\sum_{i=1}^n X_iY_i}{\sum_{i=1}^n X_i^2}$$

I define $$Z=\frac{\sum_{i=1}^n X_iY_i}{\sqrt{\sum_{i=1}^n X_i^2}}$$

Then, $$Z\mid (X_1=x_1,\ldots,X_n=x_n)=\frac{\sum_{i=1}^n x_iY_i}{\sqrt{\sum_{i=1}^n x_i^2}}\sim \mathcal N(0,1)$$

As this conditional distribution does not depend on $x_1,\ldots,x_n$, the unconditional distribution is the same. That is, I can say that $$Z\sim \mathcal N(0,1)$$

Relating $U$ and $Z$, I have $$U=\frac{Z}{\sqrt{\sum_{i=1}^n X_i^2}}$$

Now, since the conditional distribution of $Z$ given $(X_1,\ldots,X_n)=(x_1,\ldots,x_n)$ is the same $\mathcal N(0,1)$ law for every $(x_1,\ldots,x_n)$, I can say that $Z$ is independent of $(X_1,\ldots,X_n)$.

So I have

$$U=\frac{1}{\sqrt n}\,\frac{Z}{\sqrt{\frac{\sum_{i=1}^n X_i^2}{n}}}=\frac{T}{\sqrt n},$$ where $T$ follows a $t$ distribution with $n$ degrees of freedom, since $Z\sim\mathcal N(0,1)$ is independent of $\sum_{i=1}^n X_i^2\sim\chi^2_n$.
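As a quick sanity check (a minimal Monte Carlo sketch, assuming NumPy and SciPy are available; it is not part of the argument), I can compare $\sqrt n\,U$ against the $t_n$ distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# Draw reps independent copies of (X_1,...,X_n) and (Y_1,...,Y_n).
X = rng.standard_normal((reps, n))
Y = rng.standard_normal((reps, n))

# U = sum_i X_i Y_i / sum_i X_i^2 for each replication.
U = (X * Y).sum(axis=1) / (X ** 2).sum(axis=1)

# If the argument is correct, sqrt(n) * U should follow a t_n distribution.
print(stats.kstest(np.sqrt(n) * U, stats.t(df=n).cdf))
# A large p-value is consistent with sqrt(n) * U ~ t_n.
```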

I think conditioning is the easiest way to see the result here. But is this a perfectly rigorous argument, and is there any direct/alternative way of finding the distributions of such functions of linear combinations of i.i.d Normal variables?

  • I think your approach looks very reasonable. I haven't worked it out myself, but an alternative might be to start from the chi-square distribution in the denominator.
    – TYZ
    Commented Jan 15, 2019 at 16:13
  • This looks just like the least squares estimator in a regression without a constant. Under the assumptions given, and conditionally on the $X$'s, it is known to be normally distributed with mean $\beta$ (which is zero here, since the $Y$ are independent of the $X$) and variance-covariance matrix $\sigma^2(X'X)^{-1}$. You'd arrive at a $t$-distribution if you also needed to estimate $\sigma^2$. Commented Feb 7, 2019 at 9:41
  • Re your independence argument: Let $(X,Y)$ be standard bivariate Normal. Define $Z=|X|$ when $Y\ge 0$ and otherwise $Z=-|X|.$ You can easily check that the marginal distribution and conditional distributions of $Z$ (given $Y$) are standard Normal, but certainly $Z$ is not independent of $Y$! Equivalence in distribution is usually too weak to imply independence.
    – whuber
    Commented Mar 14, 2019 at 18:41
  • @whuber the conditional distribution of $Z$ given $Y$ cannot be a standard normal: given $Y$, $Z$ can only take the sign of $Y$.
    – a.arfe
    Commented May 19, 2019 at 5:16
  • In the special case of $n=1$ we have $X_1Y_1/X_1^2 = Y_1 / X_1$, and the ratio of two independent standard normals is Cauchy distributed.
    – jochen
    Commented Jun 21, 2019 at 21:44

1 Answer


Although this is a conditioning argument as well, using the characteristic function is faster:
\begin{align*}
\mathbb E\left[\exp\left\{ \iota t\sum_i Y_i X_i\Big/{\sum_j X_j^2}\right\}\right]
&= \mathbb E\left[\mathbb E\left[\left.\exp\left\{\iota t\sum_i Y_i X_i\Big/{\sum_j X_j^2}\right\}\,\right|\,\mathbf X \right]\right]\\
&=\mathbb E\left[\mathbb E\left[\left.\prod_i \exp\left\{\iota t\, Y_i X_i\Big/{\sum_j X_j^2}\right\}\,\right|\,\mathbf X \right]\right]\\
&=\mathbb E\left[\prod_i\mathbb E\left[\left. \exp\left\{ \iota t\, Y_i X_i\Big/{\sum_j X_j^2}\right\}\,\right|\,\mathbf X \right]\right]\\
&=\mathbb E\left[\prod_i \exp\left\{- t^2 X_i^2 \Big/2\Big\{\sum_j X_j^2\Big\}^2\right\}\right]\\
&=\mathbb E\left[ \exp\left\{- t^2 \Big/2{\sum_j X_j^2}\right\}\right]
\end{align*}
where the step from the third to the fourth line uses the fact that, given $\mathbf X$, the $Y_i$ remain i.i.d. $\mathcal N(0,1)$, so that $Y_iX_i\big/\sum_j X_j^2\sim\mathcal N\!\left(0,\,X_i^2\big/\{\sum_j X_j^2\}^2\right)$. Since $\sum_j X_j^2\sim\chi^2_n$, setting $\zeta=\sum_j X_j^2\big/2$ (so that $\zeta\sim\text{Gamma}(n/2,1)$) turns the last expectation into
$$\int_0^\infty \zeta^{n/2 - 1}\, \frac{\exp\{-\zeta - t^2/(4\zeta)\}}{\Gamma(n/2)}\ \text{d}\zeta = \frac{2\,(|t|/2)^{n/2}\, K_{n/2}(|t|)}{\Gamma(n/2)},$$
as Wolfram's integrator confirms, where $K_{n/2}$ is the modified Bessel function of the second kind. For $n=1$ this reduces to $\exp(-|t|)$, the characteristic function of the standard Cauchy distribution; for $n>1$ it is not a Cauchy characteristic function but matches the characteristic function of $T/\sqrt n$ with $T\sim t_n$, in agreement with the argument in the question.
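As a numerical cross-check (a sketch, assuming NumPy and SciPy; the sample size and evaluation points are arbitrary choices of mine), the empirical characteristic function of $U$ can be compared with the Bessel-$K$ expression above:

```python
import numpy as np
from scipy.special import kv, gamma

rng = np.random.default_rng(1)
n, reps = 4, 500_000

# Simulate U = sum_i X_i Y_i / sum_i X_i^2.
X = rng.standard_normal((reps, n))
Y = rng.standard_normal((reps, n))
U = (X * Y).sum(axis=1) / (X ** 2).sum(axis=1)

for t in (0.5, 1.0, 2.0):
    # U is symmetric, so its characteristic function is real: E[cos(tU)].
    empirical = np.cos(t * U).mean()
    closed_form = 2 * (abs(t) / 2) ** (n / 2) * kv(n / 2, abs(t)) / gamma(n / 2)
    print(t, empirical, closed_form)  # the two values should agree closely
```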

  • Do you use the independence of $X_i$ and $Y_i$ here? I don't seem to use it in my attempt, so I am wondering whether this assumption changes anything. Commented Jan 29, 2020 at 15:14
  • Yes I do, when going from third to fourth line in the equation.
    – Xi'an
    Commented Jan 29, 2020 at 18:35
