
This is solely a reference request. I have heard a few versions of the following theorem:

If the joint moment generating function satisfies $\mathbb{E}[e^{uX+vY}] = \mathbb{E}[e^{uX}]\,\mathbb{E}[e^{vY}]$ whenever the expectations are finite, then $X$ and $Y$ are independent.

And there is a similar version for characteristic functions. Could anyone provide me a serious reference which proves one or both of these theorems?

Comments:

  • I'm not at all sure that this is true. (Certainly it's true when "if" and "then" are interchanged.) – Commented Jan 26, 2013 at 6:18
  • If this were true, every pair of random variables with subexponential tails on both sides would be independent. Please rephrase this as a plausible statement. – Did, Jan 26, 2013 at 10:57
  • In this post a theorem with a more general result is presented: math.stackexchange.com/questions/1802289/… – Commented May 29, 2016 at 13:55

2 Answers


Theorem (Kac's theorem) Let $X,Y$ be $\mathbb{R}^d$-valued random variables. Then the following statements are equivalent.

  1. $X,Y$ are independent
  2. $\forall \eta,\xi \in \mathbb{R}^d: \mathbb{E}e^{\imath \, (X,Y) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath \, X \cdot \xi} \cdot \mathbb{E}e^{\imath \, Y \cdot \eta}$

Proof:

  • $(1) \Rightarrow (2)$: Straightforward; use $\mathbb{E}(f(X) \cdot g(Y)) = \mathbb{E}(f(X)) \cdot \mathbb{E}(g(Y))$, which holds for bounded measurable $f,g$ whenever $X$ and $Y$ are independent, with $f(x) = e^{\imath \, x \cdot \xi}$ and $g(y) = e^{\imath \, y \cdot \eta}$.
  • $(2) \Rightarrow (1)$: Let $(\tilde{X},\tilde{Y})$ be such that $\tilde{X}$, $\tilde{Y}$ are independent, $\tilde{X} \sim X$, $\tilde{Y} \sim Y$. Then $$\mathbb{E}e^{\imath \, (X,Y) \cdot (\xi,\eta)} \stackrel{(2)}{=} \mathbb{E}e^{\imath \, X \cdot \xi} \cdot \mathbb{E}e^{\imath \, Y \cdot \eta} = \mathbb{E}e^{\imath \tilde{X} \cdot \xi} \cdot \mathbb{E}e^{\imath \tilde{Y} \cdot \eta} = \mathbb{E}e^{\imath (\tilde{X},\tilde{Y}) \cdot (\xi,\eta)}$$ i.e. the characteristic functions of $(X,Y)$ and $(\tilde{X},\tilde{Y})$ coincide. From the uniqueness of the Fourier transform we conclude $(X,Y) \sim (\tilde{X},\tilde{Y})$. Consequently, $X$ and $Y$ are independent.

Remark: It is not important that $X$ and $Y$ are vectors of the same dimension. The same reasoning works if, say, $X$ is an $\mathbb{R}^k$-valued random variable and $Y$ is an $\mathbb{R}^d$-valued random variable.

Reference (not for the given proof, but for the result): David Applebaum, B. V. Rajarama Bhat, Johan Kustermans, J. Martin Lindsay, Michael Schuermann, Uwe Franz, Quantum Independent Increment Processes I: From Classical Probability to Quantum Stochastic Calculus, Theorem 2.1.
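
For a quick numerical sanity check of the equivalence, here is a minimal Monte Carlo sketch (not from the cited reference; the distributions, sample size, and grid of $(\xi,\eta)$ values are arbitrary illustrative choices). It estimates both sides of (2) on a grid and reports the largest discrepancy, once for an independent pair and once for a visibly dependent one:

```python
# Monte Carlo check: compare E[e^{i(xi X + eta Y)}] with E[e^{i xi X}] * E[e^{i eta Y}]
# on a small grid of (xi, eta).  All choices below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def factorization_gap(X, Y, xis, etas):
    """Largest |E e^{i(xi X + eta Y)} - E e^{i xi X} E e^{i eta Y}| over the grid."""
    gap = 0.0
    for xi in xis:
        for eta in etas:
            joint = np.mean(np.exp(1j * (xi * X + eta * Y)))
            prod = np.mean(np.exp(1j * xi * X)) * np.mean(np.exp(1j * eta * Y))
            gap = max(gap, abs(joint - prod))
    return gap

grid = np.linspace(-3.0, 3.0, 7)

# Independent pair: the gap should be of Monte Carlo size, roughly 1/sqrt(n).
X, Y = rng.normal(size=n), rng.normal(size=n)
print("independent pair:", factorization_gap(X, Y, grid, grid))

# Dependent pair (Y is a noisy copy of X): the gap stays bounded away from 0.
X = rng.normal(size=n)
Y = X + 0.1 * rng.normal(size=n)
print("dependent pair:  ", factorization_gap(X, Y, grid, grid))
```

For the independent pair the reported gap is small (and shrinks as $n$ grows), while for the dependent pair it does not, in line with the theorem.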

Comments:

  • Do we need $X,Y \in L^1$? I don't see it being used in the proof. – Commented Jun 13, 2016 at 21:04
  • @takecare You are right; it's not needed for the proof. – saz, Jun 14, 2016 at 5:38
  • @AnselB If $\mathbb{E}e^{i (X \xi + Y \eta)} = \mathbb{E}e^{i \xi X} \mathbb{E}e^{i \eta Y}$ for all $\xi$, $\eta$, then $X$ and $Y$ are independent; that's exactly what the proof shows. If you prefer, think about it like this: denote by $\mathbb{P}_X$ and $\mathbb{P}_Y$ the distributions of $X$ and $Y$, respectively. Then the characteristic function of the product measure $\mu = \mathbb{P}_X \times \mathbb{P}_Y$ is given by $$\hat{\mu}(\xi,\eta) = \mathbb{E}e^{i \xi X} \mathbb{E}e^{i \eta Y},$$ which is, by assumption, also the characteristic function of $(X,Y)$. – saz, Feb 4, 2017 at 6:29
  • @takecare Yes; if $$\mathbb{E}\exp \left( i \sum_{j=1}^n \xi_j X_j \right) = \prod_{j=1}^n \mathbb{E}\exp(i \xi_j X_j)$$ for all $\xi_1,\ldots,\xi_n$, then the random variables $X_1,\ldots,X_n$ are independent. – saz, Jun 4, 2017 at 4:45
  • @John I'm using characteristic functions (which are essentially nothing but the Fourier transform, or the inverse Fourier transform up to a constant, depending on which definition you use). Note that $\mathbb{R}^d \ni \xi \mapsto \mathbb{E}(e^{i \xi \cdot X})$ is the characteristic function, not the moment generating function. – saz, Nov 20, 2019 at 14:02

Building on the answer by saz: if $X$ and $Y$ have a joint density, here is another proof of (2) $\Rightarrow$ (1). By the inverse Fourier transform,
$$f_\mathbf{X}(\mathbf{x})=\frac{1}{(2\pi)^n}\int_{\mathbb{R}^n}e^{-j\mathbf{v}'\mathbf{x}}\,\phi_\mathbf{X}(\mathbf{v})\,d\mathbf{v},$$

where $\mathbf{x}$ and $\mathbf{v}$ are column vectors; in our case $\mathbf{x} = [x\ y]'$ and $\mathbf{v} = [v_1\ v_2]'$.

Therefore,
$$f_{XY}(x,y)=\frac{1}{(2\pi)^2}\iint e^{-j(v_1x+v_2y)}\,\phi_{XY}(v_1,v_2)\,dv_1\,dv_2\\ \stackrel{(2)}{=}\frac{1}{2\pi}\int e^{-jv_1x}\,\phi_{X}(v_1)\,dv_1 \cdot \frac{1}{2\pi}\int e^{-jv_2y}\,\phi_{Y}(v_2)\,dv_2\\ =f_X(x)\,f_Y(y).$$

Since the joint probability density function (pdf) factors into the product of the marginal pdfs, $X$ and $Y$ are independent; this factorization is exactly the definition of independence for continuous random variables. The same argument should work for discrete random variables, with probability mass functions in place of densities.
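
For a concrete instance of this factorization (an illustration, not part of the argument above), take $(X,Y)$ jointly Gaussian with standard normal marginals and correlation $\rho = 0$. Then
$$\phi_{XY}(v_1,v_2)=e^{-(v_1^2+v_2^2)/2}=\phi_X(v_1)\,\phi_Y(v_2),$$
so the inversion above gives
$$f_{XY}(x,y)=\frac{1}{2\pi}e^{-(x^2+y^2)/2}=f_X(x)\,f_Y(y),$$
recovering the familiar fact that uncorrelated jointly Gaussian random variables are independent.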

Comments:

  • Why do you assume $X$ and $Y$ have a joint density function? – Commented Oct 13, 2019 at 19:45
  • You are right; I didn't think of that. I edited my answer correspondingly. – Shuang Liu, Oct 13, 2019 at 21:04
