$\begingroup$

Our lecturer said that if two random variables are independent, it should usually be "obvious" from their joint density. In the following examples, he then indeed proceeded to prove independence by guessing what the marginal densities would be.

I am not sure why that works; here's his reasoning (I tried to follow it as accurately as I could):

Let's assume we can find two functions $g,h$ such that

$$f_{X,Y}(x,y) = g(x)\, h(y).$$

Then the marginal densities are

$$f_X(x) = \int f_{X,Y}(x,y)\, dy = g(x) \cdot c, \qquad c = \int h(y)\, dy,$$

$$f_Y(y) = \int f_{X,Y}(x,y)\, dx = h(y) \cdot d, \qquad d = \int g(x)\, dx.$$

So the marginal densities equal our guesses up to multiplicative constants, but

$$f_X(x)\, f_Y(y) = g(x)\, h(y) \cdot cd = f_{X,Y}(x,y) \cdot cd = f_{X,Y}(x,y),$$

because

$$c \cdot d = \int h(y)\, dy \cdot \int g(x)\, dx = \iint g(x)\, h(y)\, dx\, dy = \iint f_{X,Y}(x,y)\, dx\, dy = 1.$$
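For what it's worth, here is a quick symbolic check of the mechanics on a made-up density (my own hypothetical example, not from the lecture), using sympy:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)

# Hypothetical joint density on [0, 1]^2: f(x, y) = 4xy (X and Y independent by design).
f = 4 * x * y

# A guessed factorization f = g(x) * h(y); note g and h need not be the marginals themselves.
g = x
h = 4 * y

# The constants from the argument above.
c = sp.integrate(h, (y, 0, 1))   # c = integral of h(y) dy = 2
d = sp.integrate(g, (x, 0, 1))   # d = integral of g(x) dx = 1/2

# True marginals, computed directly from the joint density.
f_X = sp.integrate(f, (y, 0, 1))  # 2*x
f_Y = sp.integrate(f, (x, 0, 1))  # 2*y

assert sp.simplify(f_X - g * c) == 0     # f_X = g(x) * c
assert sp.simplify(f_Y - h * d) == 0     # f_Y = h(y) * d
assert sp.simplify(c * d - 1) == 0       # c*d = 1 since f integrates to 1
assert sp.simplify(f_X * f_Y - f) == 0   # hence f_X * f_Y = f_{X,Y}
```

Here $g$ and $h$ differ from the true marginals $2x$ and $2y$ exactly by the constants $c = 2$ and $d = 1/2$, which is the situation I ask about in question 3 below.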

Now, I generally follow, but I remain unconvinced/unsure. Here are some additional questions:

  1. Mainly, is this correct and true? Does simply separating the variables like that into individual functions prove independence?

  2. Is this a common approach? I've never encountered it and now he uses it multiple times per lecture, since usually he can just guess the functions right away.

  3. If I understand correctly, if I find $g,h$, I haven't found the marginal distributions (for that I'd have to find $c$ and $d$), but I have proven they are independent. Is that correct?

Thank you.

edit: Due to BCLC's answer, I'll just add this comment in case further discussion should take place: I am fully aware of measure theory and of what he calls advanced probability theory (I mean, I am a beginner, but I do understand how to build the probability measure "from scratch").

$\endgroup$
  • $\begingroup$ Yes. Yes. Yes. $\endgroup$
    – Did
    Commented Nov 1, 2015 at 19:45
  • $\begingroup$ If you make that an answer, I'll gladly accept it. Of course, if someone writes some extra thoughts that would further clarify this, it'd be even better, but as the question stands, this is sufficient as an answer. $\endgroup$
    – Dahn
    Commented Nov 1, 2015 at 19:48
  • $\begingroup$ If you know what the joint density is then something may be obvious from the joint density. But how about this example involving the Pareto distribution: Suppose $\displaystyle f_X(x) = \alpha \kappa^\alpha / x^{\alpha+1}$ for $x>\kappa$, for some $\kappa>0$ and $\alpha>0$. Let $X_1,\ldots, X_n$ be an i.i.d. sample from this distribution, and let $X_{(1)},\ldots,X_{(n)}$ be the order statistics. Suppose $1\le r<s\le n$. Then $X_{(r)}$ and $X_{(s)}/X_{(r)}$ are independent. Is it obvious from the density that those are independent? Only if you know the density. $\endgroup$ Commented Nov 19, 2015 at 19:46
  • $\begingroup$ @MichaelHardy Forgive me for being a bit slow, but I don't understand the sentence "is it obvious from the density [...] only if you know the density". It sounds very tautological to me; what am I missing? $\endgroup$
    – Dahn
    Commented Nov 20, 2015 at 9:00
  • $\begingroup$ @DahnJahn : The point is that sometimes you don't know the density. In the example above, you know how $X_1,\ldots,X_n$ are distributed, and you know what function of those is $S^2$, but that doesn't mean you know the density of $S^2$, let alone the joint density of $\bar X$ and $S^2$. And similarly in my example involving the Pareto distribution. $\endgroup$ Commented Nov 20, 2015 at 19:46

1 Answer

$\begingroup$
  1. Yes. In elementary probability, the definition of independence of random variables is that their pdfs (if they exist) split up, i.e.

$$f_{X_1, \dots, X_n} (x_1, \dots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i) \tag{$*$} $$

Some books use cdfs instead, i.e.

$$F_{X_1, \dots, X_n} (x_1, \dots, x_n) = \prod_{i=1}^{n} F_{X_i}(x_i) \tag{$**$} $$

$(*)$ and $(**)$ can be shown to be equivalent.
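
As a quick sanity check of this equivalence on a made-up example (two independent exponentials; mine, not from any book):

```python
import sympy as sp

x, y, s, t = sp.symbols('x y s t', positive=True)

# Hypothetical example: X ~ Exp(2) and Y ~ Exp(3), independent by construction.
f_X = 2 * sp.exp(-2 * x)
f_Y = 3 * sp.exp(-3 * y)
f_XY = f_X * f_Y  # joint pdf; (*) holds by construction

# Joint cdf: F(s, t) = double integral of f(x, y) over [0, s] x [0, t].
F_XY = sp.integrate(f_XY, (y, 0, t), (x, 0, s))

# Marginal cdfs.
F_X = sp.integrate(f_X, (x, 0, s))  # 1 - exp(-2*s)
F_Y = sp.integrate(f_Y, (y, 0, t))  # 1 - exp(-3*t)

# (**) holds as well: the joint cdf is the product of the marginal cdfs.
assert sp.simplify(F_XY - F_X * F_Y) == 0
```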

In advanced probability, that is not the definition, but the definition can be shown to be equivalent to the definition in elem prob (source):


$X$ and $Y$ are independent if and only if $$P(X \in A,\ Y \in B) = P(X \in A)\, P(Y \in B)$$ for all sets $A$ and $B$.


In elem prob, $A$ and $B$ above are assumed to be arbitrary subsets of $\mathbb R$. In adv prob, not all subsets of $\mathbb R$ work: $A$ and $B$ range over the Borel sets only. The fact that it suffices to check the factorization on a suitable generating class of sets is a corollary of a lemma in measure theory called 'the uniqueness of extension'.

Again, assuming the random variables have pdfs, it can be shown that their pdfs split up iff their cdfs split up. QED
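
To see one direction concretely: if the cdfs split up, differentiating recovers the pdf factorization (the converse follows by integrating $(*)$):

$$f_{X,Y}(x,y) = \frac{\partial^2}{\partial x\, \partial y} F_{X,Y}(x,y) = \frac{\partial^2}{\partial x\, \partial y} \Big[ F_X(x)\, F_Y(y) \Big] = F_X'(x)\, F_Y'(y) = f_X(x)\, f_Y(y)$$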

  2. In elem prob, yes.

  3. I think no. I know @Did said yes, and he is one of the probability experts of Math SE and will probably downvote this answer as soon as he sees it, but:

If indeed $\exists \ g, h$ s.t.

$$f_{X,Y}(x,y) = g(x) h(y)$$

then we can compute

$$f_X(x) = g(x) \ \int h(y) \, dy$$ and $$f_Y(y) = h(y) \ \int g(x) \, dx$$
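
and since $1 = \iint f_{X,Y}(x,y)\, dx\, dy = \int g(x)\, dx \cdot \int h(y)\, dy$, multiplying these two marginals gives

$$f_X(x)\, f_Y(y) = g(x)\, h(y) \cdot \int h(y)\, dy \int g(x)\, dx = f_{X,Y}(x,y),$$

so the marginal pdfs are in fact pinned down: each is the guessed factor times an explicitly computable constant.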

$\endgroup$
  • $\begingroup$ Two things I don't understand. a) You say (in your answer to question 2) that this is common only in elementary probability; what does that mean? I mean, if you have such an elegant approach, why not use it in what you call "advanced probability"? (Also, it's worth noting this course is exactly the course that presupposes measure theory and prerequisite knowledge of mt-based probability theory.) b) How does the comment on the third statement contradict it? I don't see that. To me it seems that you just repeated my lecturer's reasoning. Thank you! $\endgroup$
    – Dahn
    Commented Nov 20, 2015 at 8:54
  • $\begingroup$ @DahnJahn A Not all RVs have PDFs. B You said if you know g and h you don't necessarily know the marginal PDFs but actually I think you do? $\endgroup$
    – BCLC
    Commented Nov 20, 2015 at 12:29
  • $\begingroup$ a) Ah, yes. In my mind you painted a world where, if you're advanced enough, you wouldn't use these caveman techniques :) b) Ah, this clears things up! You and Did agree, it's just that you misunderstood the question - I am saying you know them immediately up to constants, and those constants can be calculated exactly the way you said (please check my post again to see if you agree that we agree!) $\endgroup$
    – Dahn
    Commented Nov 20, 2015 at 13:07
  • $\begingroup$ Yes, that is exactly what was meant, glad it's resolved (and glad you don't have to argue with Did) $\endgroup$
    – Dahn
    Commented Nov 20, 2015 at 15:40
  • $\begingroup$ @DahnJahn Why? Have you seen him and me argue before? Hahahaha $\endgroup$
    – BCLC
    Commented Nov 20, 2015 at 16:02
