
Suppose I am given the joint pdf of $X$ and $Y$, and I am asked to find $\operatorname{cov}(X,Y)$.

I know that $\operatorname{cov}(X,Y)=E(XY)-E(X)E(Y)$ and I know how to find $E(X)$ and $E(Y)$.

My questions are:

  1. What is the definition of $E(XY)$? Is it always equal to $$\int_{\mathbb R\times \mathbb R} xyf_X(x)f_Y(y)\,dx\,dy\,?$$ Or only if $X$ and $Y$ are independent? (The solution I have did not check the independence of $X$ and $Y$, and the answer for $\operatorname{cov}(X,Y)$ is not zero, which shows that $X$ and $Y$ are not independent.)

  2. I remember, though not very clearly, that if the joint pdf $f_{X,Y}(x,y)$ of $X$ and $Y$ can be written as $$f_{X,Y}(x,y)=g(x)h(y),$$ then $X$ and $Y$ are independent. Is this always true, or does it require some conditions? For instance, if the region is not $[0,1]\times[0,1]$ but instead $0<x<1,\ x<y<2x$, does the statement still hold?

Thank you so much!

  • 1. We have $E[XY]=\int_{\mathbb R\times\mathbb R}xyF(x,y)\,dx\,dy$ in general, where $F(\cdot,\cdot)$ is the cdf of $(X,Y)$. Your formula is true when $X$ and $Y$ are independent (and of course $X$ and $Y$ have a cdf). 2. You can check that $P(X\leq t_1,Y\leq t_2)=P(X\leq t_1)\cdot P(Y\leq t_2)$ thanks to the hypothesis. Commented Jan 4, 2012 at 20:02
  • $$E[g(X,Y)]=\int_{-\infty}^\infty\int_{-\infty}^\infty g(x,y)f_{X,Y}(x,y)\,dx\,dy$$ holds in general, where $f_{X,Y}(x,y)$ is the joint pdf of $X$ and $Y$. Your integral in 1. is incorrect. The equality $f_{X,Y}(x,y)=g(x)h(y)$ needs to hold at all points $(x,y)$ in the plane, not just at some points, in order for $X$ and $Y$ to be independent random variables. If the joint pdf is nonzero only for $0<x<1,x<y<2x$, then $X$ and $Y$ are dependent random variables; no need to try and see if you can express $f(x,y)$ as $g(x)h(y)$. Commented Jan 4, 2012 at 20:04
  • @DavideGiraudo You probably meant to write pdf instead of cdf? And also $P(X\leq t_1,Y\leq t_2)=P(X\leq t_1)P(Y\leq t_2)$. Commented Jan 4, 2012 at 20:06
  • OK, got it, thank you so much! Commented Jan 4, 2012 at 20:07
  • @Dilip: You should post your comment as an answer, since that covers the OP's questions. Commented Jan 4, 2012 at 20:08

1 Answer


In general, for jointly continuous random variables $X$ and $Y$ with joint pdf $f_{X,Y}(x,y)$, $$E[g(X,Y)]=\int_{-\infty}^\infty\int_{-\infty}^\infty g(x,y)f_{X,Y}(x,y)dx dy.$$ In the special case you are considering, this becomes $$E[XY]=\int_{-\infty}^\infty\int_{-\infty}^\infty xyf_{X,Y}(x,y)dx dy.$$
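For a concrete illustration, these integrals can be evaluated symbolically. Below is a minimal sketch in Python with sympy, using a hypothetical joint pdf $f_{X,Y}(x,y)=x+y$ on the unit square (chosen only because it integrates to 1 there; it is not from the original question):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Hypothetical joint pdf on the unit square; it integrates to 1 there.
f = x + y

E_XY = sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))  # E[XY] = 1/3
E_X = sp.integrate(x * f, (x, 0, 1), (y, 0, 1))       # E[X]  = 7/12
E_Y = sp.integrate(y * f, (x, 0, 1), (y, 0, 1))       # E[Y]  = 7/12

print(E_XY - E_X * E_Y)  # cov(X,Y) = -1/144
```

Here the covariance comes out nonzero, so this particular $X$ and $Y$ cannot be independent, which previews the point below.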

If $X$ and $Y$ are jointly continuous random variables with joint pdf $f_{X,Y}(x,y)$, and $f_{X,Y}(x,y)$ factors into the product of the marginal pdfs $f_X(x)$ and $f_Y(y)$, then $X$ and $Y$ are said to be independent random variables. More useful is the reverse implication: if we assume that $X$ and $Y$ are independent continuous random variables with known pdfs (e.g. standard normal), then they are jointly continuous with joint pdf $f_{X,Y}(x,y)$ equal to the product $f_X(x)f_Y(y)$ of their individual pdfs.

Your expression $\displaystyle E[XY] = \int_{\mathbb R\times \mathbb R} xyf_X(x)f_Y(y)\,dx\,dy$ is incorrect in general, but it is correct when $X$ and $Y$ are independent continuous random variables, since $f_{X,Y}(x,y)=f_X(x)f_Y(y)$ in that case. Indeed, if your expression were correct in general, then we would have $$E[XY] = \int_{\mathbb R\times \mathbb R} xyf_X(x)f_Y(y)\,dx\,dy = \int_{\mathbb R} xf_X(x)\,dx \int_{\mathbb R} yf_Y(y)\,dy = E[X]E[Y],$$ so that $\operatorname{cov}(X,Y)=E[XY]-E[X]E[Y] = 0$ for every pair of random variables, which is clearly not true. So we have the following.

If $X$ and $Y$ are independent random variables, then $E[XY]=E[X]E[Y]$.

Note that this holds for all random variables, not just continuous random variables. Also, as you probably know, the converse is not true: uncorrelated random variables need not be independent.
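Both statements are easy to check by simulation. Here is a small Monte Carlo sketch; the uniform and normal distributions are illustrative assumptions, not taken from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Independent case: E[XY] should agree with E[X]E[Y].
x = rng.uniform(size=n)
y = rng.uniform(size=n)
print((x * y).mean(), x.mean() * y.mean())  # both approximately 0.25

# Uncorrelated but dependent: X ~ N(0,1) and Y = X^2.
x = rng.standard_normal(n)
y = x ** 2
print(np.cov(x, y)[0, 1])  # approximately 0, yet Y is a function of X
```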

With regard to your second question, $X$ and $Y$ are independent if you can find nonnegative functions $g(x)$ and $h(y)$ such that the equality $f_{X,Y}(x,y)=g(x)h(y)$ holds at all points $(x,y)$ in the plane, not just at some points. If the joint pdf is nonzero only for $0<x<1,x<y<2x$, then $X$ and $Y$ are dependent random variables; no need to try and see if you can express $f(x,y)$ as $g(x)h(y)$.
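To make the region example concrete, take the uniform density on $0<x<1,\ x<y<2x$: the region has area $1/2$, so the (hypothetical) pdf is $f=2$ there and $0$ elsewhere. The covariance comes out nonzero, exactly as the support argument predicts:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Hypothetical uniform pdf on the triangle 0 < x < 1, x < y < 2x.
f = 2

E_XY = sp.integrate(x * y * f, (y, x, 2 * x), (x, 0, 1))  # 3/4
E_X = sp.integrate(x * f, (y, x, 2 * x), (x, 0, 1))       # 2/3
E_Y = sp.integrate(y * f, (y, x, 2 * x), (x, 0, 1))       # 1

print(E_XY - E_X * E_Y)  # 1/12, nonzero, so X and Y are dependent
```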

Finally, note that all of the above applies provided the various integrals and expectations are defined or exist. $E[XY]=E[X]E[Y]$ does not apply to independent Cauchy random variables, for example, because $E[X]$ and $E[Y]$ are undefined for Cauchy random variables $X$ and $Y$.
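One way to see the Cauchy pathology numerically is to watch running sample means refuse to settle down as the sample grows; this is only a simulation sketch of the undefined mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Running means of i.i.d. standard Cauchy draws do not converge,
# since E[X] is undefined for a Cauchy random variable.
samples = rng.standard_cauchy(10**6)
for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(n, samples[:n].mean())
```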

