
If $X$ and $Y$ are independent Gamma random variables with parameters $(\alpha,\lambda)$ and $(\beta,\lambda)$ respectively, how to show that $U=X+Y$ and $V=X/(X+Y)$ are independent?

  • The standard procedure is to show that the conditional distribution of one variable given the other is the same as its unconditional distribution. Commented Dec 18, 2012 at 9:34
  • @MarioCarneiro I know, but I think that is too complicated. I wonder if there is an easier way?
    – hxhxhx88
    Commented Dec 18, 2012 at 10:23

1 Answer

$U$ and $V$ are obtained from $X$ and $Y$ by the transformation \begin{eqnarray*} \left(\begin{array}{c} U\\ V \end{array}\right) & = & \left(\begin{array}{c} X + Y\\ \frac{X}{X + Y} \end{array}\right) \end{eqnarray*} whose inverse is \begin{eqnarray*} \left(\begin{array}{c} X\\ Y \end{array}\right) & = & \left(\begin{array}{c} UV\\ U \left( 1 - V \right) \end{array}\right) \end{eqnarray*} The Jacobian determinant of the inverse transformation is $$ J = \left|\begin{array}{cc} V & U\\ 1 - V & - U \end{array}\right| = V \left( - U \right) - U \left( 1 - V \right) = - U, $$ so its absolute value is $\left| J \right| = U$ (because $U$ is a positive random variable).
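As a quick sanity check (not part of the original answer), the Jacobian determinant of the inverse map $(u,v)\mapsto(uv,\,u(1-v))$ can be verified symbolically, for instance with SymPy:

```python
import sympy as sp

u, v = sp.symbols('u v', positive=True)

# Inverse transformation: x = u*v, y = u*(1 - v)
x = u * v
y = u * (1 - v)

# Jacobian matrix of (x, y) with respect to (u, v)
J = sp.Matrix([[x.diff(u), x.diff(v)],
               [y.diff(u), y.diff(v)]])

det = sp.simplify(J.det())
print(det)  # -u, so |J| = u since u > 0
```

Since `u` is declared positive, the absolute value of the determinant is indeed $u$.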

It follows that the joint density of $U$ and $V$ is \begin{eqnarray*} f_{U, V} \left( u, v \right) & = & \left| J \right| \times f_{X, Y} \left( uv, u \left( 1 - v \right) \right)\\ & = & u \times f_X \left( uv \right) \times f_Y \left( u \left( 1 - v \right) \right)\\ & = & u \times \frac{1}{\Gamma \left( \alpha \right) \lambda^{\alpha}} \left( uv \right)^{\alpha - 1} e^{- \frac{uv}{\lambda}}\\ & \times & \frac{1}{\Gamma \left( \beta \right) \lambda^{\beta}} \left( u \left( 1 - v \right) \right)^{\beta - 1} e^{- \frac{u \left( 1 - v \right) }{\lambda}}\\ & = & \frac{1}{\Gamma \left( \alpha + \beta \right) \lambda^{\alpha + \beta}} u^{\alpha + \beta - 1} e^{- \frac{u}{\lambda}}\\ & \times & \frac{\Gamma \left( \alpha + \beta \right)}{\Gamma \left( \alpha \right) \Gamma \left( \beta \right)} v^{\alpha - 1} \left( 1 - v \right)^{\beta - 1}\\ & = & f_U \left( u ; \alpha + \beta, \lambda \right) f_V \left( v ; \alpha, \beta \right) \end{eqnarray*} The second equality follows from the independence of $X$ and $Y$, and the third from substituting the gamma densities. The joint density thus factors into the product of two marginal densities: a Gamma$\left( \alpha + \beta, \lambda \right)$ density for $U$ and a Beta$\left( \alpha, \beta \right)$ density for $V$. Because $f_{U,V}(u,v)=f_U(u) f_V(v)$ for every $(u,v)$, $U$ and $V$ are independent.
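The factorization can also be checked by simulation. Below is a minimal Monte Carlo sketch using NumPy (the parameter values are arbitrary choices for illustration): the sample moments of $U$ and $V$ should match those of Gamma$(\alpha+\beta,\lambda)$ and Beta$(\alpha,\beta)$, and the sample correlation of $U$ and $V$ should be near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, lam = 2.0, 3.0, 1.5  # example shape/shape/scale parameters
n = 200_000

# X ~ Gamma(alpha, scale=lam), Y ~ Gamma(beta, scale=lam), independent
x = rng.gamma(alpha, lam, n)
y = rng.gamma(beta, lam, n)

u_s = x + y        # should be Gamma(alpha + beta, scale=lam)
v_s = x / (x + y)  # should be Beta(alpha, beta)

# Compare sample means with the theoretical means
print(u_s.mean(), (alpha + beta) * lam)      # both near 7.5
print(v_s.mean(), alpha / (alpha + beta))    # both near 0.4
# Independence implies zero correlation (necessary, not sufficient)
print(np.corrcoef(u_s, v_s)[0, 1])           # near 0
```

Zero correlation alone does not prove independence, of course; it is only a consistency check on the analytic factorization above.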

  • +1. At the end of the post, $f_{U,V}\ne f_Uf_V$. What you mean is that $f_{U,V}(u,v)=f_U(u)f_V(v)$ for every $(u,v)$, which can be written as $f_{U,V}=f_U\otimes f_V$ if one wishes to use a shorthand.
    – Did
    Commented Dec 18, 2012 at 11:09
  • Thanks did, I will correct that point.
    – Learner
    Commented Dec 18, 2012 at 11:10
  • This method is new to me, thank you very much!
    – hxhxhx88
    Commented Dec 18, 2012 at 15:37
