I am trying to learn about how to find the probability distribution for functions of random variables in general.
Suppose we have a random variable $X$ with pdf $f_X$ and a random variable $Y$ with pdf $f_Y$. Assuming they are independent (a stronger condition than merely being uncorrelated), the following properties hold:
- If $Z = X + Y$, then (convolution rule):
$$f_Z(z) = (f_X * f_Y)(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z - x) dx$$
- If we define a new variable $W = g(X)$, where $g$ is monotonic and differentiable (so that $g^{-1}$ exists), then (change-of-variables rule):
$$f_W(w) = f_X(g^{-1}(w)) \left|\frac{d}{dw}g^{-1}(w)\right|$$
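As a sanity check, both rules can be verified numerically. Here is a small Monte Carlo sketch of my own (the choices of standard normal inputs and $g(x) = e^x$ are mine, purely for illustration), comparing histogram density estimates against the densities the two rules predict:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
h = 0.05  # half-width of the histogram bin used for density estimates

# Convolution rule: Z = X + Y for independent standard normals.
# The sum is N(0, 2), so the rule predicts f_Z(0) = 1 / sqrt(2*pi*2).
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = x + y
est = np.mean(np.abs(z) < h) / (2 * h)   # fraction of samples near 0, per unit length
theory = 1 / np.sqrt(2 * np.pi * 2)
print(est, theory)  # should agree to ~2 decimal places

# Change-of-variables rule: W = g(X) = exp(X) with X ~ N(0, 1).
# g^{-1}(w) = log(w) and |d/dw log(w)| = 1/w, so the rule predicts
# f_W(w) = f_X(log w) / w  (the lognormal pdf).
w = np.exp(rng.standard_normal(n))
w0 = 1.0
est2 = np.mean(np.abs(w - w0) < h) / (2 * h)
theory2 = np.exp(-np.log(w0) ** 2 / 2) / (np.sqrt(2 * np.pi) * w0)
print(est2, theory2)  # should agree to ~2 decimal places
```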
My Question: Can the above rules be applied to any situation?
As an example, I wanted to find out the pdf of $X/Y$. I was not sure if the above rules could be applied to this problem.
I tried to work through this problem from first principles, without using the above rules:
Define $Z$:
$$Z = \frac{X}{Y}$$
Write the cumulative distribution function (CDF) of $Z$:
$$F_Z(z) = P(Z \leq z) = P\left(\frac{X}{Y} \leq z\right)$$
Split into two cases based on the sign of $Y$ (i.e. a piecewise approach); note that $P(Y = 0) = 0$ since $Y$ is continuous:
$$F_Z(z) = P(X \leq zY, Y > 0) + P(X \geq zY, Y < 0)$$
Express this using the joint pdf of $X$ and $Y$:
$$F_Z(z) = \int_{0}^{\infty} \int_{-\infty}^{zy} f_{X,Y}(x,y) dx dy + \int_{-\infty}^{0} \int_{zy}^{\infty} f_{X,Y}(x,y) dx dy$$
Differentiate with respect to $z$ (the pdf is the derivative of the CDF):
$$\begin{align} f_Z(z) &= \frac{d}{dz}F_Z(z) \\ &= \int_{0}^{\infty} y f_{X,Y}(zy,y) dy - \int_{-\infty}^{0} y f_{X,Y}(zy,y) dy \end{align}$$
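The differentiation under the integral sign is justified by the Leibniz rule: in each inner integral only a limit of integration depends on $z$, so the fundamental theorem of calculus and the chain rule give

$$\frac{d}{dz}\int_{-\infty}^{zy} f_{X,Y}(x,y) dx = y f_{X,Y}(zy,y), \qquad \frac{d}{dz}\int_{zy}^{\infty} f_{X,Y}(x,y) dx = -y f_{X,Y}(zy,y).$$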
Combine integrals:
$$f_Z(z) = \int_{-\infty}^{\infty} |y| f_{X,Y}(zy,y) dy$$
Since $X$ and $Y$ are independent, $f_{X,Y}(x,y) = f_X(x) f_Y(y)$, so:
$$f_Z(z) = \int_{-\infty}^{\infty} |y| f_X(zy) f_Y(y) dy$$
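If this is correct, the formula should reproduce known ratio distributions; for instance, the ratio of two independent standard normals is standard Cauchy. A quick numerical sketch of my own (evaluating the derived integral with a plain Riemann sum) to check this:

```python
import numpy as np

def ratio_pdf(z, f_X, f_Y, y_grid):
    # f_Z(z) = integral of |y| * f_X(z*y) * f_Y(y) dy, via a Riemann sum
    integrand = np.abs(y_grid) * f_X(z * y_grid) * f_Y(y_grid)
    return integrand.sum() * (y_grid[1] - y_grid[0])

def std_normal(t):
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

ys = np.linspace(-10, 10, 20001)  # wide enough that the tails are negligible

for z in (0.0, 0.5, 2.0):
    derived = ratio_pdf(z, std_normal, std_normal, ys)
    cauchy = 1 / (np.pi * (1 + z**2))  # standard Cauchy pdf
    print(f"z={z}: derived={derived:.6f}, Cauchy={cauchy:.6f}")
```

The two columns should agree to several decimal places, which supports the derived formula.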
Have I done this correctly? Can the two rules I mentioned earlier be applied to this problem, or is another approach needed?
Thanks!