Let $X$ be a random variable uniformly distributed over a nontrivial interval $[c,d]$, and let $Y = aX+b$. For what choice of real constants $a$ and $b$ is $Y$ uniformly distributed over $[0,1]$?
How could I fully comprehend this question? This is my interpretation so far:
$Y$ is a function of $X$. Does that mean the interval of $Y$ is a subset of the interval of $X$? If so, must the interval of $Y$ always be a subset of $[c,d]$?
From $Y = aX+b$, the CDF of $Y$ is
$$F_Y(y) = P(Y\leq y) = P(aX+b \leq y) = P\!\left(X \leq \frac{y-b}{a}\right) \quad (a > 0).$$
Hence $F_Y(y) = F_X\!\left(\frac{y-b}{a}\right)$. (If $a < 0$, dividing by $a$ flips the inequality, so this step requires $a > 0$.)
So, for $Y$ to be uniformly distributed over $[0,1]$, the endpoints of $[c,d]$ must map to $0$ and $1$:
$$ac + b = 0, \qquad ad + b = 1,$$
which gives
$$a = \frac{1}{d-c}, \qquad b = -\frac{c}{d-c}.$$
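As a quick sanity check of the transform, here is a minimal simulation sketch (the interval $[2,5]$ is an arbitrary choice for illustration): it maps uniform samples from $[c,d]$ through $Y = aX + b$ with $a = 1/(d-c)$ and $b = -c/(d-c)$, then checks that the results land in $[0,1]$ with mean close to $1/2$.

```python
import random

# Hypothetical nontrivial interval [c, d]; any c < d works.
c, d = 2.0, 5.0
a = 1.0 / (d - c)   # scale so the interval length maps to 1
b = -c / (d - c)    # shift so the left endpoint maps to 0

# Draw X ~ Uniform[c, d] and apply Y = aX + b.
samples = [a * random.uniform(c, d) + b for _ in range(100_000)]

# Y should lie in [0, 1] and have mean near 1/2.
print(min(samples) >= 0.0 and max(samples) <= 1.0)
print(abs(sum(samples) / len(samples) - 0.5) < 0.01)
```

This only checks necessary conditions numerically; the derivation above via the endpoint equations is the actual argument.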
Let me know if I have the right approach to the problem!