If you are studying elementary probability theory, allow me to reformulate your question as "how can I represent a random variable $X$ with a given CDF $F_X$ in terms of a uniform random variable $U$ on $(0,1)$?" The answer to that is the quantile function: you define
$$G_X(p)=\inf \{ x : F_X(x) \geq p \}$$
and then define $X$ to be $G_X(U)$.
Note that if $F_X$ is invertible then $G_X=F_X^{-1}$, otherwise this is "the right generalization". One can see this by looking at the discrete case: if $P(X=x)=p$ then $P(G_X(U)=x)=p$. This is because a jump of height $p$ in $F_X$ corresponds to a flat region of length $p$ in $G_X$, and the uniform distribution on $(0,1)$ assigns each interval a probability equal to its length.
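A quick sketch of this construction for a discrete distribution (the particular distribution here is made up for illustration):

```python
import random

# Illustrative discrete distribution: P(X=1)=0.2, P(X=2)=0.5, P(X=3)=0.3.
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

def quantile(p):
    """G_X(p) = inf { x : F_X(x) >= p }, computed over the finite support."""
    cumulative = 0.0
    for v, pr in zip(values, probs):
        cumulative += pr
        if cumulative >= p:
            return v
    return values[-1]  # guard against floating-point round-off near p = 1

# X = G_X(U): a jump of height p in F_X becomes a flat region of
# length p in G_X, so P(G_X(U) = x) = P(X = x) for each atom x.
random.seed(0)
samples = [quantile(random.random()) for _ in range(100_000)]
freq = {v: samples.count(v) / len(samples) for v in values}
# freq should be close to {1: 0.2, 2: 0.5, 3: 0.3}
```

Note how `quantile(0.2)` returns `1`: $F_X(1) = 0.2 \geq 0.2$, so the infimum is attained at the left endpoint of the flat region, which is exactly why $\geq$ (rather than $>$) appears in the definition of $G_X$.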
The natural question is now "what's a uniform random variable on $(0,1)$?" Well, it has $F_U(x)=\begin{cases} 0 & x<0 \\ x & x \in [0,1] \\ 1 & x>1 \end{cases}$. But otherwise such a thing is a black box from the elementary point of view.
If you are studying measure-theoretic probability theory then the answer is a bit more explicit. A random variable with CDF $F_X$ is given by the map $G_X : \Omega \to \mathbb{R}$ on the probability space $(\Omega, \mathcal{F}, \mathbb{P})$, where $G_X$ is the quantile function defined above, $\Omega=(0,1)$, $\mathcal{F}$ is the Borel $\sigma$-algebra on $(0,1)$, and $\mathbb{P}$ is Lebesgue measure. Note that on this space the identity function is a uniform random variable on $(0,1)$, so this is really the same construction as the one described above.
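For a concrete invertible case (the rate-1 exponential is used here as an example; it is not part of the original answer): $F_X(x)=1-e^{-x}$ on $[0,\infty)$ is strictly increasing, so $G_X = F_X^{-1}$ has the closed form $G_X(p) = -\ln(1-p)$.

```python
import math
import random

def F(x):
    """CDF of the rate-1 exponential: F(x) = 1 - exp(-x) for x >= 0."""
    return 1.0 - math.exp(-x)

def G(p):
    """Quantile function: since F is strictly increasing, G = F^{-1}."""
    return -math.log(1.0 - p)

# X = G(U) should be a rate-1 exponential; its mean is 1.
random.seed(0)
samples = [G(random.random()) for _ in range(100_000)]
mean = sum(samples) / len(samples)
```

In the measure-theoretic picture this `G` is literally the random variable: a measurable function from $\Omega = (0,1)$ with Lebesgue measure to $\mathbb{R}$.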
In any case these constructions can be generalized to finitely many random variables by looking at the uniform distribution on $(0,1)^n$ instead of $(0,1)$.
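A sketch of the finite-dimensional version: a single uniform point of $(0,1)^2$ yields a pair of independent random variables with prescribed marginals, one quantile function per coordinate (the two marginals here are illustrative choices, not from the original answer):

```python
import math
import random

def G_exp(p):
    """Quantile function of the rate-1 exponential."""
    return -math.log(1.0 - p)

def G_bernoulli(p, q=0.5):
    """Quantile function of a Bernoulli(q) variable: 0 on (0, 1-q], 1 above."""
    return 0 if p <= 1.0 - q else 1

# A point (u1, u2) uniform on (0,1)^2 maps to a pair (X, Y) of
# independent random variables with the prescribed marginal CDFs.
random.seed(0)
pairs = [(G_exp(random.random()), G_bernoulli(random.random()))
         for _ in range(100_000)]
xs = [x for x, _ in pairs]
ys = [y for _, y in pairs]
```

Independence comes for free here because the coordinates of a uniform point on $(0,1)^2$ are independent uniforms on $(0,1)$.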
The question

> given the cumulative distribution function find a random variable that has this distribution

makes sense because a CDF pins down the distribution: if $F(x)$ is the CDF of some random variable $X$, then for any $a < b$ the probability $P(a < X \leq b)$ is nailed down by $F(b)-F(a)$, so $F(x)$ does indeed determine the distribution of $X$ (though not the random variable itself, since different random variables can share the same distribution).