The textbook way can be justified by the following facts about mutually independent exponential random variables $\{W_i\}_{i=1}^{\infty}$ (with rate parameters $\{\lambda_i\}_{i=1}^{\infty}$) that "race" against each other:
The winning time is exponentially distributed:
$$\min[W_1, \dots, W_n] \sim \mathrm{Exp}(\lambda_1+\dots+\lambda_n)$$
The probability of winning is proportional to the rate parameter:
$$P[W_1<W_2] = \frac{\lambda_1}{\lambda_1+\lambda_2}$$
The winning time is independent of who wins:
$$\{W_1= \min[W_1, \dots, W_n]\} \quad \text{is independent of} \quad \min[W_1, \dots, W_n]$$
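These three facts are easy to check empirically. Here is a quick Monte Carlo sketch for a two-way race (the rates $\lambda_1=2$, $\lambda_2=3$ are arbitrary choices for illustration):

```python
import random

# Arbitrary illustrative rates for a two-way race
lam1, lam2 = 2.0, 3.0
random.seed(0)
n = 200_000

wins = 0                 # races won by W1
win_times = []           # winning times overall
times_when_1_wins = []   # winning times split by winner,
times_when_2_wins = []   # to probe independence (property 3)
for _ in range(n):
    w1 = random.expovariate(lam1)
    w2 = random.expovariate(lam2)
    m = min(w1, w2)
    win_times.append(m)
    if w1 < w2:
        wins += 1
        times_when_1_wins.append(m)
    else:
        times_when_2_wins.append(m)

# Property 1: E[min] should be 1/(lam1+lam2) = 0.2
mean_min = sum(win_times) / n
# Property 2: P[W1 < W2] should be lam1/(lam1+lam2) = 0.4
p_win = wins / n
# Property 3: the winning time should look the same no matter who won
mean_given_1 = sum(times_when_1_wins) / len(times_when_1_wins)
mean_given_2 = sum(times_when_2_wins) / len(times_when_2_wins)
```

Comparing means only probes property 1 and property 3 partially; a full check would compare entire empirical distributions, but the means already agree with the stated values to Monte Carlo accuracy.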
So if $X,Y,Z$ are mutually independent exponential random variables with rate parameters $\lambda_X, \lambda_Y, \lambda_Z$, we can define $T=\min[Y,Z]$, and property 3 implies that $T$ is independent of the event $\{Y<Z\}$. Since $X$ is independent of $(Y,Z)$, the pair $(X,T)$ is independent of $\{Y<Z\}$, and hence
\begin{align}
P[X<Y<Z] &= P[Y<Z]P[X<\min[Y, Z] | Y<Z] \\
&=P[Y<Z]P[X<T|Y<Z]\\
&=P[Y<Z]P[X<T]\\
&=\frac{\lambda_Y}{\lambda_Y+\lambda_Z}\cdot \frac{\lambda_X}{\lambda_X+\lambda_Y+\lambda_Z}
\end{align}
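The closed form is easy to sanity-check by simulation. A minimal sketch, with arbitrarily chosen rates:

```python
import random

# Arbitrary illustrative rates
lamX, lamY, lamZ = 1.0, 2.0, 3.0
random.seed(1)
n = 500_000

hits = 0
for _ in range(n):
    x = random.expovariate(lamX)
    y = random.expovariate(lamY)
    z = random.expovariate(lamZ)
    if x < y < z:
        hits += 1

estimate = hits / n
# Closed form: lamY/(lamY+lamZ) * lamX/(lamX+lamY+lamZ) = 1/15 here
predicted = (lamY / (lamY + lamZ)) * (lamX / (lamX + lamY + lamZ))
```

With these rates the formula gives $\frac{2}{5}\cdot\frac{1}{6} = \frac{1}{15} \approx 0.0667$, and the empirical frequency matches to within Monte Carlo error.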
On the other hand, a more direct way of approaching the problem, which holds for any mutually independent $X,Y,Z$ that have PDFs (even if they are not exponentially distributed), is to condition on all possible values of $Y$:
\begin{align}
P[X<Y<Z] &= \int_{-\infty}^{\infty} P[X<Y<Z|Y=y]f_Y(y)dy\\
&=\int_{-\infty}^{\infty} P[X<y<Z|Y=y]f_Y(y)dy\\
&=\int_{-\infty}^{\infty}P[X<y<Z]f_Y(y)dy\\
&=\int_{-\infty}^{\infty} P[X<y]P[y<Z]f_Y(y)dy
\end{align}
where the third line uses the independence of $(X,Z)$ from $Y$, and the last line uses the independence of $X$ and $Z$. Now if we substitute the exponential facts $P[X<y] = 1-e^{-\lambda_X y}$, $P[y<Z] = e^{-\lambda_Z y}$, and $f_Y(y) = \lambda_Y e^{-\lambda_Y y}$ for $y \geq 0$, we get
\begin{align}
P[X<Y<Z]&=\int_0^{\infty} (1-e^{-\lambda_Xy})e^{-\lambda_Zy}\lambda_Ye^{-\lambda_Yy}dy\\
&=\lambda_Y\int_0^{\infty} e^{-(\lambda_Y+\lambda_Z)y}dy - \lambda_Y\int_0^{\infty}e^{-(\lambda_X+\lambda_Y+\lambda_Z)y}dy\\
&=\frac{\lambda_Y}{\lambda_Y+\lambda_Z} - \frac{\lambda_Y}{\lambda_X+\lambda_Y+\lambda_Z}\\
&=\frac{\lambda_Y}{\lambda_Y+\lambda_Z}\cdot \frac{\lambda_X}{\lambda_X+\lambda_Y+\lambda_Z}
\end{align}
which is the same answer as before.
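The integral can also be evaluated numerically, which checks the calculus without redoing it. A sketch using a midpoint Riemann sum (the rates are arbitrary illustrative choices, and the integral is truncated at $y=20$, where the exponentially decaying tail is negligible):

```python
import math

# Arbitrary illustrative rates
lamX, lamY, lamZ = 1.0, 2.0, 3.0

def integrand(y):
    # P[X < y] * P[y < Z] * f_Y(y) for exponential X, Y, Z
    return (1 - math.exp(-lamX * y)) * math.exp(-lamZ * y) * lamY * math.exp(-lamY * y)

# Midpoint Riemann sum on [0, 20]; the integrand decays at least as fast
# as e^{-(lamY+lamZ) y}, so the truncated tail is vanishingly small
h = 1e-4
numeric = h * sum(integrand((k + 0.5) * h) for k in range(int(20 / h)))

closed_form = (lamY / (lamY + lamZ)) * (lamX / (lamX + lamY + lamZ))
```

The Riemann sum agrees with the closed form $\frac{\lambda_Y}{\lambda_Y+\lambda_Z}\cdot\frac{\lambda_X}{\lambda_X+\lambda_Y+\lambda_Z}$ to high precision.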