$x_2$ is equal in distribution to $U_1 U_2$ where $U_1$ and $U_2$ are independent uniform$(0,1)$ random variables. It is very easy to find the distribution of $U_1 U_2$...
EDIT: Continuing this iteratively, so that $x_{n+1}$ is uniform on $(0,x_n)$, the infinite sum $x:=x_1+x_2+x_3+\cdots$ is equal in distribution to $U_1 + U_1 U_2 + U_1 U_2 U_3 + \cdots$, where the $U_i$ are independent uniform$(0,1)$ rv's; the sum is finite with probability $1$, since by the monotone convergence theorem ${\rm E}[x] = \sum_{n \geq 1} {\rm E}[U_1 \cdots U_n] = \sum_{n \geq 1} 2^{-n} = 1 < \infty$. The distribution of $x$ is the Dickman distribution.
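(Not part of the argument, but this is easy to check numerically. A minimal Monte Carlo sketch in Python, estimating ${\rm E}[x]$, which should be $\sum_{n \geq 1} 2^{-n} = 1$; the truncation depth and sample size are arbitrary choices:)

```python
import random

random.seed(0)

def dickman_sample(depth=40):
    """One (truncated) sample of x = U1 + U1*U2 + U1*U2*U3 + ..."""
    total, prod = 0.0, 1.0
    for _ in range(depth):
        prod *= random.random()  # running product U1*...*Uk
        total += prod
    return total

trials = 100_000
mean = sum(dickman_sample() for _ in range(trials)) / trials
# E[x] = sum_k E[U1*...*Uk] = sum_k 2^{-k} = 1
```

(The truncation is harmless: the expected value of the discarded tail is $\sum_{k > 40} 2^{-k} \approx 10^{-12}$.)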
EDIT: Here are three ways to compute the distribution function of $x_2$ (the first is the direct approach). It is given, for $0 < x \leq 1$, by $F(x)=x - x \log x$; hence $x_2$ has probability density function $f(x)=-\log x$, $0 < x < 1$ (as leonbloy already found).
Approach 1): Since conditioned on $x_1 = s$, $0 < s < 1$, $x_2$ is uniformly distributed on $(0,s)$, the law of total probability gives, for $0 < x \leq 1$,
$$
{\rm P}(x_2 \le x) = \int_0^1 {{\rm P}(x_2 \le x|x_1 = s)ds} = \int_0^x {{\rm P}(x_2 \le x|x_1 = s)ds} + \int_x^1 {{\rm P}(x_2 \le x|x_1 = s)ds}
$$
$$
= \int_0^x {1ds} + \int_x^1 {\frac{x}{s}ds} = x - x\log x.
$$
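(As a quick sanity check on this computation, the two integrals can be evaluated numerically; the test point and step count below are arbitrary choices:)

```python
import math

def cdf_via_total_probability(x, steps=100_000):
    """Numerical value of int_0^x 1 ds + int_x^1 (x/s) ds."""
    h = (1.0 - x) / steps
    # int_0^x 1 ds = x exactly; approximate int_x^1 (x/s) ds by the midpoint rule
    tail = sum(x / (x + (i + 0.5) * h) for i in range(steps)) * h
    return x + tail

x = 0.4  # arbitrary test point in (0, 1)
closed_form = x - x * math.log(x)
```

The numerical value agrees with $x - x\log x$ to well under $10^{-6}$.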
Approach 2): $x_2$ is distributed as $U_1 U_2$, where $U_1$ and $U_2$ are independent uniform$(0,1)$ rv's. Hence,
$$
{\rm P}(x_2 \le x) = {\rm P}(U_1 U_2 \le x) = \int_0^1 {{\rm P}(U_1 U_2 \le x|U_1 = s)ds} = \int_0^1 {{\rm P}\bigg(U_2 \le \frac{x}{s}\bigg)ds}
$$
$$
= \int_0^x {{\rm P}\bigg(U_2 \le \frac{x}{s}\bigg)ds} + \int_x^1 {{\rm P}\bigg(U_2 \le \frac{x}{s}\bigg)ds} =
\int_0^x {1ds} + \int_x^1 {\frac{x}{s}ds} = x - x \log x.
$$
(So, approaches 1) and 2) are quite similar: both rely on the law of total probability.)
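(Either way, the resulting distribution function is easy to confirm by simulation; a quick sketch, with sample size and test point chosen arbitrarily:)

```python
import math
import random

random.seed(1)

trials = 200_000
x = 0.3  # arbitrary test point in (0, 1]
# empirical frequency of the event {U1*U2 <= x}
hits = sum(random.random() * random.random() <= x for _ in range(trials))
empirical = hits / trials
exact = x - x * math.log(x)  # F(x) = x - x log x
```

With these parameters the empirical frequency matches $F(0.3) \approx 0.661$ to within Monte Carlo error.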
Approach 3): Let $U_i$, $i=1,2$, be as above, and note that $-\log U_i$ is exponential$(1)$. Then, for any $0 < x \leq 1$,
$$
{\rm P}(x_2 \le x) = {\rm P}(U_1 U_2 \le x) = {\rm P}(\log U_1 + \log U_2 \le \log x) = {\rm P}( - \log x \le \xi_1 + \xi_2 ),
$$
where $\xi_1$ and $\xi_2$ are independent exponential$(1)$ rv's. The random variable $\xi_1+\xi_2$ has gamma$(2,1)$ density function
$ye^{-y}$, $y > 0$. Hence,
$$
{\rm P}(x_2 \le x) = \int_{ - \log x}^\infty {ye^{ - y} dy} = -e^{-y}(y+1) \big|_{ - \log x}^\infty = x - x \log x.
$$
This approach can be useful for determining the distribution of $U_1 \cdots U_n$, where $U_i$ are independent uniform$(0,1)$ rv's.
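For example, since $\xi_1 + \cdots + \xi_n$ has the gamma$(n,1)$ (Erlang) distribution, the same computation gives ${\rm P}(U_1 \cdots U_n \le x) = x\sum_{k=0}^{n-1} \frac{(-\log x)^k}{k!}$ for $0 < x \leq 1$ (which reduces to $x - x\log x$ when $n=2$). A numerical sketch checking the case $n=3$; the test point and sample size are arbitrary:

```python
import math
import random

random.seed(2)

def product_cdf(x, n):
    """P(U_1*...*U_n <= x): the Erlang(n,1) tail probability at t = -log x."""
    t = -math.log(x)
    return x * sum(t ** k / math.factorial(k) for k in range(n))

trials = 200_000
x = 0.2  # arbitrary test point in (0, 1]
# empirical frequency of the event {U1*U2*U3 <= x}
hits = sum(
    random.random() * random.random() * random.random() <= x for _ in range(trials)
)
empirical = hits / trials
```
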