$$\renewcommand{\Pr}{\mathrm{Pr}}\newcommand{\Beta}{\mathrm{Beta}}\newcommand{\Var}{\mathrm{Var}}$$The distribution of the $i$th order statistic of any continuous random variable with a PDF is given by the "beta-F" compound distribution. The intuitive way to think about this distribution is to consider the $i$th order statistic in a sample of size $N$. For the value of the $i$th order statistic of a random variable $X$ to be equal to $x$, we need three conditions:
- $i-1$ values below $x$; each observation falls below $x$ with probability $F_{X}(x)$, where $F_X(x)=\Pr(X<x)$ is the CDF of the random variable $X$.
- $N-i$ values above $x$; each observation falls above $x$ with probability $1-F_{X}(x)$.
- $1$ value inside an infinitesimal interval containing $x$; this has probability $f_{X}(x)dx$, where $f_{X}(x)dx=dF_{X}(x)=\Pr(x<X<x+dx)$ and $f_{X}(x)$ is the PDF of the random variable $X$.
There are ${N \choose 1}{N-1 \choose i-1}=\frac{N!}{(i-1)!(N-i)!}$ ways to make this choice, so we have:
$$f_{i}(x_{i})=\frac{N!}{(i-1)!(N-i)!}f_{X}(x_{i})\left[1-F_{X}(x_{i})\right]^{N-i}\left[F_{X}(x_{i})\right]^{i-1}$$
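As a quick numerical sanity check of this density (illustrative only; the standard normal and the values $i=3$, $N=10$ are arbitrary choices on my part), we can confirm that it integrates to one and that its mean agrees with a direct Monte Carlo simulation of sorted samples:

```python
import numpy as np
from math import comb
from scipy import stats
from scipy.integrate import quad

def order_stat_pdf(x, i, N, dist=stats.norm):
    """Density of the i-th order statistic of N iid draws from dist."""
    F, f = dist.cdf(x), dist.pdf(x)
    return N * comb(N - 1, i - 1) * f * F ** (i - 1) * (1 - F) ** (N - i)

i, N = 3, 10  # illustrative values, chosen arbitrarily

# The density should integrate to one over the whole support.
total, _ = quad(order_stat_pdf, -np.inf, np.inf, args=(i, N))

# Its mean should match a direct Monte Carlo estimate.
exact_mean, _ = quad(lambda x: x * order_stat_pdf(x, i, N), -np.inf, np.inf)
rng = np.random.default_rng(0)
mc_mean = np.sort(rng.standard_normal((200_000, N)), axis=1)[:, i - 1].mean()

print(total, exact_mean, mc_mean)
```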
EDIT: In my original post I made a very poor attempt at going further from this point, and the comments below reflect this. I have sought to rectify this below.
If we take the mean value of this PDF, we get:
$$E(X_{i})=\int_{-\infty}^{\infty} x_{i}f_{i}(x_{i})dx_{i}$$
And in this integral, we make the change of variable $p_{i}=F_{X}(x_{i})$ (taking @henry's hint); writing $\Beta(p|a,b)$ for the $\Beta(a,b)$ density, the integral becomes:
$$E(X_{i})=\int_{0}^{1} F_{X}^{-1}(p_{i})\Beta(p_{i}|i,N-i+1)dp_{i}=E_{\Beta(p_{i}|i,N-i+1)}\left[F_{X}^{-1}(p_{i})\right]$$
So this is the expected value of the inverse CDF, which can be well approximated using the delta method to give:
$$E_{\Beta(p_{i}|i,N-i+1)}\left[F_{X}^{-1}(p_{i})\right]\approx F_{X}^{-1}\left[E_{\Beta(p_{i}|i,N-i+1)}\left(p_{i}\right)\right]=F_{X}^{-1}\left[\frac{i}{N+1}\right]$$
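To see how this first-order approximation behaves, here is a minimal check of my own (again using the arbitrary illustrative values $i=3$, $N=10$ and a standard normal $X$): $F_X^{-1}[i/(N+1)]$ lands close to, but visibly off, the simulated mean of the order statistic:

```python
import numpy as np
from scipy import stats

i, N = 3, 10  # illustrative values

# First-order (delta method) approximation: F^{-1}(i / (N + 1)).
approx1 = stats.norm.ppf(i / (N + 1))

# Monte Carlo estimate of E[X_(i)] for comparison.
rng = np.random.default_rng(0)
mc_mean = np.sort(rng.standard_normal((200_000, N)), axis=1)[:, i - 1].mean()

print(approx1, mc_mean)  # close, but with a visible bias
```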
To make a better approximation, we can expand to second order (primes denoting differentiation), noting that the second derivative of an inverse function is:
$$\frac{\partial^{2}}{\partial a^{2}}F_{X}^{-1}(a)=-\frac{F_{X}^{''}(F_{X}^{-1}(a))}{\left[F_{X}^{'}(F_{X}^{-1}(a))\right]^{3}}=-\frac{f_{X}^{'}(F_{X}^{-1}(a))}{\left[f_{X}(F_{X}^{-1}(a))\right]^{3}}$$
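This identity is easy to verify by finite differences; a quick sketch of my own for the standard normal quantile function, at an arbitrarily chosen point $a=0.3$:

```python
from scipy import stats

a, h = 0.3, 1e-4  # arbitrary evaluation point and step size

# Central finite difference for the second derivative of the quantile function.
fd = (stats.norm.ppf(a + h) - 2 * stats.norm.ppf(a) + stats.norm.ppf(a - h)) / h**2

# Analytic form: -f'(F^{-1}(a)) / f(F^{-1}(a))^3, using f'(x) = -x * phi(x)
# for the standard normal density phi.
x = stats.norm.ppf(a)
analytic = -(-x * stats.norm.pdf(x)) / stats.norm.pdf(x) ** 3

print(fd, analytic)
```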
Let $\nu_{i}=F_{X}^{-1}\left[\frac{i}{N+1}\right]$. Then we have:
$$E_{\Beta(p_{i}|i,N-i+1)}\left[F_{X}^{-1}(p_{i})\right]\approx \nu_{i}-\frac{\Var_{\Beta(p_{i}|i,N-i+1)}\left[p_{i}\right]}{2}\frac{f_{X}^{'}(\nu_{i})}{\left[f_{X}(\nu_{i})\right]^{3}}$$
$$=\nu_{i}-\frac{\left(\frac{i}{N+1}\right)\left(1-\frac{i}{N+1}\right)}{2(N+2)}\frac{f_{X}^{'}(\nu_{i})}{\left[f_{X}(\nu_{i})\right]^{3}}$$
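As a worked example of this second-order formula (my own illustration, not part of the original derivation), take a standard exponential, where $F_X^{-1}(p)=-\ln(1-p)$, $f_{X}^{'}/f_{X}^{3}=-e^{2x}$, and the exact mean $E[X_{(i)}]=\sum_{k=N-i+1}^{N}1/k$ is available in closed form for comparison:

```python
import math

i, N = 3, 10  # illustrative values
p = i / (N + 1)

# First-order term: nu_i = F^{-1}(i/(N+1)) = -log(1 - p) for Exp(1).
nu = -math.log(1 - p)

# Beta(i, N-i+1) variance, written as in the text.
var = p * (1 - p) / (N + 2)

# Second-order correction: -(var/2) * f'(nu)/f(nu)^3 = +(var/2) * exp(2*nu).
approx2 = nu + 0.5 * var * math.exp(2 * nu)

# Exact mean of the i-th order statistic of N standard exponentials.
exact = sum(1 / k for k in range(N - i + 1, N + 1))

print(nu, approx2, exact)
```

The second-order term recovers most of the gap between the first-order value and the exact mean.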
Now, specialising to the normal case, we have:
$$f_{X}(x)=\frac{1}{\sigma}\phi(\frac{x-\mu}{\sigma})\rightarrow f_{X}^{'}(x)=-\frac{x-\mu}{\sigma^{3}}\phi(\frac{x-\mu}{\sigma})=-\frac{x-\mu}{\sigma^{2}}f_{X}(x)$$
$$F_{X}(x)=\Phi(\frac{x-\mu}{\sigma})\implies F_{X}^{-1}(x)=\mu+\sigma\Phi^{-1}(x)$$
Note that $f_{X}(\nu_{i})=\frac{1}{\sigma}\phi\left[\Phi^{-1}\left(\frac{i}{N+1}\right)\right]$, and the expectation approximately becomes:
$$E[X_{i}]\approx \mu+\sigma\Phi^{-1}\left(\frac{i}{N+1}\right)+\frac{\left(\frac{i}{N+1}\right)\left(1-\frac{i}{N+1}\right)}{2(N+2)}\frac{\sigma\Phi^{-1}\left(\frac{i}{N+1}\right)}{\left[\phi\left[\Phi^{-1}\left(\frac{i}{N+1}\right)\right]\right]^{2}}$$
And finally:
$$E[X_{i}]\approx \mu+\sigma\Phi^{-1}\left(\frac{i}{N+1}\right)\left[1+\frac{\left(\frac{i}{N+1}\right)\left(1-\frac{i}{N+1}\right)}{2(N+2)\left[\phi\left[\Phi^{-1}\left(\frac{i}{N+1}\right)\right]\right]^{2}}\right]$$
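Putting this final formula to a quick test (my own sketch, with arbitrarily chosen $i=3$, $N=10$, $\mu=0$, $\sigma=1$, deliberately away from the tails), the second-order correction removes most of the bias left by the first-order term:

```python
import numpy as np
from scipy import stats

i, N = 3, 10        # illustrative values (not a tail case)
mu, sigma = 0.0, 1.0

p = i / (N + 1)
z = stats.norm.ppf(p)
phi = stats.norm.pdf(z)

approx1 = mu + sigma * z                                        # first order
approx2 = mu + sigma * z * (1 + p * (1 - p) / (2 * (N + 2) * phi**2))

# Monte Carlo estimate of E[X_(i)] for comparison.
rng = np.random.default_rng(0)
mc = np.sort(rng.normal(mu, sigma, (200_000, N)), axis=1)[:, i - 1].mean()

print(approx1, approx2, mc)
```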
Although, as @whuber has noted, this will not be accurate in the tails; in fact, I think it may be even worse there, because of the skewness of the beta distribution when its two parameters are very different.