Let $X$ be a uniform distribution of inputs used for sampling, and let $f(x)$ be an expensive function. If we draw samples from $X$ and evaluate $f$ on them, we get outputs $y_1, y_2, \ldots, y_n$ from some unknown distribution $Y$ with finite support. I wish to compute a bound on the minimum of $Y$ with, say, 95% confidence. Alternatively, a bound on something like the 5th percentile with 95% confidence.
The two methods I know of are Chebyshev's inequality and Wilks' method. For Chebyshev's inequality, I don't know which variant to use. The version shown on Wikipedia appears to be the closest to what I want, but it is two-tailed. Wilks' method seems more promising, but it appears to be very conservative.
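For context, my understanding of the first-order Wilks bound is that the sample minimum $y_{(1)}$ is a lower confidence bound on the $p$-th quantile with confidence $1 - (1-p)^n$, since all $n$ samples exceed $q_p$ with probability $(1-p)^n$. Solving for $n$ gives the required sample size directly (the function name here is my own):

```python
import math

def wilks_sample_size(p, conf):
    """Smallest n such that the sample minimum is a lower confidence
    bound on the p-th quantile with the given confidence level,
    i.e. the smallest n with 1 - (1 - p)**n >= conf."""
    return math.ceil(math.log(1 - conf) / math.log(1 - p))

# 5th percentile at 95% confidence: the classic n = 59 rule.
print(wilks_sample_size(0.05, 0.95))  # 59
```

So for the 5th percentile at 95% confidence I would need 59 samples and would have to report the smallest of them, which is what strikes me as conservative.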
What method would you recommend?