
I'm trying to find the distribution of the power mean of $n$ independent uniform random variables on $[0,1]$.

I've got the density of the arithmetic mean ($p=1$): $\frac{n}{(n-1)!}\sum_{k=0}^{\lfloor nx\rfloor}(-1)^k\binom{n}{k}(nx-k)^{n-1}$

The geometric mean ($p\to 0$): $\frac{n}{(n-1)!}(-nx\log x)^{n-1}$

The minimum ($p\to-\infty$): $n(1 - x)^{n-1}$

The maximum ($p\to+\infty$): $nx^{n-1}$

From here they seem to get harder, though, and there isn't really any pattern (or anything close to one) yet. I'm wondering if the distribution of a general power mean has a name, and perhaps a closed formula?

Perhaps you can guide me to some literature, or do you know a closed form for, e.g., the distribution of the harmonic mean ($p=-1$) of uniform random variables?


I've plotted the $n=3$ case for different powers: graphs
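A minimal Monte Carlo sketch along these lines (Python with numpy/matplotlib; the helper names `power_mean_samples` and `arithmetic_pdf` are purely illustrative) can reproduce this kind of comparison, overlaying the closed forms above on simulated histograms:

```python
import numpy as np
import matplotlib.pyplot as plt
from math import comb, factorial

rng = np.random.default_rng(0)
n, trials = 3, 200_000

def power_mean_samples(p):
    """Monte Carlo draws of the power mean of n i.i.d. U(0,1) variables."""
    x = rng.uniform(size=(trials, n))
    if p == 0:                                  # geometric mean as the p -> 0 limit
        return np.exp(np.log(x).mean(axis=1))
    return np.mean(x**p, axis=1) ** (1.0 / p)

def arithmetic_pdf(x):
    """Density of the arithmetic mean (rescaled Irwin-Hall), as quoted above."""
    return n / factorial(n - 1) * sum(
        (-1)**k * comb(n, k) * (n * x - k)**(n - 1)
        for k in range(int(np.floor(n * x)) + 1))

xs = np.linspace(1e-3, 1 - 1e-3, 400)
for p in (-5, -1, 0, 1, 2, 5):
    plt.hist(power_mean_samples(p), bins=100, density=True,
             histtype='step', label=f"p = {p}")
plt.plot(xs, [arithmetic_pdf(x) for x in xs], 'k--', label="p = 1, closed form")
plt.plot(xs, n * (1 - xs)**(n - 1), 'k:', label="min (p -> -inf)")
plt.plot(xs, n * xs**(n - 1), 'k-.', label="max (p -> +inf)")
plt.legend()
plt.show()
```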

  • Your formula for the geometric mean is incorrect. It should have an extra factor $nx^{n-1}$, I think. There's an easy way to derive it using the fact that if $X_i$ is distributed uniformly on $[0,1]$ then $-\log X_i$ is distributed exponentially with parameter $\lambda=1$. Then use the Erlang distribution for sums of exponential random variables. Commented Apr 16, 2011 at 12:27
  • @Raskolnikov You are right; there was also a small typo in the arithmetic one. They are fixed now. Commented Apr 16, 2011 at 19:22

1 Answer


There are a couple of different ways you can reformulate your problem, which might help you find articles on the issue. If you have $n$ uniformly distributed variables $X_i\sim U([0,1])$, then you are looking for the probability distribution of their power mean:

$$\mathbb{P}\left[\left(\frac{1}{n}\sum_{i=1}^n X_i^p\right)^{\frac{1}{p}} \leq x\right]$$

This is the cumulative distribution function; the density can be found by taking its derivative with respect to $x$. The cumulative distribution can be rewritten as

$$\mathbb{P}\left[\sum_{i=1}^n X_i^p \leq n x^p\right]$$

provided $p>0$. But in that case, what we are looking at is the cumulative distribution of a sum of powers of i.i.d. random variables, in other words a (scaled) sample moment. I'd be surprised if this hasn't been tackled in some way. You could also try to tackle it yourself from here, for instance by trying to find the following moment generating function:

$$\mathbb{E}\left[e^{tX^p}\right] \; .$$

The nice thing about moment generating functions (or, if you want, you can also use characteristic functions) is that they are the Laplace transforms (resp. Fourier transforms) of the probability density functions. If you can invert the transform, you might find an explicit expression for the density.
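To sketch what that inversion could look like in practice, here is a rough numerical illustration (Python/numpy; the values $n=3$, $p=2$ and all names are arbitrary choices of mine, not anything from the question): approximate the characteristic function of $X^p$ by quadrature, raise it to the $n$-th power for the i.i.d. sum, and invert it with the Gil-Pelaez formula to recover the CDF of the power mean, with a Monte Carlo comparison as a sanity check.

```python
import numpy as np

n, p = 3, 2.0                                    # illustrative values only

# Midpoint rule for phi(t) = E[exp(i t X^p)], X ~ U(0,1)
m = 4_000
u = (np.arange(m) + 0.5) / m                     # midpoints of (0, 1)

# Midpoint grid in t for the inversion integral over (0, t_max)
t_max, k = 200.0, 2_000
dt = t_max / k
t = (np.arange(k) + 0.5) * dt

phi_single = np.array([np.mean(np.exp(1j * ti * u**p)) for ti in t])
phi_sum = phi_single**n                          # char. function of sum_i X_i^p

def cdf_power_mean(x):
    """Gil-Pelaez inversion of P[sum_i X_i^p <= n x^p], i.e. the CDF of the power mean (p > 0)."""
    s = n * x**p
    integrand = np.imag(np.exp(-1j * t * s) * phi_sum) / t
    return 0.5 - np.sum(integrand) * dt / np.pi

# Monte Carlo sanity check: the two columns should roughly agree
rng = np.random.default_rng(0)
samples = np.mean(rng.uniform(size=(200_000, n))**p, axis=1) ** (1 / p)
for x in (0.3, 0.5, 0.7):
    print(x, round(cdf_power_mean(x), 4), round(np.mean(samples <= x), 4))
```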

Going a step further, note that if $Y\sim U([0,1])$ then

$$\mathbb{P}(Y^p \leq y) = \mathbb{P}(Y \leq \sqrt[p]{y}) = \sqrt[p]{y}$$

or in other words, $Y^p$ has density

$$f_{Y^p}(y) = \frac{1}{p} y^{\frac{1-p}{p}} \text{ for } y \in (0,1]$$

Then, your problem is one of looking for the distribution of a sum of i.i.d. "generalized Pareto random variables". I'm kinda borrowing the term from here, although it doesn't fit in the same parameter range as the wiki example.
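As a quick sanity check of that last density (a minimal Python/numpy sketch; the exponent $p=2$ is an arbitrary choice), one can compare a histogram of simulated $Y^p$ against $\frac{1}{p} y^{\frac{1-p}{p}}$:

```python
import numpy as np

p = 2.0                                          # arbitrary example exponent, p > 0
rng = np.random.default_rng(0)
y = rng.uniform(size=1_000_000) ** p             # samples of Y^p with Y ~ U(0,1)

edges = np.linspace(0.0, 1.0, 21)                # 20 equal-width bins on [0, 1]
counts, _ = np.histogram(y, bins=edges)
dens = counts / (y.size * np.diff(edges))        # empirical density estimate
centers = 0.5 * (edges[:-1] + edges[1:])

# Skip the first bin, where the singularity at y = 0 makes a bin average misleading;
# elsewhere the difference should be small (binning + Monte Carlo error).
print(np.max(np.abs(dens[1:] - centers[1:]**((1 - p) / p) / p)))
```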

  • You are probably right that the rest of them need to be approximated from Fourier series. Such a shame. Commented Apr 26, 2011 at 16:30

