I am puzzled by a rather simple fact:
The pairwise correlation of a symmetric multivariate pdf seems to be bounded from below, with a bound that tightens as the number of dimensions grows. That seems unlikely to me.
But I can't find any mistake in my reasoning:
Let $f$ be a symmetric joint pdf for the real-valued random variables $X_1, \dots, X_n$: $$ \begin{align*} \forall i,j \in \{1,\dots,n\}:& \\ &f(\dots x_{i-1}, x_i, x_{i+1}, \dots x_{j-1}, x_j, x_{j+1}, \dots) = \\ &f(\dots x_{i-1}, x_j, x_{i+1}, \dots x_{j-1}, x_i, x_{j+1}, \dots) \end{align*} $$ Since $f$ is symmetric, all the marginal distributions are equal $$ \begin{align*} f_{X_1}(x) &= \int f(x, x_2, \dots) \,\text{d} x_{2} \dots \text{d} x_{n} = \\ &= \int f(x_2, x, \dots) \,\text{d} x_{2} \dots \text{d} x_{n} = f_{X_2}(x) \end{align*} $$ and thus all of the following expectations, variances and covariances are identical: $$ \begin{align*} \forall i,j \in \{1,\dots,n\}, i\ne j:& \\ \epsilon &:= \text{Exp}\left[X_i\right] \\ \nu &:= \text{Var}\left[X_i\right] \\ c &:= \text{Cov}\left[X_i, X_j\right] \\ \rho &:= \rho\left[X_i, X_j\right] = \frac{c}{\nu}\\ \end{align*} $$
The fun begins with $$ \begin{align*} \text{Var}\left[\sum_{i=1}^{n} X_i \right] &= \sum_i \text{Var}\left[X_i\right] + \sum_{i\ne j}\text{Cov}\left[X_i, X_j\right] = \\ &= n \nu + n(n-1)c = n\nu \left(1 + (n-1)\frac{c}{\nu}\right)= n\nu (1 + (n-1)\rho) \end{align*} $$ Since the variance is non-negative I get $$ \begin{align*} 0\le& \ 1 + (n-1)\rho \\ \Leftrightarrow \rho \ge& -\frac{1}{n-1} \end{align*} $$
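To convince myself numerically, here is a quick sketch (my own illustration, not part of the derivation): centering $n$ i.i.d. variables at their sample mean, $X_i = Z_i - \bar Z$, gives an exchangeable distribution whose components sum to zero, so it should attain the bound $\rho = -\frac{1}{n-1}$ exactly.

```python
import numpy as np

# Exchangeable variables attaining the bound rho = -1/(n-1):
# X_i = Z_i - Zbar for i.i.d. Z_i. The X_i are symmetric under
# permutation and sum to zero in every sample.
rng = np.random.default_rng(0)
n, samples = 5, 200_000

Z = rng.standard_normal((samples, n))
X = Z - Z.mean(axis=1, keepdims=True)   # rows sum to zero exactly

C = np.corrcoef(X, rowvar=False)        # n x n sample correlation matrix
off_diag = C[~np.eye(n, dtype=bool)]

print(off_diag.mean())                  # should be near -1/(n-1) = -0.25
print(-1 / (n - 1))
```

The sample correlations cluster tightly around $-1/(n-1)$, so the bound is sharp, not just an artifact of the derivation.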
This can't be true?!
Edit
If so, wouldn't it mean that the more dimensions I have, the less freedom there is to find symmetric pdfs that are anti-correlated? What's the intuition behind this?
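One way I tried to build intuition (again my own sketch): the exchangeable correlation matrix, with $1$ on the diagonal and $\rho$ everywhere else, has eigenvalues $1+(n-1)\rho$ (once) and $1-\rho$ ($n-1$ times). As soon as $\rho < -\frac{1}{n-1}$ the first eigenvalue goes negative, so the matrix is no longer positive semidefinite and no distribution can have it. This is easy to check:

```python
import numpy as np

def min_eigenvalue(n, rho):
    """Smallest eigenvalue of the n x n exchangeable correlation matrix
    (1 on the diagonal, rho off it). Analytically this is
    min(1 + (n-1)*rho, 1 - rho)."""
    C = np.full((n, n), rho)
    np.fill_diagonal(C, 1.0)
    return np.linalg.eigvalsh(C).min()

n = 10
for rho in (-0.10, -1 / (n - 1), -0.12):
    # above the bound -1/9: positive; at the bound: zero; below it: negative
    print(rho, min_eigenvalue(n, rho))
```

So geometrically, each additional dimension adds one more pairwise constraint that all has to be satisfied by a single direction of anti-correlation, squeezing the feasible $\rho$ toward zero.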