5
$\begingroup$

Consider an equilateral triangle centered at the origin of the 2D Cartesian space. Let the coordinates of its vertices be $v_1=(x_1,y_1)$, $v_2=(x_2,y_2)$ and $v_3=(x_3,y_3)$. All such triangles can be defined by the following conditions:

\begin{gather} ||v_1 - v_2|| = ||v_1 - v_3|| = ||v_2 - v_3|| & \label{eq_dist}\tag{1} \\ v_1 + v_2 + v_3 = (0, 0) & \label{sum_0}\tag{2} \\ \end{gather} Using these conditions it can be shown that $||v_1|| = ||v_2|| = ||v_3||$ (see proof here). Let $$r = ||v_1|| = ||v_2|| = ||v_3|| \label{eq_len}\tag{3}$$

My question is whether there is a simple algebraic explanation for the fact that the expression $$d_x \overset{\text{def}}{=} x_1 x_2 + x_1 x_3 + x_2 x_3$$ does not depend on the angle of rotation of the triangle about the origin. I made an interactive Desmos link where you can observe this behavior. Note that, in that link, I used $s$ as the radius instead of $r$, since $r$ is a special variable in Desmos.
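As a quick numeric sanity check (a minimal Python sketch, not part of the original question; it assumes the vertices are parametrized on the circumscribed circle of radius $r$, as done later in the post):

```python
import math

def dx(theta, r=1.0):
    # x-coordinates of an equilateral triangle of circumradius r, rotated by theta
    x1, x2, x3 = (r * math.cos(theta + k * 2 * math.pi / 3) for k in range(3))
    return x1 * x2 + x1 * x3 + x2 * x3

# d_x stays at -3/4 r^2 for every rotation angle
for theta in (0.0, 0.3, 1.7, 2.9):
    assert abs(dx(theta) - (-0.75)) < 1e-12
assert abs(dx(1.3, r=2.0) - (-3.0)) < 1e-12
```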

I will proceed by listing and then detailing some of my findings:

  1. $v_1 \cdot v_2 + v_1 \cdot v_3 + v_2 \cdot v_3$ is also rotation-independent
  2. I found a trigonometric proof for my question
  3. This behavior also happens with a tetrahedron in $\mathbb{R}^3$ (see this Desmos link)

1) $v_1 \cdot v_2 + v_1 \cdot v_3 + v_2 \cdot v_3$ is also rotation-independent

This expression, although different from $d_x$, is closely related to it, since each individual dot product can be expressed as: $$v_k \cdot v_j = x_k x_j + y_k y_j$$

Which implies that $$\color{blue}{v_1 \cdot v_2} + \color{red}{v_1 \cdot v_3} + \color{green}{v_2 \cdot v_3} = (\color{blue}{x_1 x_2} + \color{red}{x_1 x_3} + \color{green}{x_2 x_3}) + (\color{blue}{y_1 y_2} + \color{red}{y_1 y_3} + \color{green}{y_2 y_3})$$

It is therefore handy to define \begin{align*} d_y & \overset{\text{def}}{=} y_1 y_2 + y_1 y_3 + y_2 y_3 \\ d & \overset{\text{def}}{=} v_1 \cdot v_2 + v_1 \cdot v_3 + v_2 \cdot v_3 \end{align*}

Which allows us to write $$d = d_x + d_y$$

This means that if $d$ and $d_y$ are rotation-independent, so is $d_x$. We can show that $d$ is rotation-independent by noticing that each of the three dot products is itself rotation-independent: a dot product depends only on the two vectors' norms and the angle between them, and none of those change when the whole triangle is rotated about the origin. Nevertheless, I present a simple algebraic proof of this fact:

\begin{align*} ||v_1||^2 & = r^2 & \text{by (\ref{eq_len})} \\ ||- v_2 - v_3||^2 & = r^2 & \text{by (\ref{sum_0})} \\ (- x_2 - x_3)^2 + (- y_2 - y_3)^2 & = r^2 & \text{expand} \\ \color{blue}{x_2^2} + 2 x_2 x_3 + \color{red}{x_3^2} + \color{blue}{y_2^2} + 2 y_2 y_3 + \color{red}{y_3^2} & = r^2 & \text{expand} \\ \color{blue}{r^2} + \color{red}{r^2} + 2 x_2 x_3 + 2 y_2 y_3 & = r^2 & \text{by (\ref{eq_len})} \\ x_2 x_3 + y_2 y_3 & = - \frac{r^2}{2} & \text{simplify} \\ v_2 \cdot v_3 & = - \frac{r^2}{2} & \text{by definition of $\cdot$} \\ \end{align*}

By symmetry, this proof also applies to $v_1 \cdot v_2$ and $v_1 \cdot v_3$, and thus

\begin{gather} v_1 \cdot v_2 = v_1 \cdot v_3 = v_2 \cdot v_3 = - \frac{r^2}{2} \\ d = v_1 \cdot v_2 + v_1 \cdot v_3 + v_2 \cdot v_3 = - \frac{3}{2} r^2 \end{gather}
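Both equalities are easy to confirm numerically (a small Python sketch using the same circle parametrization; not part of the original argument):

```python
import math

def vertices(theta, r):
    # equilateral triangle of circumradius r, rotated by theta about the origin
    return [(r * math.cos(theta + k * 2 * math.pi / 3),
             r * math.sin(theta + k * 2 * math.pi / 3)) for k in range(3)]

def dot(u, w):
    return u[0] * w[0] + u[1] * w[1]

r, theta = 2.0, 0.8
v1, v2, v3 = vertices(theta, r)
for u, w in ((v1, v2), (v1, v3), (v2, v3)):
    assert abs(dot(u, w) - (-r**2 / 2)) < 1e-12   # each dot product is -r^2/2
d = dot(v1, v2) + dot(v1, v3) + dot(v2, v3)
assert abs(d - (-1.5 * r**2)) < 1e-12             # d = -(3/2) r^2
```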

Although this is an interesting result, it is not enough, since we would also need to show that $d_y$ is rotation-independent. It could be the case that $d_x$ and $d_y$ both depended on the triangle's rotation but their sum did not - just like the cosine and sine functions depend on their argument while the sum of their squares does not. If $d_x$ and $d_y$ are indeed rotation-independent (which I know is true, but am trying to find a simple algebraic proof of), then, since conditions (\ref{eq_dist}) and (\ref{sum_0}) are symmetric in $x$ and $y$, we would have $$d_x = d_y = \frac{d}{2} = - \frac{3}{4} r^2 \label{dx_dy}\tag{4}$$

However, I was unable to prove this fact algebraically.

2) I found a trigonometric proof for my question

This proof is based on two observations:
  1. $v_1$, $v_2$ and $v_3$ lie on a circle centered at $(0, 0)$ (by \ref{sum_0}) and with radius $r$ (by \ref{eq_len})
  2. The angle between each pair of vertices (seen as vectors) is $\frac{1}{3}$ of a full turn, since the triangle is equilateral (by \ref{eq_dist})
These allow us to write

\begin{align*} v_1 = (x_1, y_1) & = (r \cos{\theta}, r \sin{\theta}) & \label{trig1}\tag{5.1} \\ v_2 = (x_2, y_2) & = (r \cos{(\theta + \frac{2 \pi}{3})}, r \sin{(\theta + \frac{2 \pi} {3})}) & \label{trig2}\tag{5.2} \\ v_3 = (x_3, y_3) & = (r \cos{(\theta - \frac{2 \pi}{3})}, r \sin{(\theta - \frac{2 \pi}{3})}) & \label{trig3}\tag{5.3} \end{align*}

where $\theta$ is an arbitrary real number. To simplify a computation below, we will now compute \begin{align} \cos(\alpha + \beta) \cos(\alpha - \beta) & = (\cos \alpha \cos \beta - \sin \alpha \sin \beta) (\cos \alpha \cos \beta + \sin \alpha \sin \beta) \\ & = \cos^2 \alpha \cos^2 \beta - \sin^2 \alpha \sin^2 \beta & \label{cos_prod}\tag{6} \end{align}
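The product formula (6) is a plain trigonometric identity, so it can be spot-checked at random arguments (a throwaway Python sketch):

```python
import math
import random

# spot-check cos(a+b)cos(a-b) = cos^2(a)cos^2(b) - sin^2(a)sin^2(b)
for _ in range(1000):
    a, b = random.uniform(-10, 10), random.uniform(-10, 10)
    lhs = math.cos(a + b) * math.cos(a - b)
    rhs = math.cos(a)**2 * math.cos(b)**2 - math.sin(a)**2 * math.sin(b)**2
    assert abs(lhs - rhs) < 1e-12
```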

So, we have

\begin{align*} d_x & = x_1 x_2 + x_1 x_3 + x_2 x_3 & \text{by definition} \\ & = x_1(x_2 + x_3) + x_2 x_3 \\ & = x_1(- x_1) + x_2 x_3 & \text{by (\ref{sum_0})} \\ & = x_2 x_3 - x_1^2 \\ & = r^2 (\cos(\theta + \frac{2 \pi}{3}) \cos(\theta - \frac{2 \pi}{3}) - \cos^2 \theta) & \text{by (\ref{trig1}), (\ref{trig2}) and (\ref{trig3})} \\ & = r^2 (\cos^2 \theta \cos^2 \frac{2 \pi}{3} - \sin^2 \theta \sin^2 \frac{2 \pi}{3} - \cos^2 \theta) & \text{by (\ref{cos_prod})} \\ & = r^2 ((\cos^2 \theta) (- \frac{1}{2})^2 - (\sin^2 \theta) (\frac{\sqrt{3}}{2})^2 - \cos^2 \theta) \\ & = r^2 (\frac{1}{4} \cos^2 \theta - \frac{3}{4} \sin^2 \theta - \cos^2 \theta) \\ & = r^2 (- \frac{3}{4} (\cos^2 \theta + \sin^2 \theta)) \\ & = - \frac{3}{4} r^2 \end{align*}

This allows us to conclude that the equality (\ref{dx_dy}) is indeed true. However, we had to resort to trigonometry, whilst I was looking for an algebraic proof. Moreover, this trigonometric proof does not give me a good understanding of how conditions (\ref{eq_dist}) and (\ref{sum_0}) imply that $d_x$ only depends on $r$ and not on $\theta$. It seems that it could have been just a matter of luck that $x_1^2 = r^2 \cos^2 \theta$ and $x_2 x_3 = r^2 (\cos^2 \theta - \frac{3}{4})$. But looking at higher dimensions, the pattern seems to repeat, hinting that there must be something more intricate going on here.

3) This behavior also happens with a tetrahedron in $\mathbb{R}^3$

By considering a tetrahedron centered at the origin of the 3D Cartesian space, defining conditions analogous to (\ref{eq_dist}) and (\ref{sum_0}), and performing a trigonometric change of variables akin to (\ref{trig1}), (\ref{trig2}) and (\ref{trig3}), I was able to show that an expression analogous to $d_x$ (i.e., $\sum_{k < j}{x_k x_j}$) does not depend on the rotation of the tetrahedron about the origin. You can observe this behavior by playing around with this Desmos link.
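The 3D claim can also be checked numerically. The sketch below (mine, not from the Desmos link) uses the standard regular tetrahedron inscribed in the unit sphere and applies arbitrary rotations built from axis rotations:

```python
import numpy as np

# regular tetrahedron inscribed in the unit sphere, centered at the origin
V = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)

def sum_xixj(P):
    x = P[:, 0]                                # first coordinates of the vertices
    return (x.sum()**2 - (x**2).sum()) / 2     # sum_{k<j} x_k x_j

def rot(a, b, c):
    # composition of rotations about the z-, y- and x-axes
    Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(c), -np.sin(c)], [0, np.sin(c), np.cos(c)]])
    return Rz @ Ry @ Rx

base = sum_xixj(V)                             # equals -2/3 for this tetrahedron
for angles in [(0.1, 0.5, 1.2), (2.0, -0.7, 0.3)]:
    assert abs(sum_xixj(V @ rot(*angles).T) - base) < 1e-12
```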



That is it. Any insights on this problem would be appreciated, even if they don't directly answer my question. Thank you in advance.

$\endgroup$

7 Answers

3
$\begingroup$

If we write $v_{1} = r(\cos{\theta},\sin{\theta})$, then, since the triangle is equilateral, we must have $v_{2} = r(\cos{(\theta+\frac{2\pi}{3})},\sin{(\theta+\frac{2\pi}{3})})$ and $v_{3} = r(\cos{(\theta+\frac{4\pi}{3})},\sin{(\theta+\frac{4\pi}{3})})$, so what we want is

\begin{align*} d_{x} &= r\cos{\theta}\cdot r\cos{\left(\theta+\frac{2\pi}{3}\right)} + r\cos{\theta}\cdot r\cos{\left(\theta+\frac{4\pi}{3}\right)} + r\cos{\left(\theta+\frac{2\pi}{3}\right)}\cdot r\cos{\left(\theta+\frac{4\pi}{3}\right)} \\ &= r^2\left(\cos{\theta}\left(-\frac{\cos{\theta}}{2}-\frac{\sqrt{3}\sin{\theta}}{2}\right)+\cos{\theta}\left(-\frac{\cos{\theta}}{2}+\frac{\sqrt{3}\sin{\theta}}{2}\right)+\left(-\frac{\cos{\theta}}{2}-\frac{\sqrt{3}\sin{\theta}}{2}\right)\left(-\frac{\cos{\theta}}{2}+\frac{\sqrt{3}\sin{\theta}}{2}\right)\right) \\ &= r^2\left(-\cos^2{\theta}+\frac{\cos^2{\theta}}{4}-\frac{3\sin^2{\theta}}{4}\right) = -\frac{3r^2}{4} \end{align*}

$\endgroup$
1
  • $\begingroup$ Thank you, but that is the trigonometric solution I had already found, as stated in my post :) Nevertheless, your answer can still be useful for someone who is just looking for any kind of solution. $\endgroup$ Commented Apr 30 at 19:13
2
$\begingroup$

Here is a proof using an interpretation in (n+1)D space of the original question in nD space.

We will show it with $n=2$: in this way, a graphical representation helps to understand the method, but it generalizes to any value of $n$.

[Figure: the plane $(P)$ (blue), the circle $(C_r)$ (red) and the hyperboloid $(H_k)$ (green circles).]

Let us consider the generic point

$$M_t=\pmatrix{x\\y\\z}=\pmatrix{r \cos(t)\\ r \cos(t+\tfrac{2 \pi}{3})\\ r \cos(t+\tfrac{4 \pi}{3})}$$

Using the addition formulas:

$$M_t=r \cos(t)\underbrace{\pmatrix{\ \ \ 1\\ - \tfrac12\\ -\tfrac12}}_U+r \sin(t) \underbrace{\pmatrix{0\\ -\tfrac{\sqrt{3}}{2}\\ \ \ \ \tfrac{\sqrt{3}}{2}}}_V\tag{1}$$

Please note that $U$ and $V$

  • both belong to the plane $(P)$ (featured in blue) with equation $x+y+z=0$

  • constitute an orthogonal basis of this plane with $\|U\|=\|V\|=\tfrac{\sqrt{6}}{2}$.

Decomposition (1) means that the point $M_t$:

  • belongs to plane $(P)$.

  • describes a circle $(C_r)$ with radius $r' = \tfrac{\sqrt{6}}{2}r$ in this plane (featured in red in the figure).
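Both bullet points are easy to confirm numerically (a minimal Python sketch with $r=1$; not part of the original answer):

```python
import math

def M(t, r=1.0):
    # the generic point M_t from the decomposition above
    return (r * math.cos(t),
            r * math.cos(t + 2 * math.pi / 3),
            r * math.cos(t + 4 * math.pi / 3))

for t in (0.0, 0.4, 1.9, 3.3):
    x, y, z = M(t)
    assert abs(x + y + z) < 1e-12                    # M_t lies in the plane (P)
    assert abs(x*x + y*y + z*z - 1.5) < 1e-12        # |OM_t|^2 = (sqrt(6)/2)^2 = 3/2
    assert abs(x*y + y*z + z*x - (-0.75)) < 1e-12    # xy+yz+zx is constant on (C_r)
```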

Now, why is the expression $xy+yz+zx$ naturally involved in this issue?

For any $k<0$,

$$xy+yz+zx=k\tag{3}$$

is the equation of a (classical) quadric surface $(H_k)$ called a "hyperboloid of one sheet", materialized as a set of green circles on the figure. It is a surface of revolution around the axis directed by the unit vector $W:=\tfrac{1}{\sqrt{3}}(1,1,1)$ (orthogonal to the plane $(P)$; see explanations in the "Edit" below), with:

$$(H_k) \ \cap \ (P) \ = \ (C_r)\tag{2}$$

for a certain $k$.

Relationship (2) shows that the circle $(C_r)$ is completely determined by the two constraints:

$$\begin{cases}x+y+z&=&0\\xy+yz+zx&=&k\end{cases}$$

Edit: Why is equation (3) the equation of a surface of revolution? Let $(A)$ be the axis ($x=y=z$) directed by the unit vector $W$.

Consider a point $M(x,y,z)$ of this surface. Let $M_A$ be the projection of $M$ onto the axis $(A)$. As $W$ is a unit-norm vector, the distance $OM_A$ is obtained by computing the absolute value of the dot product:

$$OM_A=|\vec{OM}\cdot\vec{W}|=\left|\frac{x+y+z}{\sqrt{3}}\right|$$

Consider the Pythagorean identity:

$$MM_A^2=OM^2-OM_A^2=(x^2+y^2+z^2)-\left(\frac{x+y+z}{\sqrt{3}}\right)^2\tag{4}$$

Besides, we have the identity:

$$x^2+y^2+z^2=-2(xy+yz+zx)+3\left(\frac{x+y+z}{\sqrt{3}}\right)^2\tag{5}$$

Relationship (4) can be written:

$$MM_A^2=-2(xy+yz+zx)+2\left(\frac{x+y+z}{\sqrt{3}}\right)^2\tag{6}$$

Let $(P_K)$ be the plane with equation

$$x+y+z=\sqrt{3}K$$

which is orthogonal to axis $(A)$, at a distance $K$ from the origin.

The set of points $(H_k) \cap (P_K)$ satisfies, taking into account (6):

$$MM_A^2=-2k+2K^2 \quad \text{(a constant)}$$

The squared distance from $M$ to the axis $(A)$ is constant, proving that the point $M$ belongs to a circle centered on the axis $(A)$ (explaining why we have materialized the hyperboloid with circles). This establishes the fact that $(H_k)$ is a surface of revolution.
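Identities (5) and (6) are polynomial identities, so they can be spot-checked at random points (a throwaway Python sketch, not part of the original answer):

```python
import random

for _ in range(1000):
    x, y, z = (random.uniform(-5, 5) for _ in range(3))
    s2 = ((x + y + z) / 3**0.5)**2                 # OM_A^2
    q = x*y + y*z + z*x
    n2 = x*x + y*y + z*z                           # OM^2
    assert abs(n2 - (-2*q + 3*s2)) < 1e-9          # identity (5)
    assert abs((n2 - s2) - (-2*q + 2*s2)) < 1e-9   # identity (6): MM_A^2
```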

(end of Edit)

$\endgroup$
7
  • $\begingroup$ Thank you for your answer. I think I understood it, but can you just clarify something to me please? In my original problem, we had 2 conditions: three equidistant 2D points summing to $(0, 0)$. If I understood your $(n+1)$D analogy correctly, my original $x_1$, $x_2$ and $x_3$ values were translated into your $M_t = (x, y, z)$ 3D points. I understand that the coordinates of each $M_t$ must sum to $0$, but how is the "equal distances" condition encoded? And where are $y_1$, $y_2$ and $y_3$? I think it has to do with the way the coordinates of $M_t$ were chosen, but I'm not sure. $\endgroup$ Commented May 10 at 21:57
  • 1
    $\begingroup$ 1) The equal distances condition is encoded into the expression of the vector $(r \cos t, r \cos(t+2 \pi/3), r \cos(t+4 \pi/3))$, which is the projection on the $x$ axis of the relationship $re^{it}+ r e^{i(t+2 \pi/3)} + r e^{i(t+4 \pi/3)}=0$ expressing an equilateral triangle. 2) I don't need coordinates $(y_1,y_2,y_3)$, but if one needs them they also fulfill $(y_1,y_2,y_3)=(r \cos t', r \cos(t'+2 \pi/3), r \cos(t'+4 \pi/3))$ with $t' \neq t$ in general. $\endgroup$
    – Jean Marie
    Commented May 11 at 6:20
  • $\begingroup$ Thank you for the clarification. The only part I still didn't completely understand was the final part with the hyperboloid with one sheet, as it is not a familiar surface for me. You claimed that there must be a $k < 0$ such that $(H_k) \cap (P) = (C_r)$ but it is not clear to me why this must be the case. How to find such $k$ given $r$? And what makes $xy+yz+zx=k$ be a surface of revolution around $x=y=z$? I feel like there are some assumptions in this proof that may seem obvious to a more experienced mathematician, but are not to someone less experienced like myself. $\endgroup$ Commented May 11 at 15:14
  • $\begingroup$ Your question is a good one. I will attempt to give details in an edit to my answer in some hours. $\endgroup$
    – Jean Marie
    Commented May 11 at 15:47
  • 1
    $\begingroup$ In fact, I haven't replaced $MM_A^2$ with $x^2+y^2+z^2$ but the explanation around wasn't clear : I have changed it. $\endgroup$
    – Jean Marie
    Commented May 12 at 15:50
1
$\begingroup$

Eventually, I found the solution I was looking for :)

Since $x_1 + x_2 + x_3 = 0$, we can write \begin{align*} d_x & = x_1 x_2 + x_1 (- x_1 - x_2) + x_2 (- x_1 - x_2) \\ & = x_1 x_2 - x_1^2 - x_2^2 - 2 x_1 x_2 \\ & = - (x_1^2 + x_1 x_2 + x_2^2) & \label{dx}\tag{7} \end{align*}

Also, as I proved in section 1) that all 3 pairwise dot products were equal to $- \frac{r^2}{2}$, we have \begin{align*} x_1 x_2 + y_1 y_2 & = - \frac{r^2}{2} \\ x_1 x_2 + \frac{r^2}{2} & = - y_1 y_2 & \text{rearrange} \\ x_1^2 x_2^2 + r^2 x_1 x_2 + \frac{r^4}{4} & = y_1^2 y_2^2 & \text{square both sides} \\ x_1^2 x_2^2 + r^2 x_1 x_2 + \frac{r^4}{4} & = (r^2 - x_1^2) (r^2 - x_2^2) & \text{by (3)} \\ \color{red}{x_1^2 x_2^2} + r^2 x_1 x_2 + \frac{r^4}{4} & = r^4 - (x_1^2 + x_2^2) r^2 + \color{red}{x_1^2 x_2^2} & \text{expand RHS} \\ x_1 x_2 + \frac{r^2}{4} & = r^2 - (x_1^2 + x_2^2) & \text{divide by } r^2 \\ - \frac{3}{4} r^2 & = - (x_1^2 + x_1 x_2 + x_2^2) & \text{rearrange} \\ d_x & = - \frac{3}{4} r^2 & \text{by (\ref{dx})} \end{align*}
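A quick numeric confirmation of the end result (a Python sketch; the circle parametrization is used only to generate valid coordinates, the check itself is relation (7)):

```python
import math

r, theta = 1.5, 0.7
x1 = r * math.cos(theta)
x2 = r * math.cos(theta + 2 * math.pi / 3)
# by (7), d_x = -(x1^2 + x1*x2 + x2^2), and it should equal -3 r^2 / 4
dx = -(x1**2 + x1 * x2 + x2**2)
assert abs(dx - (-0.75 * r**2)) < 1e-12
```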

I am still amazed that this $d_x$ expression does not depend on the triangle's rotation. If anyone can come up with a simpler explanation for this phenomenon (perhaps some geometric intuition?), and why it also works in 3D (and perhaps in higher dimensions), I would be thankful.

$\endgroup$
1
  • 2
    $\begingroup$ I added a solution for higher dimensions. $\endgroup$
    – Alex
    Commented May 6 at 5:42
1
$\begingroup$

For simplicity, we can take $r=1$. We can define the vertices of a regular $n$-gon (where $n\geq 3$) as follows:

$(x_k,y_k)=(\cos(\theta+k\frac{2\pi}{n}), \sin(\theta+k\frac{2\pi}{n})).$

Then $\sum_{k=1}^{n}\cos(\theta+k\frac{2\pi}{n})\cos(\theta+(k+1)\frac{2\pi}{n})=\frac{1}{2}\sum_{k=1}^{n}(\cos(\frac{2\pi}{n})+\cos(2\theta+\frac{2\pi}{n}+k\frac{4\pi}{n}))$.

Note that $\sum_{k=1}^{n}\cos(2\theta+\frac{2\pi}{n}+k\frac{4\pi}{n})=\sum_{k=1}^{n}\cos(z_{\theta}+k\frac{4\pi}{n})=\cos(z_{\theta})\sum_{k=1}^{n}\cos(k\frac{4\pi}{n})-\sin(z_{\theta})\sum_{k=1}^{n}\sin(k\frac{4\pi}{n})=0-0=0$ for all $\theta\in[0,2\pi]$ and $n\geq 3$, because both sums have equally spaced arguments, i.e. they are the real and imaginary parts of $\sum_{k=1}^ne^{\frac{i4k\pi}{n}}=e^{\frac{i4\pi}{n}}\frac{1-e^{i4\pi}}{1-e^{\frac{i4\pi}{n}}}=0$.

As a conclusion, $d_x=\frac{n\cos(\frac{2\pi}{n})}{2},$ which is a constant.

We also have $d_y=\frac{n\cos(\frac{2\pi}{n})}{2}$ because $\sin(\theta+k\frac{2\pi}{n})=\cos(\theta-\frac{\pi}{2}+k\frac{2\pi}{n})=\cos(\phi+k\frac{2\pi}{n}),\forall k.$
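The formula $d_x = \frac{n\cos(2\pi/n)}{2}$ (a sum over consecutive pairs, as in the answer) checks out numerically for several $n$ (a minimal Python sketch):

```python
import math

def adj_sum(theta, n):
    # sum of x_k * x_{k+1} over consecutive vertices of a regular n-gon (r = 1)
    x = [math.cos(theta + k * 2 * math.pi / n) for k in range(n)]
    return sum(x[k] * x[(k + 1) % n] for k in range(n))

for n in (3, 4, 5, 8):
    expected = n * math.cos(2 * math.pi / n) / 2
    for theta in (0.0, 0.9, 2.2):
        assert abs(adj_sum(theta, n) - expected) < 1e-12
```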

$\endgroup$
4
  • $\begingroup$ Thank you for your input. To make it clear, this does not directly answer my question, but it is interesting to know that it is valid for regular $n$-gons. $\endgroup$ Commented May 7 at 23:23
  • 1
    $\begingroup$ It does give an intuitive answer why $d_x$ vanishes to a constant. This is because your polygon is regular (equivalent spaced), and the dot product $d_x$ can be reduced into a summing of some equivalent spaced vertices, with a constant. What other insights are you looking for, specifically? $\endgroup$
    – Minh Khôi
    Commented May 8 at 2:21
  • $\begingroup$ Your answer is simple and @Alex's is algebraic. I stated in my question that I was looking for a simple algebraic explanation (without resorting to trigonometry), but maybe there isn't one. :P But OK, I upvoted your answer due to its simplicity. I was also trying to find an intuitive reason why it also works in higher dimensions. $\endgroup$ Commented May 8 at 23:37
  • $\begingroup$ If you want to know whether the same reason holds for tetrahedron in higher dimension, you could define the tetrahedron using an $n$-dimensional sphere parametric curve to try. $\endgroup$
    – Minh Khôi
    Commented May 9 at 1:41
1
$\begingroup$

The rotation matrix by angle $2\pi/3$ is given by

$$\begin{pmatrix} -\frac{1}{2}&\frac{\sqrt{3}}{2}\\ -\frac{\sqrt{3}}{2}&-\frac{1}{2} \end{pmatrix}$$ Hence, it follows that

$$x_2=-\frac{x_1}{2}+\frac{\sqrt{3}y_1}{2}$$ $$x_3=-\frac{x_1}{2}-\frac{\sqrt{3}y_1}{2}$$

So

$$\implies x_1x_3+x_2x_3+x_1x_2= x_1\left(-\frac{x_1}{2}-\frac{\sqrt{3}y_1}{2}\right)+\left(-\frac{x_1}{2}+\frac{\sqrt{3}y_1}{2}\right)\left(-\frac{x_1}{2}-\frac{\sqrt{3}y_1}{2}\right)+x_1\left(-\frac{x_1}{2}+\frac{\sqrt{3}y_1}{2}\right)$$

$$\require{cancel}=-\frac{x_1^2}{2}-\cancel{\frac{\sqrt{3}x_1y_1}{2}}+\frac{x_1^2}{4}-\frac{3y_1^2}{4}-\frac{x_1^2}{2}+\cancel{\frac{\sqrt{3}x_1y_1}{2}}$$ $$=-\frac{3}{4}(x_1^2+y_1^2)$$

which is therefore invariant under rotations.
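This computation is short enough to verify numerically for an arbitrary starting vertex (a minimal Python sketch, not part of the original answer):

```python
import math

# pick any starting vertex v1 = (x1, y1); x2, x3 follow from the rotation formulas
x1, y1 = 0.6, -1.1
x2 = -x1 / 2 + math.sqrt(3) * y1 / 2
x3 = -x1 / 2 - math.sqrt(3) * y1 / 2
dx = x1 * x2 + x1 * x3 + x2 * x3
assert abs(dx - (-0.75 * (x1**2 + y1**2))) < 1e-12   # -3/4 (x1^2 + y1^2)
```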

$\endgroup$
1
  • $\begingroup$ Thank you, nice simple answer. :) I just won't mark it as the accepted one since it uses trigonometry implicitly in the construction of the rotation matrix. $\endgroup$ Commented May 10 at 22:04
1
$\begingroup$

There's a cute solution using some light complex analysis. Specifically, three points $z_1, z_2, z_3$ in the complex plane form the vertices of an equilateral triangle centered at the origin if and only if they are the solutions to the equation $$ z^3 = w $$ for some nonzero $w \in \mathbf{C}$.

Now, expanding $z = x + iy$, the above equation becomes \begin{align*} 0 &= (x + iy)^3 - w \\ &= x^3 + 3ix^2 y - 3xy^2 - iy^3 - w. \end{align*}

Taking the real part of both sides, we find that each $(x_1,y_1), (x_2,y_2), (x_3,y_3)$ must satisfy the polynomial equation $$ 0 = x^3 - 3xy^2 - \operatorname{Re}(w), $$ where $y^2 = r^2 - x^2$. That is, \begin{align*} 0 &= x^3 - 3x(r^2 - x^2) - \operatorname{Re}(w) \\ &= 4x^3 - 3r^2 x - \operatorname{Re}(w), \end{align*} or, after dividing by $4$, $$ 0 = x^3 - \frac{3r^2}{4}x - \frac{\operatorname{Re}(w)}{4}. $$

But also, each $x_1, x_2, x_3$ satisfy the equation \begin{align*} 0 &= (x - x_1)(x - x_2)(x - x_3) \\ &= x^3 + (x_1 x_2 + x_2x_3 + x_1x_3)x - x_1x_2x_3, \end{align*} where the coefficient of $x^2$ is zero since it is $-(x_1 + x_2 + x_3)$.

Since these polynomials share the same roots and monic coefficient, they are equal, so equating coefficients of $x$ yields that $$ x_1 x_2 + x_2x_3 + x_1x_3 = -\frac{3r^2}{4}. $$

PS. This idea can be extended to conclude that for any regular $k$-gon, every elementary symmetric polynomial of the $x$-coordinates of the vertices except the top one ($x_1 x_2 \cdots x_k$, which carries the $\operatorname{Re}(w)$ term) is constant with respect to rotation of the polygon.
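The root-based argument can be checked numerically by extracting the cube roots of $w$ directly (a Python sketch using `numpy.roots`; varying $\arg w$ rotates the triangle while $|w| = r^3$ fixes the circumradius):

```python
import numpy as np

r = 2.0
for phi in (0.3, 1.1, 2.7):
    w = r**3 * np.exp(3j * phi)            # |w| = r^3 fixes the circumradius at r
    xs = np.roots([1, 0, 0, -w]).real      # real parts of the three roots of z^3 = w
    e2 = xs[0]*xs[1] + xs[0]*xs[2] + xs[1]*xs[2]
    assert abs(e2 - (-0.75 * r**2)) < 1e-9
```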

$\endgroup$
1
  • $\begingroup$ Thank you, cute solution indeed! This also seems to show that $x_1 x_2 x_3 = \frac{\operatorname{Re}(z^3)}{4} = \frac{r^3 \cos(3\theta)}{4}$ (with $z=r e^{i\theta}$), which is the same expression I got when I was researching a formula for 3rd degree polynomial equations (with real coefficients). Do you think this could be extended to higher dimensions? I am doubtful, since I don't know of a number system analogous to the complex numbers in $\mathbb{R}^3$, for instance. P.S.: In the first paragraph, I think you mean "an equilateral triangle [centered at the origin]". $\endgroup$ Commented May 14 at 11:01
0
$\begingroup$

We consider a tetrahedron (a regular simplex) in $n$-dimensional space; it has $n+1$ vertices.

We can construct the first $n$ vertices of the tetrahedron as follows, where each $u_i\in \mathbb{R}^n$:

$u_1 = (1,0,0,0,...,0,0)^T$,

$u_2 = (0,1,0,...,0,0,0)^T$,

$\vdots$

$u_{n-1} = (0,0,0,...,0,1,0)^T$,

$u_{n} = (0,0,0,...,0,0,1)^T$,

The distance between any two of these vertices is

$||u_i-u_j|| = ||(0,\dots,0,1,0,\dots,0,-1,0,\dots,0)^T|| = \sqrt{1^2+(-1)^2} = \sqrt{2}$

To construct the final vertex we use the form $(a,a,\dots,a)$. The distance between $(a,a,\dots,a)$ and $u_i$ for $1\le i\le n$ is $d=\sqrt{(1-a)^2+(n-1)a^2}$.

Since this is a regular tetrahedron, all of the distances between vertices are the same, so

$\sqrt{2}=\sqrt{(1-a)^2+(n-1)a^2}$, i.e. $2=(1-a)^2+(n-1)a^2$,

and one solution of this quadratic equation in $a$ is:

$a=\frac{1+\sqrt{1+n}}{n}$

$u_{n+1} = (a,a,...,a)^T$

Now to center this tetrahedron, we can use:

$\bar{u} = \frac{\sum_{i=1}^{n+1}{u_i}}{n+1} = (c,...,c)$, where $c = \frac{a+1}{n+1} = \frac{\frac{1}{n}(1+\sqrt{1+n})+1}{n+1} = \frac{1+\sqrt{1+n}+n}{n(n+1)}$

So the centered tetrahedron is $v_1,v_2,...,v_{n+1}$, where

$v_1 = (1-c,-c,-c,...,-c,-c,-c)^T$,

$v_2 = (-c,1-c,-c,...,-c,-c,-c)^T$,

$\vdots$

$v_{n-1} = (-c,-c,-c,...,-c,1-c,-c)^T$,

$v_{n} = (-c,-c,-c,...,-c,-c,1-c)^T$,

and

$v_{n+1} = (a-c,a-c,...,a-c)$,

Lastly, we can normalize the tetrahedron's size, so that each vertex $v_i$ is at distance $1$ from the origin.

For $1\le i\le n$,

$||v_i||=\sqrt{(n-1)c^2+(1-c)^2}=\sqrt{nc^2-c^2+1-2c+c^2}=\sqrt{nc^2-2c+1}$

Using $c = \frac{1+\sqrt{1+n}+n}{n(n+1)}$

$||v_i||=\sqrt{\frac{n}{n+1}}$

or for $ i = n+1$,

$||v_{n+1}||=\sqrt{n(a-c)^2}$

$c = \frac{a+1}{n+1}$, then $a=c(n+1)-1$

$||v_{n+1}||=\sqrt{n(c(n+1)-1-c)^2}=\sqrt{n(cn-1)^2}$

Using $c = \frac{1+\sqrt{1+n}+n}{n(n+1)}$

$||v_{n+1}||=\sqrt{\frac{n}{n+1}}$

So the normalized coordinates $w_1,...,w_{n+1}$ are

for $1\le i\le n$,

$w_1=\sqrt{\frac{n+1}{n}}(1-c,-c,-c,...,-c,-c,-c)^T$

$w_2=\sqrt{\frac{n+1}{n}}(-c,1-c,-c,...,-c,-c,-c)^T$

$\vdots$

$w_{n-1}=\sqrt{\frac{n+1}{n}}(-c,-c,-c,...,-c,1-c,-c)^T$

$w_{n}=\sqrt{\frac{n+1}{n}}(-c,-c,-c,...,-c,-c,1-c)^T$

and for $i=n+1$

$w_{n+1}=\sqrt{\frac{n+1}{n}}(a-c,...,a-c)^T$.

To confirm this is a tetrahedron, $||w_i-w_j||=\sqrt{\frac{2(n+1)}{n}}$ for any $i\ne j$, and

$w_i\cdot w_j = \frac{-1}{n}$ for any $i\ne j$.
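The construction up to this point can be packaged and verified numerically (a Python sketch of the same $w_i$, with the claimed norms and dot products as assertions):

```python
import numpy as np

def simplex(n):
    # the n+1 normalized vertices w_1, ..., w_{n+1} from the construction above
    a = (1 + np.sqrt(1 + n)) / n
    c = (a + 1) / (n + 1)
    scale = np.sqrt((n + 1) / n)
    return np.vstack([np.eye(n) - c, np.full(n, a - c)]) * scale

for n in (2, 3, 5):
    W = simplex(n)
    G = W @ W.T                                  # Gram matrix of the vertices
    assert np.allclose(np.diag(G), 1.0)          # ||w_i|| = 1
    off = G[~np.eye(n + 1, dtype=bool)]
    assert np.allclose(off, -1.0 / n)            # w_i . w_j = -1/n for i != j
```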

Now consider an arbitrary (possibly improper) rotation matrix $\lambda$ for $n$ dimensions:

$\lambda = \begin{bmatrix} \lambda_{11} & \lambda_{12} & \dots & \lambda_{1n} \\ \lambda_{21} & \lambda_{22} & \dots & \lambda_{2n} \\ & & \vdots & \\ \lambda_{n1} & \lambda_{n2} & \dots & \lambda_{nn}\\ \end{bmatrix}$

Because a rotation matrix is orthogonal, we know that the row norms are 1:

$|| (\lambda_{i1},\dots,\lambda_{in})^T || = 1$ for $1\le i\le n$

So an arbitrary centered and normalized tetrahedron is given by:

$\lambda w_1=$ $\sqrt{\frac{n+1}{n}} \begin{bmatrix} (1-c)\cdot \lambda_{1,1}-c\cdot \lambda_{1,2}-...-c\cdot \lambda_{1,n-1}-c\cdot \lambda_{1,n} \\ \vdots \\ (1-c)\cdot \lambda_{n,1}-c\cdot \lambda_{n,2}-...-c\cdot \lambda_{n,n-1}-c\cdot \lambda_{n,n}\\ \end{bmatrix}$

$\lambda w_2=$ $\sqrt{\frac{n+1}{n}} \begin{bmatrix} -c\cdot \lambda_{1,1}+(1-c)\cdot \lambda_{1,2}-...-c\cdot \lambda_{1,n-1}-c\cdot \lambda_{1,n}\\ \vdots\\ -c\cdot \lambda_{n,1}+(1-c)\cdot \lambda_{n,2}-...-c\cdot \lambda_{n,n-1}-c\cdot \lambda_{n,n}\\ \end{bmatrix}$

$\vdots$

$\lambda w_{n-1}=$ $\sqrt{\frac{n+1}{n}} \begin{bmatrix} -c\cdot \lambda_{1,1}-c\cdot \lambda_{1,2}-...+(1-c)\cdot \lambda_{1,n-1}-c\cdot \lambda_{1,n}\\ \vdots\\ -c\cdot \lambda_{n,1}-c\cdot \lambda_{n,2}-...+(1-c)\cdot \lambda_{n,n-1}-c\cdot \lambda_{n,n}\\ \end{bmatrix}$

$\lambda w_{n}=$ $\sqrt{\frac{n+1}{n}} \begin{bmatrix} -c\cdot \lambda_{1,1}-c\cdot \lambda_{1,2}-...-c\cdot \lambda_{1,n-1}+(1-c)\cdot \lambda_{1,n}\\ \vdots\\ -c\cdot \lambda_{n,1}-c\cdot \lambda_{n,2}-...-c\cdot \lambda_{n,n-1}+(1-c)\cdot \lambda_{n,n}\\ \end{bmatrix}$

$\lambda w_{n+1}=$ $\sqrt{\frac{n+1}{n}} \begin{bmatrix} (a-c)\cdot \lambda_{1,1}+...+(a-c)\cdot \lambda_{1,n}\\ \vdots\\ (a-c)\cdot \lambda_{n,1}+...+(a-c)\cdot \lambda_{n,n}\\ \end{bmatrix}$

Now we want to calculate $\sum_{i < j} x_ix_j$ for a space of dimension $n$.

From $w_1,...,w_{n+1}$, we note

$x_i=\sqrt{\frac{n+1}{n}}\left((1-c)\lambda_{1,i}-c\sum_{k\in\{1,\dots,n\}\setminus\{i\}}\lambda_{1,k}\right)=$

$\sqrt{\frac{n+1}{n}}(1-c)\lambda_{1,i}-\sqrt{\frac{n+1}{n}}c \sum_{k\in\{1,\dots,n\}\setminus\{i\}}\lambda_{1,k}=$

$\sqrt{\frac{n+1}{n}}(1-c)\lambda_{1,i}-\sqrt{\frac{n+1}{n}}c \sum_{k=1}^n\lambda_{1,k}+\sqrt{\frac{n+1}{n}}c~\lambda_{1,i}=$

$\sqrt{\frac{n+1}{n}}\lambda_{1,i}-\sqrt{\frac{n+1}{n}}c \sum_{k=1}^n\lambda_{1,k}$

when $1\le i\le n$

and

$x_{n+1}=\sqrt{\frac{n+1}{n}}(a-c)\sum_{k=1}^n\lambda_{1,k}$

Now we solve for $\sum_{i < j} x_ix_j$.

\begin{align*} \sum_{i < j} x_ix_j=& \\ &\frac{1}{2}\sum_{i=1}^n\sum_{j=1}^n x_ix_j \\ &\frac{-1}{2}\sum_{i=1}^n x_i^2+ \\ &\sum_{i=1}^n x_ix_{n+1}=\\ &\\ &\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n(\lambda_{1,i}-c\sum_{k=1}^n\lambda_{1,k})(\lambda_{1,j}-c\sum_{l=1}^n\lambda_{1,l}) +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n (\lambda_{1,i}-c\sum_{k=1}^n\lambda_{1,k})(\lambda_{1,i}-c\sum_{l=1}^n\lambda_{1,l})+\\ &\frac{n+1}{n}\sum_{i=1}^n (\lambda_{1,i}-c\sum_{k=1}^n\lambda_{1,k})(a-c)\sum_{l=1}^n\lambda_{1,l}=\\ &\\ &\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n\lambda_{1,i}\lambda_{1,j} +\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n\lambda_{1,i}(-c)\sum_{l=1}^n\lambda_{1,l} +\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n(-c)\sum_{k=1}^n\lambda_{1,k}\lambda_{1,j} +\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n c^2\sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda_{1,i}(-c)\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n (-c)\sum_{k=1}^n\lambda_{1,k})\lambda_{1,i}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n c^2\sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{n+1}{n}\sum_{i=1}^n \lambda_{1,i}(a-c)\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{n+1}{n}\sum_{i=1}^n (-c)\sum_{k=1}^n\lambda_{1,k}(a-c)\sum_{l=1}^n\lambda_{1,l}=\\ &\\ &\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n\lambda_{1,i}\lambda_{1,j} +\\ &\frac{n+1}{n}\sum_{i=1}^n\sum_{j=1}^n\lambda_{1,i}(-c)\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{n+1}{2n}\sum_{i=1}^n\sum_{j=1}^n c^2\sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\frac{-(n+1)}{n}\sum_{i=1}^n \lambda_{1,i}(-c)\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n c^2\sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{n+1}{n}\sum_{i=1}^n \lambda_{1,i}(a-c)\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{n+1}{n}\sum_{i=1}^n (-c)\sum_{k=1}^n\lambda_{1,k}(a-c)\sum_{l=1}^n\lambda_{1,l}=\\ &\\ &\\ 
&\frac{n+1}{2n}\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{n+1}{n}(-c)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\sum_{k=1}^n\lambda_{1,k} +\\ &\frac{n+1}{2n}c^2\sum_{i=1}^n\sum_{j=1}^n \sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}\\ &\frac{(n+1)}{n}c\sum_{i=1}^n \lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{-(n+1)}{2n} c^2\sum_{i=1}^n\sum_{j=1}^n\lambda_{1,j}\sum_{k=1}^n\lambda_{1,k}+\\ &\frac{n+1}{n}(a-c)\sum_{i=1}^n \lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{n+1}{n}(-c)(a-c)\sum_{i=1}^n \sum_{j=1}^n\lambda_{1,j}\sum_{k=1}^n\lambda_{1,k}=\\ &\\ &\\ &\frac{n+1}{2n}(1+2c+2(a-c))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{n+1}{n}(-c)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\sum_{k=1}^n\lambda_{1,k}+\\ &\frac{n+1}{2n}c^2\sum_{i=1}^n\sum_{j=1}^n \sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\frac{(n+1)}{2n}(-c^2-2c(a-c))\sum_{i=1}^n\sum_{j=1}^n\lambda_{1,j}\sum_{k=1}^n\lambda_{1,k}=\\ &\\ &\\ &\frac{n+1}{2n}(1+2c+2(a-c))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{n+1}{n}(-c)\sum_{i=1}^n\lambda_{1,i}~n\sum_{k=1}^n\lambda_{1,k}+\\ &\frac{n+1}{2n}c^2n^2 \sum_{k=1}^n\lambda_{1,k}\sum_{l=1}^n\lambda_{1,l}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\frac{(n+1)}{2n}(-c^2-2c(a-c))~n\sum_{j=1}^n\lambda_{1,j}\sum_{k=1}^n\lambda_{1,k}=\\ &\\ &\\ &\frac{n+1}{2n}(1+2c+2(a-c))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{n+1}{n}(-c)n\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{n+1}{2n}c^2n^2 \sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\frac{(n+1)}{2n}(-c^2-2c(a-c))~n\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}=\\ &\\ &\\ &\frac{n+1}{2n}(1+2c+2(a-c)-2c n-c^2n^2-c^2n-2c(a-c)n)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\frac{n+1}{2n}(1+2a-2c 
n-c^2n^2+c^2n-2can)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}\\ \end{align*}

Using $c = \frac{a+1}{n+1}$

\begin{align*} \sum_{i < j} x_ix_j=& \\ &\frac{n+1}{2n}(1+2a-2\frac{a+1}{n+1}n-\\ &\frac{(a+1)^2}{(n+1)^2}n^2+\frac{(a+1)^2}{(n+1)^2}n-2\frac{a+1}{n+1}an)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}+\\ &\\ &\frac{n+1}{2n}(\frac{(n+1)^2}{(n+1)^2}+ \frac{2a(n+1)^2}{(n+1)^2}+ \frac{-2(a+1)n(n+1)}{(n+1)^2}-\\ &\frac{(a+1)^2n^2}{(n+1)^2}+ \frac{(a+1)^2n}{(n+1)^2}- \frac{2(a+1)an(n+1)}{(n+1)^2})\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ &\\ &\\ &\frac{1}{2n(n+1)}((n+1)^2+ 2a(n+1)^2+ -2(a+1)n(n+1)-\\ &(a+1)^2n^2+ (a+1)^2n- 2(a+1)an(n+1))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ &\\ &\\ &\frac{1}{2n(n+1)}(n^2+2n+1+2an^2+4an+2a-2an^2-2an\\ &-2n^2-2n+a^2n^2+2an^2+n^2+a^2n\\ &+2an+n-2a^2n^2-2an^2-2a^2n-an)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} \\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ &\\ &\frac{1}{2n(n+1)}(1+2a+2an+n-a^2n^2-a^2n)\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ &\\ &\frac{1}{2n(n+1)}(2a(1+n)+(1+n)-a^2(n^2+n))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ &\\ &\frac{1}{2n(n+1)}((2a+1)(1+n)-a^2n(n+1))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ &\\ &\frac{1}{2n(n+1)}((2a+1-a^2n)(1+n))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +\\ &\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=\\ \end{align*}

Using

$a=\frac{1+\sqrt{1+n}}{n}$,

$a^2=\frac{1+2\sqrt{1+n}+1+n}{n^2}=\frac{2+2\sqrt{1+n}+n}{n^2}$

$\sum_{i < j} x_ix_j=$

$\frac{1}{2n(n+1)}((\frac{2+2\sqrt{1+n}}{n}+1-\frac{2+2\sqrt{1+n}+n}{n})(1+n))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+ $

$\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=$

$\frac{1}{2n(n+1)}((1-\frac{n}{n})(1+n))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j} +$

$\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=$

$\frac{1}{2n(n+1)}(0(1+n))\sum_{i=1}^n\lambda_{1,i}\sum_{j=1}^n\lambda_{1,j}+ $

$\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}=$

$\frac{-(n+1)}{2n}\sum_{i=1}^n \lambda^2_{1,i}$

Since $\lambda$ is orthogonal, $\sum_{i=1}^n \lambda^2_{1,i}=1$, and

$\sum_{i < j} x_ix_j=\frac{-(n+1)}{2n}$ for a tetrahedron with $n+1$ vertices in an $n$ dimensional space.
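The whole derivation can be cross-checked numerically with random orthogonal matrices (a Python sketch; the QR factorization of a Gaussian matrix yields a random, possibly improper, rotation):

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (2, 3, 4, 6):
    # vertices w_1, ..., w_{n+1} from the construction above
    a = (1 + np.sqrt(1 + n)) / n
    c = (a + 1) / (n + 1)
    W = np.vstack([np.eye(n) - c, np.full(n, a - c)]) * np.sqrt((n + 1) / n)
    Q, _ = np.linalg.qr(rng.normal(size=(n, n)))     # random orthogonal matrix
    x = (W @ Q.T)[:, 0]                              # first coordinates after rotating
    val = (x.sum()**2 - (x**2).sum()) / 2            # sum_{i<j} x_i x_j
    assert np.isclose(val, -(n + 1) / (2 * n))
```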

$\endgroup$
9
  • 1
    $\begingroup$ I can understand that you took a picture and uploaded it instead of re-writing this whole thing in MathJax :-) $\endgroup$
    – Dominique
    Commented May 6 at 8:21
  • 1
    $\begingroup$ Thank you very much for this purely algebraic solution! It is elegant that the result is just $-\frac{n+1}{2n}$. Note that this also implies that $\sum_{i=1}^n x_i^2 = \frac{n+1}{n}$. $\endgroup$ Commented May 7 at 23:15
  • 1
    $\begingroup$ @Dominique (I took the time now to convert to mathjax) $\endgroup$
    – Alex
    Commented May 7 at 23:48
  • 1
    $\begingroup$ @Tetrahydron $\sum_{i=1}^{n+1} x_i^2=\frac{n+1}{n}$, for example $n = 2$, $\sum_{i=1}^{3} x_i^2={cos}^2(0)+{cos}^2(\frac{2 \pi}{3})+{cos}^2(\frac{4 \pi}{3})=\frac{3}{2} = \frac{n+1}{n}$ $\endgroup$
    – Alex
    Commented May 8 at 0:09
  • $\begingroup$ @Alex You are correct, I forgot that in $n$ dimensions we had $n+1$ vertices. Meanwhile, I found a way to avoid having to compute $\sum_{i=1}^n \sum_{j=1}^n x_i x_j$ and $\sum_{i=1}^n x_i x_{n+1}$. Notice that $\sum_{i=1}^{n+1}x_i=0$ since it is the first coordinate of $\lambda\sum_{i=1}^{n+1}w_i = (0, ..., 0)^T$. So, $\sum_{i<j}x_i x_j = \frac{1}{2}(\sum_{i=1}^{n+1}x_i \sum_{j=1}^{n+1}x_j - \sum_{i=1}^{n+1}x_i ^2) = \frac{1}{2}(0\times 0 - \sum_{i=1}^{n+1}x_i ^2)$. Also note that I wrote $i<j$ instead of $i \ne j$ since the second counts every pair twice. I corrected my original post. $\endgroup$ Commented May 8 at 23:58
