$\begingroup$

I've run into an application where I need to compute a bunch of elementary symmetric polynomials. It is trivial to compute a sum or product of quantities, of course, so my concern is with computing the "other" symmetric polynomials.

For instance (I use here the notation $\sigma_n^k$ for the $k$-th symmetric polynomial in $n$ variables), the Vieta formulae allow me to compute a bunch of symmetric polynomials all at once like so:

$$\begin{align*} &(x+t)(x+u)(x+v)(x+w)\\ &\qquad =x^4+\sigma_4^1(t,u,v,w)x^3+\sigma_4^2(t,u,v,w)x^2+\sigma_4^3(t,u,v,w)x+\sigma_4^4(t,u,v,w) \end{align*}$$

and, as I have said, $\sigma_4^1$ and $\sigma_4^4$ are trivial to compute on their own without having to resort to Vieta.

But what if I want to compute $\sigma_4^3$ only without having to compute all the other symmetric polynomials? More generally, my application involves a large-ish number of arguments, and I want to be able to compute "isolated" symmetric polynomials without having to compute all of them.

Thus, I'm looking for an algorithm for computing $\sigma_n^k$ given only $k$ and the arguments themselves, without computing the other symmetric polynomials. Are there any, or can I not do better than Vieta?

$\endgroup$
  • $\begingroup$ How would you use Vieta to calculate the polynomials? Vieta mixes them all up. $\endgroup$ Commented Dec 14, 2014 at 18:43
  • $\begingroup$ See also cs.stackexchange.com/q/23683/755. $\endgroup$
    – D.W.
    Commented Oct 12, 2016 at 23:46

5 Answers

$\begingroup$

There's a dynamic programming algorithm that is $O(nk)$ using a recurrence relation. If we define $S_n^k = \sigma_n^k(x_1, \dots, x_n)$, then we have $$S_n^k = S_{n-1}^k + x_n S_{n-1}^{k-1}$$ which allows one to compute $S_n^k$ by filling an $n \times k$ matrix, where (almost) every cell takes constant time to fill.

(The base cases are $S_n^0 = 1$ for all $n \ge 0$, and $S_n^k = 0$ whenever $k > n$.)
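For concreteness, here is a minimal Python sketch of this dynamic program (the function name is mine). It keeps only one row of the table, updating right-to-left so that $S^{j-1}$ still holds the previous row's value when $S^j$ is overwritten, which brings the space down to $O(k)$:

```python
def elementary_symmetric(xs, k):
    """Compute sigma_n^k(xs) via S_n^k = S_{n-1}^k + x_n * S_{n-1}^{k-1}.

    One row of the DP table suffices: updating j from high to low means
    S[j-1] is still the previous row's value when S[j] is updated.
    """
    S = [1] + [0] * k          # S[j] = sigma_i^j after processing i variables
    for x in xs:
        for j in range(k, 0, -1):
            S[j] += x * S[j - 1]
    return S[k]
```

For example, `elementary_symmetric([1, 2, 3, 4], 3)` sums the four triple products $1\cdot2\cdot3 + 1\cdot2\cdot4 + 1\cdot3\cdot4 + 2\cdot3\cdot4 = 50$.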

$\endgroup$
  • $\begingroup$ I think there's something wrong with your base case statement "... and $S_n^k=0$ for all $n$ and $k\ne0$". Maybe I don't understand the wording, but to me it reads like saying $S_n^k=0$ identically (except when $k=0$). $\endgroup$
    – a06e
    Commented Feb 17, 2023 at 23:45
  • $\begingroup$ Maybe you meant $S_n^k=0$ for $k > n$. $\endgroup$
    – a06e
    Commented Feb 18, 2023 at 12:18
$\begingroup$

You can use the Newton–Girard formulae. The elementary symmetric polynomials can be written as determinants in the power sums: $$ p_k(x_1,\ldots,x_n)=\sum_{i=1}^nx_i^k = x_1^k+\cdots+x_n^k \\ e_k(x_1,\ldots,x_n)=\sum_{1 \leq i_1<i_2<\cdots<i_k\leq n}x_{i_1}x_{i_2}\cdots x_{i_k} $$ $$ e_k=\frac1{k!} \begin{vmatrix}p_1 & 1 & 0 & \cdots\\ p_2 & p_1 & 2 & 0 & \cdots \\ \vdots&& \ddots & \ddots \\ p_{k-1} & p_{k-2} & \cdots & p_1 & k-1 \\ p_k & p_{k-1} & \cdots & p_2 & p_1 \end{vmatrix} $$ This determinant can be computed using $O(k^2)$ additions and multiplications, via the following general result (from the citation at the end):

Let $A_n$ be an $n\times n$ lower Hessenberg matrix, meaning that $a_{ij}=0$ whenever $j\ge i+2$. For each $k\in \{1,\dots,n\}$, let $A_k$ be the $k\times k$ submatrix consisting of the upper $k$ rows and leftmost $k$ columns of $A_n$. Then for all $n\ge 1$,

$$\det A_n=a_{n,n}\det A_{n-1}+\sum_{r=1}^{n-1}(-1)^{n-r}a_{n,r}\prod_{j=r}^{n-1}a_{j,j+1}\det A_{r-1}$$ with the convention that $\det A_0=1$.

Since it takes $O(nk)$ operations to compute the list $p_1(x_1,\dots,x_n),\dots,p_k(x_1,\dots,x_n)$, the total complexity of this method is $O(nk+k^2)=O(nk)$ additions and multiplications (using $k \le n$).
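Expanding this determinant along its last row reproduces the Newton–Girard recurrence $m\,e_m = \sum_{i=1}^{m} (-1)^{i-1} e_{m-i}\, p_i$, which is one way to implement it. A short Python sketch (naming is mine; integer inputs are assumed so that the division by $m$ is exact):

```python
def esp_newton_girard(xs, k):
    """sigma_n^k via Newton-Girard: m*e_m = sum_{i=1..m} (-1)^(i-1) e_{m-i} p_i."""
    # power sums p_1 .. p_k: O(nk) arithmetic
    p = [sum(x ** m for x in xs) for m in range(1, k + 1)]
    e = [1] + [0] * k  # e[0] = 1 by convention
    for m in range(1, k + 1):
        s = sum((-1) ** (i - 1) * e[m - i] * p[i - 1] for i in range(1, m + 1))
        e[m] = s // m  # exact for integer inputs, since s equals m * e_m
    return e[k]
```

With floating-point or rational inputs one would use true division instead of `//`.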

Nathan D. Cahill, John R. D'Errico, Darren A. Narayan & Jack Y. Narayan (2002) Fibonacci Determinants, The College Mathematics Journal, 33:3, 221-225, DOI: 10.1080/07468342.2002.11921945

$\endgroup$
  • $\begingroup$ Do you also have a fast way of computing those determinants? The raw method takes $O(n^{2.4})$ operations. I suppose the near symmetry might help? or maybe not? Perhaps it can play into some sampling/approximation algorithm... $\endgroup$ Commented Dec 14, 2014 at 18:37
  • $\begingroup$ @Thomas, a lower Hessenberg matrix like the one in this answer can be decomposed in $O(n^2)$ operations; the determinant is then easily obtained from the triangular factor. Have a look at "Hyman's method", for instance. $\endgroup$ Commented Mar 13, 2016 at 9:21
  • $\begingroup$ Is there a numerically more stable computation method? For small and positive $x$'s, I would like to get compute the logarithm of $e_k$ instead of $e_k$. $\endgroup$
    – a06e
    Commented Feb 17, 2023 at 17:50
$\begingroup$

You can compute $\sigma^k_n(x_1,\dots,x_n)$ in $O(n \log^2 k)$ time, using FFT-based polynomial multiplication. The details are explained here and are apparently due to Ben-Or: https://cstheory.stackexchange.com/a/33506/5038.

Quoting from that post, with notation changed to match this question:

Here's how. Introduce a formal unknown $y$, and consider the polynomial $$P(y) = \prod_{i=1}^n (1 + x_i y).$$ Note that since the $x_i$'s are known constants, this is a univariate polynomial with unknown $y$ and with degree $n$. Now you can note that the coefficient of $y^k$ in $P(y)$ is exactly $\sigma_n^k(x_1,\dots,x_n)$, so to evaluate all the $\sigma_n^0,\dots,\sigma_n^n$, it suffices to compute $P(y)$.

This makes it possible to compute $P(y)$ in $O(n \lg^2 n)$ time: build a balanced binary tree of polynomials with the $(1+x_i y)$'s at the leaves, and multiply the polynomials. Multiplying two polynomials of degree $d$ takes $O(d \lg d)$ time using FFT techniques, so we get the recurrence $T(n) = 2 T(n/2) + O(n \lg n)$, which solves to $T(n) = O(n \lg^2 n)$. For convenience, I am ignoring $\text{poly}(\lg \lg n)$ factors.

If you care about the case where $k$ is very small, you can compute $\sigma_n^0,\dots,\sigma_n^k$ in $O(n \lg^2 k)$ time using similar tricks, keeping in mind that you only care about $P(y) \bmod y^{k+1}$ (i.e., throwing away all terms of $y^{k+1}$ or higher powers of $y$).

This is asymptotically faster than any of the other methods proposed in any of the other answers.

Moreover, you can compute all of the values $\sigma^1_n(x_1,\dots,x_n), \sigma^2_n(x_1,\dots,x_n), \dots, \sigma^n_n(x_1,\dots,x_n)$ in just $O(n \log^2 n)$ time, using the same methods.
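Here is a compact sketch of the product-tree idea (my own code; it uses plain convolution in place of FFT for brevity, so this particular version costs $O(nk)$ arithmetic rather than $O(n \lg^2 k)$, but the tree structure and the mod-$y^{k+1}$ truncation are the point):

```python
def esp_product_tree(xs, k):
    """Coefficient of y^k in prod(1 + x_i*y), via a balanced product tree.

    Polynomials are coefficient lists, lowest degree first, truncated
    past y^k. Swapping mul_trunc for FFT-based multiplication gives the
    asymptotics quoted above.
    """
    def mul_trunc(a, b):
        out = [0] * min(len(a) + len(b) - 1, k + 1)
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                if i + j <= k:
                    out[i + j] += ai * bj
        return out

    level = [[1, x] for x in xs]  # leaf polynomials 1 + x_i*y
    while len(level) > 1:
        nxt = [mul_trunc(level[i], level[i + 1])
               for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:        # odd one out is carried up unchanged
            nxt.append(level[-1])
        level = nxt
    P = level[0]
    return P[k] if k < len(P) else 0
```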

$\endgroup$
$\begingroup$

Let us use the symbols $u_1, u_2, \ldots$ for the indeterminates $t, u, v, \ldots$ in the question. The computation will be given in terms of a new set of indeterminates, $x_1, x_2, \ldots$, whose connection to the original indeterminates is given by:

$x_j = \sum_{i=1}^{n} u_i^j$

We define the derivation operator $\Delta$ acting on the new set of indeterminates as follows:

$\Delta x_j = j x_{j+1}$

$\Delta ab = a \Delta b + b \Delta a$

Then the $i$-th elementary symmetric polynomial is given by:

$\sigma_n^i = \frac{1}{i!}(x_1-\Delta)^{i-1}x_1$

The evaluation is performed in terms of the new indeterminates; after the evaluation, the expressions of the new indeterminates in terms of the original ones (the power sums above) need to be substituted back.
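As a sanity check of the formula, here is a small Python sketch (the names and the representation are mine): a polynomial in the $x_j$ is stored as a dictionary from exponent tuples to coefficients, $(x_1 - \Delta)$ is applied $i-1$ times, and the power sums are substituted at the end. Integer inputs are assumed so that the division by $i!$ is exact.

```python
from math import factorial

def sigma_via_derivation(us, i):
    """Evaluate sigma_n^i(us) as (1/i!) (x_1 - Delta)^(i-1) x_1, where x_j
    stands for the power sum sum(u**j) and Delta x_j = j * x_{j+1}."""
    m = i + 1                              # slots for x_1 .. x_m; never overflows
    poly = {(1,) + (0,) * (m - 1): 1}      # the starting polynomial x_1
    for _ in range(i - 1):                 # apply (x_1 - Delta) a total of i-1 times
        out = {}
        for exps, c in poly.items():
            e = list(exps)
            e[0] += 1                      # the x_1 * term part
            key = tuple(e)
            out[key] = out.get(key, 0) + c
            for j, ej in enumerate(exps):  # -Delta(term), product rule per factor
                if ej and j + 1 < m:
                    e = list(exps)
                    e[j] -= 1
                    e[j + 1] += 1          # slot j holds x_{j+1}: Delta raises index
                    key = tuple(e)
                    out[key] = out.get(key, 0) - c * ej * (j + 1)
        poly = out
    total = 0                              # substitute x_j -> sum(u**j), divide by i!
    for exps, c in poly.items():
        for j, ej in enumerate(exps):
            if ej:
                c *= sum(u ** (j + 1) for u in us) ** ej
        total += c
    return total // factorial(i)
```

For $i=3$ the operator expansion gives $\frac16(x_1^3 - 3x_1x_2 + 2x_3)$, the familiar expression of $e_3$ in power sums.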

$\endgroup$
  • $\begingroup$ Applying $i=2$ gives $\frac12 (x_1x_1 - \Delta x_1)=\frac12 (x_1x_1 - 2 x_2)$, or did I miss something...? $\endgroup$
    – draks ...
    Commented Dec 23, 2012 at 22:53
  • $\begingroup$ How fast is this then? After expanding the $(x_1-\Delta)^{i-1}$ don't you still have to do $O(n^2)$ work? $\endgroup$ Commented Dec 14, 2014 at 18:51
$\begingroup$

A simple divide-and-conquer recursion looks like it comes in at $O(nk)$, just like the recurrence given by @BenKuhn. Split the variables into two halves, and inductively compute $\sigma_{n/2}^j$ for $j=0,\ldots,k$ evaluated on both halves. Iterate $r$ times, where $n\approx 2^r k$; the total work required is $2^r$ evaluations of $\{\sigma_k^j\mid 0\le j\le k\}$. But each of these sets can be done in $O(k^2)$ work using Vieta, so the total work is $O(2^r k^2)=O(nk)$.

It's probably worth remarking that you can evaluate $\sigma_n^{n-k}$ by evaluating $\sigma_n^k$ on the reciprocals of the (nonzero) variables and multiplying by the product of the variables, since $\sigma_n^{n-k}(x_1,\dots,x_n) = x_1\cdots x_n \cdot \sigma_n^k(x_1^{-1},\dots,x_n^{-1})$. So you're in good shape if $k$ is either very small, or nearly $n$.
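A small sketch of the reciprocal trick (the helper names are mine; exact rational arithmetic via `Fraction`, assuming all inputs are nonzero):

```python
from fractions import Fraction
from math import prod

def esp_dp(xs, k):
    # the O(nk) dynamic program: S_n^k = S_{n-1}^k + x_n * S_{n-1}^{k-1}
    S = [Fraction(1)] + [Fraction(0)] * k
    for x in xs:
        for j in range(k, 0, -1):
            S[j] += Fraction(x) * S[j - 1]
    return S[k]

def esp_near_top(xs, k):
    # sigma_n^k(x) = (x_1 ... x_n) * sigma_n^{n-k}(1/x_1, ..., 1/x_n)
    n = len(xs)
    recips = [Fraction(1) / Fraction(x) for x in xs]
    return prod((Fraction(x) for x in xs), start=Fraction(1)) * esp_dp(recips, n - k)
```

When $k$ is close to $n$, `esp_near_top` does only $O(n(n-k))$ work instead of $O(nk)$.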

$\endgroup$
