
The question that follows was inspired by this question:

When trying to solve for the roots of a polynomial equation, the quadratic formula is much simpler than the cubic formula, and the cubic formula is much simpler than the quartic formula.

  1. That the general solutions to various polynomial equations are so complex and difficult to derive seems to suggest a fundamental limitation in the problem-solving capabilities of the mathematical machinery. Does this intuition of mine make any sense? What should I make of it?
  2. Why is it that with each successive degree in a polynomial equation, the solution becomes so much more complex? Can I gain some intuition about what makes finding the roots so hard?
  3. According to the Abel-Ruffini theorem: "there is no general algebraic solution—that is, solution in radicals— to polynomial equations of degree five or higher." What is so special about the quintic that makes it the cut-off for finding a general algebraic solution?

  • I think this should be closed - it's mentioned twice in the question you link to that there is no formula for polynomials of degree > 4, so asking how fast the complexity of the formulas grows is meaningless. – Commented Jul 27, 2010 at 19:33
  • It's not so hard to find roots numerically... – Jason S, Jul 27, 2010 at 19:33
  • Possible duplicate of: Is there a general formula for solving 4th degree equations? – Commented Jul 27, 2010 at 19:37
  • The article you linked to in 3 answers your question 3, and possibly your question 1 if I am understanding it correctly. If there is something about the proof you don't understand, that should be part of your question. Right now I don't really know what you're asking. – Larry Wang, Jul 27, 2010 at 19:54
  • @Kaestur: the question is about gaining an intuition. Intuition about why this problem is so hard is very different from a proof that the problem is hard. I recognize that one possible answer to my question might be that it is impossible to gain any intuition about why the quintic is unique, and that all we know how to do is prove it. I'm just curious to see if anyone can surprise me with a particularly elegant way of thinking about this problem. – Ami, Jul 27, 2010 at 19:59

5 Answers

Answer (score 46)

When you try to solve a degree $n$ equation, there are $n$ roots you have to find (in principle) and none of them is favoured over any of the others, which (in some metaphorical sense) means that you have to break an $n$-fold symmetry in order to write down the roots.

Now the symmetry group of the $n$ roots becomes more and more complicated the larger $n$ is. For $n = 2$ it is abelian (and very small!); for $n = 3$ and $4$ it is still solvable (in the technical sense of group theory), which explains the existence of explicit formulas involving radicals (the link between solvability of the group and solvability of the equation in radicals is due to Galois, and is part of so-called Galois theory); for $n = 5$ or more the group is non-solvable (again in the technical sense of group theory), which corresponds to the fact that there is no explicit formula involving radicals.

Summary: The complexity of the symmetry group of the $n$ roots leads to a corresponding complexity in explicitly solving the equation.
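
As a concrete check, here is a minimal sketch (assuming SymPy is available) that asks SymPy whether the symmetric groups $S_n$, the symmetry groups of the $n$ roots of a generic degree-$n$ polynomial, are solvable:

```python
# A minimal sketch, assuming SymPy is installed: check solvability of the
# symmetric groups S_n, the symmetry groups of the n roots of a generic
# degree-n polynomial.
from sympy.combinatorics.named_groups import SymmetricGroup

for n in range(2, 7):
    print(n, SymmetricGroup(n).is_solvable)
# prints True for n = 2, 3, 4 and False for n = 5, 6
```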

Answer (score 45)

The idea is basically:

Any monic polynomial can be factored as $f(x) = \prod_{i=1}^{n} (x - a_i)$, where $a_1, \dots, a_n$ are the roots of the polynomial.

Now if we expand such a product:

$(x - a_1)(x - a_2) = x^2 - (a_1 + a_2)x + a_1a_2$

$(x - a_1)(x - a_2)(x - a_3) = x^3 - (a_1 + a_2 + a_3)x^2 + (a_1a_2 + a_1a_3 + a_2a_3)x - a_1a_2a_3$

And so on. The pattern should be clear.
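
For anyone who wants to verify the pattern symbolically, here is a small sketch (assuming SymPy is available) that expands the cubic product and reads off the coefficients, which are the elementary symmetric polynomials in the roots, with alternating signs:

```python
# A small sketch, assuming SymPy is installed: expand (x - a1)(x - a2)(x - a3)
# and read off the coefficients in x.
from sympy import symbols, expand, Poly

x, a1, a2, a3 = symbols('x a1 a2 a3')
cubic = expand((x - a1) * (x - a2) * (x - a3))
print(Poly(cubic, x).all_coeffs())
# [1, -a1 - a2 - a3, a1*a2 + a1*a3 + a2*a3, -a1*a2*a3]
```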

This means that finding the roots of a polynomial is in fact equivalent to solving systems like the following:

For a quadratic polynomial $x^2 - px + q$, find $a_1,a_2$, such that

$p = a_1 + a_2$

$q = a_1a_2$

For a cubic polynomial $x^3 - px^2 + qx - r$, find $a_1,a_2,a_3$, such that

$p = a_1 + a_2 + a_3$

$q = a_1a_2 + a_1a_3 + a_2a_3$

$r = a_1 a_2 a_3$

And similarly for higher degree polynomials.

Not surprisingly, the amount of "unfolding" that needs to be done to solve the quadratic system is much less than the amount of "unfolding" needed for the cubic system.
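
As a concrete illustration (a sketch only, assuming SymPy is available; the cubic $x^3 - 6x^2 + 11x - 6$ with roots $1, 2, 3$ is just a convenient example chosen here), solving the symmetric system returns the same numbers as solving the polynomial directly, only in every possible order:

```python
# A sketch, assuming SymPy: solving the symmetric system for the cubic
# x^3 - 6x^2 + 11x - 6 recovers the same numbers as solving the polynomial,
# just in every possible order of the roots.
from sympy import symbols, solve

a1, a2, a3, x = symbols('a1 a2 a3 x')
p, q, r = 6, 11, 6   # coefficients of x^3 - p*x^2 + q*x - r

system = [a1 + a2 + a3 - p,
          a1*a2 + a1*a3 + a2*a3 - q,
          a1*a2*a3 - r]

print(solve(system, [a1, a2, a3]))        # all 6 permutations of (1, 2, 3)
print(solve(x**3 - p*x**2 + q*x - r, x))  # [1, 2, 3]
```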

The reason why polynomials of degree 5 or higher are not solvable by radicals can be thought of as follows: the structure (symmetries) of the system for such a polynomial just doesn't match any of the structures that can be obtained by combining the structures of the elementary operations (addition, subtraction, multiplication, division, and taking roots).

  • So solving a polynomial can be thought of as solving systems of equations? – Commented Dec 26, 2015 at 13:52

Answer (score 11)

For a different take, some practical problems are discussed in Wilkinson's classic article The Perfidious Polynomial. If you can't access it, check what Wikipedia has to say on the subject.
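
To see this effect numerically, here is a rough sketch (assuming NumPy is available; the perturbation of $2^{-23}$ to the $x^{19}$ coefficient is the one from Wilkinson's classic experiment):

```python
# A rough sketch, assuming NumPy: perturb one coefficient of prod_{i=1..20}(x - i)
# by 2^-23 and compare the computed roots; several of them move by order one.
import numpy as np

coeffs = np.poly(np.arange(1, 21))   # monomial-basis coefficients of prod (x - i)
perturbed = coeffs.copy()
perturbed[1] -= 2.0 ** -23           # tiny change to the x^19 coefficient

print(np.sort_complex(np.roots(coeffs)))
print(np.sort_complex(np.roots(perturbed)))
# several roots in the second list differ from the first by order 1
# (some leave the real axis), even though the coefficient changed
# by only about 1.2e-7
```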

  • Jim Wilkinson's prize-winning article can be read here; briefly, one frequent source of trouble in numerical polynomial root finding is our habit of expressing polynomials in the monomial basis, and it happens that there are polynomials like $\prod_i (x-i)$ that numerically behave very poorly in root finding when expressed in the monomial basis. – Commented Apr 20, 2011 at 8:51
  • @J.M. This link is broken. – Commented Oct 25, 2014 at 23:48
  • I think I found it: maa.org/sites/default/files/pdf/upload_library/22/Chauvenet/… – Neil, Apr 6, 2015 at 22:07
  • @I.J.Kennedy, Neil has given you a working link; thank him. – Commented May 1, 2015 at 13:42
  • Unfortunately, both links are failing now (anyone have a working link?), so I am not sure if that paper makes the point or not. But the blurb on Wikipedia somewhat alludes to some good intuition. Here's my take: with higher-order polynomials, you can closely match almost any shape of curve you want. However, just very tiny changes in the coefficients can cause dramatic changes in the resulting curve, moving the roots dramatically. Such instability makes solving for those roots problematic, particularly in real-world scenarios where things are uncertain. – Commented Jun 20, 2021 at 12:11

Answer (score 0)

1.) Number of solutions, Number of coefficients

Let's consider polynomial equations with complex coefficients.
The number of solutions of a polynomial equation grows with its degree, which is one reason the solution formula becomes more complex.
The number of coefficients also grows with the degree, which is a second reason for the increased complexity of the solution formula.

2.) Possibility of closed-form solutions

The history of solving polynomial equations of one unknown began with solutions in radicals. But there are polynomial equations of degree > 4 that have no solutions in radicals.
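
To make this concrete, here is a minimal sketch (assuming SymPy is available; $x^5 - 2$ and $x^5 - x - 1$ are example polynomials chosen here, the latter being a standard example whose Galois group is $S_5$ and which is therefore not solvable in radicals):

```python
# A minimal sketch, assuming SymPy is installed: a solvable quintic gets
# explicit radicals, an unsolvable one only symbolic CRootOf objects,
# although numerical approximations remain easy.
from sympy import symbols, solve, nroots

x = symbols('x')

print(solve(x**5 - 2, x))      # explicit radicals (the binomial quintic is solvable)
print(solve(x**5 - x - 1, x))  # CRootOf(x**5 - x - 1, k), k = 0..4: no radical form exists
print(nroots(x**5 - x - 1))    # numerical root-finding is still straightforward
```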

Radicals are expressions built from the explicit algebraic operations. If you additionally allow all algebraic functions together with $\exp$ and $\ln$, you obtain all elementary expressions.
Chow [Chow 1999] proves (his Corollary 1) that if Schanuel's conjecture is true, then the algebraic numbers among the explicit elementary numbers are precisely the roots of polynomial equations with integer coefficients that are solvable in radicals.
That means that, conditionally on Schanuel's conjecture, the algebraic equations that are not solvable in radicals cannot be solved by elementary numbers (that is, by applying elementary functions).

There are different reasons for insolvability of polynomial equations in radicals (or in elementary numbers): e.g. algebraic independence, symmetry restrictions, topological restrictions.

All solutions of polynomial equations can be represented with the help of transcendental functions, e.g. theta functions, elliptic functions, Bring radicals, exponential/elliptic modular functions, Siegel modular forms, and hyperelliptic integrals.

[Chow 1999] Chow, T.: What is a closed-form number? Amer. Math. Monthly 106 (1999), no. 5, 440-448

Answer (score 0)

This may only be tangentially related, since it refers specifically to Diophantine equations, but I think it speaks to the general type of complexity you start to invoke when you get up into quintic polynomials, and thus is worth mentioning.

Matiyasevich showed a while back that you can represent any recursively enumerable set as the set of solutions of a Diophantine equation. IIRC, this requires, in the general case, polynomials of degree at least 4 in the variables. What this means is that all of the complexity you could find in any computer program (including fundamentally unprovable/undecidable problems) can be packed into one of these equations.

For example, see this formula which enumerates the primes:

https://en.wikipedia.org/wiki/Formula_for_primes#Formula_based_on_a_system_of_Diophantine_equations

  • The complexity you refer to can be packed into a single equation but requires multiple (integer) variables. You are responding to a question posed more than a decade ago, so presumably you see that its motivation (and the sense understood by the others who responded) is that of a single polynomial equation in one variable. It's a stretch to argue that solving problems of that kind can entrain the solution of multivariate Diophantine problems. – hardmath, Jan 24, 2022 at 6:58
