Ok, Ok, I know that in fact the discriminant is defined (up to sign) as a product of differences of the roots of the polynomial.
But why does it then have integral coefficients, if the polynomial you started with had integer coefficients?
The discriminant is symmetric in the roots of the polynomial: the product of differences at most changes sign when you permute the roots, so its square (the discriminant) stays the same. By the fundamental theorem of symmetric functions, that means it can be expressed as a polynomial, with integer coefficients, in the coefficients of the original polynomial. (I am using both "coefficients" and "polynomial" in two senses here, so let me know if this doesn't make sense.) This is a basic observation which will become important if you ever study Galois theory.
An example will probably make this clearer. Suppose I have a quadratic polynomial $x^2 + bx + c$ with two roots $r_1, r_2$. Then $x^2 + bx + c = (x - r_1)(x - r_2)$, so $b = - r_1 - r_2$ and $c = r_1 r_2$. These are the elementary symmetric functions in two variables, and the theorem above implies that every polynomial function of $r_1$ and $r_2$ which is invariant under switching the two is actually a polynomial in $b$ and $c$. For example, the discriminant is
$$(r_1 - r_2)^2 = r_1^2 - 2r_1 r_2 + r_2^2 = (r_1 + r_2)^2 - 4r_1 r_2 = b^2 - 4c.$$
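If you want to see this with actual numbers, here is a quick sanity check (my own illustration, not part of the theorem): pick roots, form $b$ and $c$ from them, and verify the identity above.

```python
# Check the identity (r1 - r2)^2 == b^2 - 4c, where
# x^2 + bx + c = (x - r1)(x - r2), for a few sample root pairs.
def identity_holds(r1, r2):
    b = -(r1 + r2)   # b = -(r1 + r2), from expanding the factorization
    c = r1 * r2      # c = r1 * r2
    return (r1 - r2) ** 2 == b ** 2 - 4 * c

for roots in [(1, 2), (3, -5), (0, 7), (-4, -4)]:
    assert identity_holds(*roots)
print("identity holds for all sample root pairs")
```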
Another definition of the discriminant is as the resultant of the polynomial with its first derivative (up to a scalar), and the resultant of two polynomials vanishes if and only if they have a common root. So when does a polynomial have a common root with its derivative? That's when a zero is also a local max or min, which is precisely a multiple root.
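To make the resultant definition concrete, here is a small sketch (function names are my own) that computes a resultant as the determinant of the Sylvester matrix. For $p = x^2 + bx + c$ and $p' = 2x + b$ this gives $4c - b^2$, i.e. $-(b^2 - 4c)$, the discriminant up to a scalar, and it vanishes exactly when $p$ has a repeated root.

```python
def det(m):
    """Determinant by cofactor expansion along the first row
    (fine for the tiny matrices that show up here)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def resultant(p, q):
    """Resultant of two polynomials given as coefficient lists
    (highest degree first), via the Sylvester matrix."""
    dp, dq = len(p) - 1, len(q) - 1
    n = dp + dq
    rows = []
    for i in range(dq):  # dq shifted copies of p's coefficients
        rows.append([0] * i + p + [0] * (n - dp - i - 1))
    for i in range(dp):  # dp shifted copies of q's coefficients
        rows.append([0] * i + q + [0] * (n - dq - i - 1))
    return det(rows)

# p = x^2 + 3x + 2 = (x+1)(x+2), distinct roots: resultant with p' is nonzero.
print(resultant([1, 3, 2], [2, 3]))  # 4c - b^2 = 8 - 9 = -1
# p = x^2 + 2x + 1 = (x+1)^2, repeated root: resultant with p' vanishes.
print(resultant([1, 2, 1], [2, 2]))  # 4 - 4 = 0
```

Note the sign: the Sylvester determinant here is $4c - b^2$, which matches the "up to a scalar" caveat in the definition above.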