What the other answers have been grasping at is not that "magic values" are bad as such, but that they ought to be:
- defined recognisably as constants;
- defined only once within their entire domain of use (if architecturally possible);
- defined together if they form a set of constants that are somehow related;
- defined at an appropriate level of generality in the application in which they are used; and
- defined in such a way as to limit their use in inappropriate contexts (e.g. amenable to type checking).
What typically distinguishes acceptable "constants" from "magic values" is some violation of one or more of these rules.
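As an illustration of those rules (the names and values here are my own, hypothetical), compare a scattered magic value with a set of constants defined once, together, and in a type-checkable form:

```python
from enum import Enum

# Magic value: the meaning of 86400 is implicit, and it must be
# re-typed (and can be mistyped) wherever it is needed.
def is_stale_magic(age_seconds):
    return age_seconds > 86400

# Better: defined recognisably as constants, defined only once,
# defined together as a related set, and amenable to type checking.
class Ttl(Enum):
    """Cache lifetimes in seconds (a hypothetical related set)."""
    SHORT = 300
    DEFAULT = 3600
    LONG = 86400

def is_stale(age_seconds: int, ttl: Ttl) -> bool:
    # The annotation limits use in inappropriate contexts:
    # a bare integer cannot silently stand in for a Ttl.
    return age_seconds > ttl.value
```

The enum is one way of satisfying the rules; a module of named constants can satisfy most of them too, minus the type checking.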
Used well, constants simply allow us to express certain axioms of our code.
Which brings me to a final point: an excessive use of constants, and therefore an excessive number of assumptions or constraints expressed in terms of values, may imply that the solution being devised is not sufficiently general or well-structured. That is true even if each constant otherwise complies with the criteria above, and especially if it deviates from them. At that point we are no longer really talking about the pros and cons of constants, but about the pros and cons of well-structured code.
High-level languages provide constructs for patterns that, in a lower-level language, would have to be expressed with constants. The same low-level patterns can still be used in the higher-level language, but ought not to be.
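A hypothetical sketch of that point: a sentinel constant for "not found" is a pattern forced on lower-level languages, where every caller must know and check the sentinel. Transplanted into a higher-level language it survives, but the language already has a construct that makes the constant unnecessary:

```python
# Low-level pattern transplanted into a high-level language:
# a sentinel constant that every caller must remember to check.
NOT_FOUND = -1

def index_of_sentinel(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i
    return NOT_FOUND  # forgetting the check silently corrupts logic

# The higher-level construct (a distinct None value, or an
# exception as list.index raises) makes the constant unnecessary.
def index_of(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i
    return None
```

Here `None` cannot be mistaken for a valid index, whereas `-1` is itself a legal index in Python, which is exactly the kind of inappropriate-context misuse the rules above warn against.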
But that may be an expert judgment based on an impression of all the circumstances and what a solution ought to look like, and exactly how that judgment will be justified will depend heavily on the context. Indeed it may not be justifiable in terms of any general principle, except to assert "I am old enough to have already seen this kind of work, with which I am familiar, done better"!
EDIT: having accepted one edit, rejected another, and now performed an edit of my own, may I consider the formatting and punctuation style of my list of rules settled once and for all, haha!