As compressive notation: We can define the real numbers as distances along a chosen line. This is, in the end, completely circular and does not define anything. We already had the notion of the real numbers, or we would not be measuring things, but it gives us a new way of saying things that is more compact when we are focused on points rather than measures.
An example from outside math of this kind of definition is defining "work" as the product of force and distance, just so we can refer to it efficiently.
Via distinction and equivalence: We can define the real numbers as the equivalence classes of Cauchy sequences of rational numbers, two sequences counting as equivalent when merging them still gives a Cauchy sequence. This brings up background questions. We have to establish that this condition actually satisfies the requirements of an equivalence relation. That can be work. If it failed, we would not have a definition. We would not have accidentally defined something else, other than the ordinary reals; we would in fact have a statement that, if treated as a definition, would give us false proofs.
An example from outside of math is the definition of "species". What defines the word is the ability to describe how to tell whether two animals are of the same species, and just that. (I don't mean the standards for each species, but the idea that those standards exist generally.) It is important that the distinguishing criterion not get too vague or become ambivalent about too many pairs of animals, or we have to start questioning the concept of species itself.
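The sequence construction above can be made concrete. Here is a small sketch (an illustration, not the construction itself): two different rational sequences that approach the same point, with equivalence checked in the simplest way, by watching their termwise differences shrink. The function names and the choice of sqrt(2) are mine, for illustration only.

```python
from fractions import Fraction

def newton_sqrt2(n):
    """First n Newton iterates for sqrt(2), starting from 1 -- all rational."""
    x = Fraction(1)
    out = []
    for _ in range(n):
        x = (x + 2 / x) / 2
        out.append(x)
    return out

def convergents_sqrt2(n):
    """First n continued-fraction convergents of sqrt(2): 3/2, 7/5, 17/12, ..."""
    p, q = 1, 1
    out = []
    for _ in range(n):
        p, q = p + 2 * q, p + q
        out.append(Fraction(p, q))
    return out

a = newton_sqrt2(8)
b = convergents_sqrt2(8)
# The two rational sequences pick out the same point: their termwise
# differences shrink toward zero, even though no term of either sequence,
# and no difference, ever leaves the rationals.
diffs = [abs(x - y) for x, y in zip(a, b)]
print(diffs[-1] < Fraction(1, 10**5))  # True
```

Everything here stays inside exact rational arithmetic; the "real number" sqrt(2) is never named, only approached from two directions at once.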
Via models of axioms: We can define the real numbers as the minimal ordered field in which every nonempty set bounded above has a least upper bound. Besides being oddly abstruse, this also pushes a horde of assumptions: that it is the minimal model of this set of axioms, that there are not thousands of unrelated models that all behave differently -- they all contain this one -- and that takes a lot of proving.
An example of this kind of definition from outside math is that of 'the alpha of a troupe'. It picks out a specific kind of animal from a group by its behavior and that of those around it. The individual models an archetype or a stereotype made up of general rules. If we find that different applications of the same checklist of features lead to disagreement too often, we have to discard the definition, and perhaps question the applicability of the concept itself.
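For concreteness, the least-upper-bound condition that gets added to the ordered-field axioms can be written out; this is one standard phrasing (notation varies from text to text):

```latex
% Completeness (least-upper-bound) axiom: every nonempty subset of R
% that is bounded above has a least upper bound.
\forall S \subseteq \mathbb{R}\;
  \Bigl( S \neq \emptyset \,\wedge\, \exists b\,\forall x \in S\,(x \le b)
  \;\Longrightarrow\;
  \exists s \,\bigl[\, \forall x \in S\,(x \le s) \,\wedge\,
     \forall b\,\bigl( \forall x \in S\,(x \le b) \Rightarrow s \le b \bigr) \bigr] \Bigr)
```

It is this one extra axiom, on top of the ordered-field axioms, that carries the whole weight of the "takes a lot of proving" remark above.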