19

Nearly every treatment of set theory, whether Paul Halmos's Naive Set Theory, Herbert Enderton's Elements of Set Theory, Patrick Suppes's Axiomatic Set Theory, etc., introduces a common set of logical connectives, namely "not" ¬, "inclusive or" ∨, "and" ∧, "implies" ⇒, and "if and only if" ⇔ (as well as the existential and universal quantifiers ∃ and ∀).

However, the chosen set {¬,∨,∧,⇒,⇔} of connectives:

  • is not minimal, i.e., it is actually redundant (for example, P⇒Q could be written ¬P∨Q, and since we have already introduced ¬ and ∨ as part of our set, we could eliminate ⇒).

  • is not exhaustive, since there are actually 16 possible compound statements (and corresponding logical connectives) to choose from. (Since {¬,∨,∧,⇒,⇔} is already redundant, why not throw in the other 11 connectives, some of which are VERY helpful, like "nand" ⊼, "nor" ⊽, and "exclusive or" ⊻?)

Since it is neither minimal nor exhaustive, the set {¬,∨,∧,⇒,⇔} seems like an arbitrary choice.
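
To make the count of 16 concrete: a binary connective is determined by its four outputs, so there are 2⁴ = 16 of them. Here is a quick Python sketch (the enumeration scheme is just my own illustration) that lists all of them:

    from itertools import product

    # A binary connective is a function {F,T} x {F,T} -> {F,T}, so it is
    # fixed by its outputs on the four input rows: 2^4 = 16 connectives.
    rows = list(product([False, True], repeat=2))   # (x, y) input pairs

    for bits in product([False, True], repeat=4):   # one choice of outputs
        pattern = "".join("1" if b else "0" for b in bits)
        print(pattern)   # e.g. 0001 is "and", 0111 is "or", 1101 is "implies"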

Is there another explanation?

And since the set is already redundant (so there is no point in aiming for minimality), would it be acceptable to include the missing connectives, so that we at least make use of all 16 (and have more at our disposal to work with)?

7 Answers

21

[The set {¬,∨,∧,⇒,⇔}] is not exhaustive either, since there are actually 16 possible compound statements (and corresponding logical connectives) to choose from. (Since {¬,∨,∧,⇒,⇔} is already redundant, why not throw in the other 11 connectives, some of which are VERY helpful, like "nand" ⊼, "nor" ⊽ and "exclusive or" ⊻?)

Some of the "16 possible compound statements" are in fact trivial cases (and also the ¬ appears twice). Actually, only five of the sixteen cannot be made with one of the standard five operators. See the following table:

Bits    Name                       Value    As negation
-------------------------------------------------------
0000    Contradiction              False
0001    Conjunction                x ∧ y
0010    Nonimplication                      ¬(x ⇒ y)
0011    Left projection            x
0100    Converse nonimplication             ¬(y ⇒ x)
0101    Right projection           y
0110    Exclusive disjunction               ¬(x ⇔ y)
0111    Inclusive disjunction      x ∨ y
1000    Nondisjunction                      ¬(x ∨ y)
1001    Equivalence                x ⇔ y
1010    Right complementation      ¬y
1011    Converse implication       y ⇒ x
1100    Left complementation       ¬x
1101    Implication                x ⇒ y
1110    Nonconjunction                      ¬(x ∧ y)
1111    Affirmation                True

As you can see, only five (nonimplication, converse nonimplication, exclusive disjunction, nondisjunction, nonconjunction) are not in this 'standard set'. There are, however, books which also introduce ⊻, exclusive disjunction, as a standard operator.
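
If you want to double-check the table mechanically, here is a small Python sketch (assuming, as in the table, that the bit pattern lists the output for (x, y) = (0,0), (0,1), (1,0), (1,1)):

    from itertools import product

    imp = lambda a, b: (not a) or b          # material implication

    table = {
        "0000": lambda x, y: False,          # contradiction
        "0001": lambda x, y: x and y,        # conjunction
        "0010": lambda x, y: not imp(x, y),  # nonimplication
        "0011": lambda x, y: x,              # left projection
        "0100": lambda x, y: not imp(y, x),  # converse nonimplication
        "0101": lambda x, y: y,              # right projection
        "0110": lambda x, y: x != y,         # exclusive disjunction
        "0111": lambda x, y: x or y,         # inclusive disjunction
        "1000": lambda x, y: not (x or y),   # nondisjunction
        "1001": lambda x, y: x == y,         # equivalence
        "1010": lambda x, y: not y,          # right complementation
        "1011": lambda x, y: imp(y, x),      # converse implication
        "1100": lambda x, y: not x,          # left complementation
        "1101": lambda x, y: imp(x, y),      # implication
        "1110": lambda x, y: not (x and y),  # nonconjunction
        "1111": lambda x, y: True,           # affirmation
    }

    for bits, f in table.items():
        computed = "".join("1" if f(x, y) else "0"
                           for x, y in product([False, True], repeat=2))
        assert computed == bits, (bits, computed)
    print("all 16 rows check out")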

I'm helping with a computing science course on basic math, and last week someone asked me:

Why do we have a symbol for ⊆ (subset), if we already have ⊂ (proper subset) and = (equality)? "a ⊆ b ≡ a ⊂ b ∨ a = b", so the operator is redundant.

I couldn't come up with a better answer than "Because mathematicians are lazy, and want to write things as short as possible". Clearly, that's jumping to conclusions - but in fact I think it's quite likely there is something true in there. One might ask, "why was × (multiplication) defined?", since in the natural numbers you can simply add: 5 × 3 = 5 + 5 + 5 = 3 + 3 + 3 + 3 + 3. Going further, you can ask, "why were 2 and 3 defined, if you can also write 1+1 and 1+1+1?" At some point, it's really too much work to write everything down, hence more notation was introduced.

Of course, you are allowed to define your own notation. By defining nonimplication, converse nonimplication, exclusive disjunction, nondisjunction and nonconjunction, you have an exhaustive set. Define the ones you need often at the top of your writing.

So, how did we get to this standard set of logical operators? By using them, and finding out which ones we need often. Also note that the five connectives missing from the standard set can all be formed by negating one of the standard operators (see the fourth column in the table above), and that this would not be possible if you left any of the 'standard' five out.

4
  • 3
    Most of this is excellent but "laziness" is a terrible explanation. Attributing the use of ⊆ to laziness suggests that it would actually be better to use the minimal possible number of symbols in a text, which just isn't true. We use ⊆ because it matches the underlying concept, which is that of "subset". "Proper subset" is a special case, most easily defined in terms of "subset". Also, ⊆ is overwhelmingly more common than ⊂, so it makes sense to have and use a symbol for that concept. If you forced me never to write one of those symbols again, I'd drop ⊂ and start writing "A⊆B ∧ A≠B". Commented Sep 8, 2015 at 8:24
  • 1
    @DavidRicherby well, maybe laziness is not the right term. But obviously we gain clarity in some cases by using a dedicated symbol. Clearly my answer as I quoted it above is jumping to conclusions, that's why it should be related to the paragraph below; "there is something true in there". I will make that more clear, thanks.
    – user2953
    Commented Sep 8, 2015 at 8:49
  • 1
    That's a good answer but the example of multiplication is unfortunate. How do you write "a × b" with addition when a and b are variables? Multiplication is a concept with many implications, not a notation we use because we're lazy. Commented Sep 12, 2015 at 19:22
  • @quen_tin "b + b + ... + b (a times)". But yes, every analogy has its limits.
    – user2953
    Commented Sep 12, 2015 at 20:42
10

The set {¬,∨,∧,⇒,⇔} is usually used because it includes the "most natural" ones.

If we start with a minimal set, like {¬,⇒}, the other ones are usually introduced as abbreviations.

There is no "deep" reason : mainly tradition, and a "reasonable" trade-off between savings (minimality) and readibility (to express p ∧ q as ¬(p ⇒ ¬q) is not so "natural".

4

gnasher729 raised an important point that deserves some expansion: "In formal logic, implication x ⇒ y and equivalence x ⇔ y are very obviously useful - they directly express the possibly most important concepts of formal logic."

The main point that I want to bring up is this: naive set theory is not the only important set theory, and classical logic is not the only important logic. In most logics, it isn't true that P⇒Q is the same thing as ¬P∨Q. As you get more into logic, you'll gradually realise that implication is (in a very deep sense) "more fundamental" than even conjunction or disjunction.

But I can give you a taste of that now. You don't need the other connectives if you have quantification over propositions and implication. We will use the usual convention that implication associates to the right (that is, A⇒B⇒C means A⇒(B⇒C); you can reason about this by noting that A⇒B⇒C is equivalent to (A∧B)⇒C). Then you can define them in terms of Gentzen rules:

  • P∧Q is equivalent to ∀S.(P⇒Q⇒S)⇒S
  • P∨Q is equivalent to ∀S.(P⇒S)⇒(Q⇒S)⇒S
  • ¬P is equivalent to ∀S.P⇒S

You may recognise that last one as ex falso quodlibet.
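
In classical two-valued semantics you can check these definitions mechanically, since "for all propositions S" just means "for S in {False, True}". A Python sketch (the function names are mine):

    from itertools import product

    def imp(a, b):                 # material implication a ⇒ b
        return (not a) or b

    def conj(p, q):                # ∀S. (P ⇒ Q ⇒ S) ⇒ S
        return all(imp(imp(p, imp(q, s)), s) for s in (False, True))

    def disj(p, q):                # ∀S. (P ⇒ S) ⇒ (Q ⇒ S) ⇒ S
        return all(imp(imp(p, s), imp(imp(q, s), s)) for s in (False, True))

    def neg(p):                    # ∀S. P ⇒ S
        return all(imp(p, s) for s in (False, True))

    for p, q in product([False, True], repeat=2):
        assert conj(p, q) == (p and q)
        assert disj(p, q) == (p or q)
        assert neg(p) == (not p)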

Quantification over propositions seems like something "new", but actually it isn't. In fact, when you said that "P⇒Q could be written ¬P∨Q", you're actually making a claim that some statement is true for all propositions P and Q, which is universal quantification over propositions. We tend to hide that detail from undergraduates so that they can get familiar with propositional logic without having to think about quantification.

Final thought: The "deep sense" which I alluded to earlier is that logical entailment is a morphism in category theory, and implication is an exponential object. I don't expect you to understand that sentence, but an example may help.

Consider modus ponens: if we have P⇒Q and we have P, then we can conclude Q. Similarly, in set theory, if we have a function f : A → B and a value x : A, then we can apply the function to the value, giving f(x) : B.
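
The parallel can be written down literally in any typed language; here is a sketch using Python type hints (the names are mine):

    from typing import Callable, TypeVar

    P = TypeVar("P")
    Q = TypeVar("Q")

    def modus_ponens(p_implies_q: Callable[[P], Q], p: P) -> Q:
        # From P ⇒ Q and P, conclude Q: under the types-as-propositions
        # reading, this is nothing but function application.
        return p_implies_q(p)

    # e.g. len : str -> int plays the role of "str implies int":
    assert modus_ponens(len, "hello") == 5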

This, it turns out, is not just a trick of notation, or a coincidence. It's a connection which goes very deep.

2

Outside of very foundational logical contexts, there's really not much to gain by starting from a minimal adequate set of connectives and defining the rest; all it does is unduly complicate things. Even in logic, we could start with the Sheffer stroke instead of {~, ->}, but we prefer not to unduly confuse our undergraduates.
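
For the curious, a sketch in Python of how ~ and -> fall out of the Sheffer stroke alone; the other standard connectives follow the same way:

    def nand(p, q):                # the Sheffer stroke  p | q
        return not (p and q)

    def neg(p):                    # ~p      =  p | p
        return nand(p, p)

    def imp(p, q):                 # p -> q  =  p | (q | q),  i.e. ~(p & ~q)
        return nand(p, nand(q, q))

    for p in (False, True):
        for q in (False, True):
            assert neg(p) == (not p)
            assert imp(p, q) == ((not p) or q)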

2

There are two possible logical functions with no inputs, "true" and "false". We don't use symbols for them, just names.

There are four possible logical functions with one input: constant "true" and constant "false" (whatever the input is, the output is "true", respectively "false"), identity (output = input), and negation (output = opposite of input). Three of these don't require a symbol. For the fourth we use the symbol ¬. That's one of your five symbols, and we'd really want a symbol for that function.

There are 16 possible logical functions with two inputs. Six of those don't actually "use" both inputs; if we call the inputs x and y then these six functions are "false", "true", x, y, ¬x and ¬y. So there are 10 functions left.

In formal logic, implication x ⇒ y and equivalence x ⇔ y are very obviously useful - they directly express the possibly most important concepts of formal logic. In other areas involved with logic they are much less important, but in formal logic they are extremely important. Having a symbol for "a implies b", we don't really need one for "a is implied by b"; we can just write "b implies a". (You might use a right-to-left arrow; some would say this isn't actually a different symbol.) If we add the negations (no extra symbol for "a is not equivalent to b" and "a doesn't imply b"), then another six functions are covered, and four are left.

For the last four, several reasonable ways to represent them with symbols are possible. We have a symbol for logical and ("both") and logical or ("at least one", or "one or both"), and together with their negations everything is covered. We could have used "none" and "at most one" instead, but logical and and logical or have won the competition.

That's formal logic; other areas using logic have other demands. In decision making (often used in software development), logical and, logical or, and negation are most widely used; implication and equivalence are rarely used. In computer hardware, nand (not (a and b)) and nor (not (a or b)) are quite naturally implemented by the simplest computer hardware, and everything is built from these two functions. However, this is an area that isn't meant for human consumption, unlike formal logic.
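
To illustrate the hardware point, here is the classic construction of exclusive-or from four nand gates (a Python sketch of the gate network, not real HDL):

    def nand(a, b):
        return not (a and b)

    def xor(a, b):
        # the standard four-gate XOR-from-NAND network
        n1 = nand(a, b)
        return nand(nand(a, n1), nand(b, n1))

    for a in (False, True):
        for b in (False, True):
            assert xor(a, b) == (a != b)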

2
  • 1
    Actually, we do have symbols for true and false: ⊤ and ⊥, respectively.
    – user2953
    Commented Sep 7, 2015 at 21:23
  • But we don't need them. True and False do just fine. These symbols do nothing but put up an artificial barrier to understanding.
    – gnasher729
    Commented Sep 8, 2015 at 20:34
2

Allow me to give you a completely different reason why introducing ∨, ∧, ⇒ separately may, at least, have its merits.

Classical logic is nice and all, but some people actually do care about intuitionistic logic. In intuitionistic logic you cannot define any of the connectives ¬,∨,∧,⇒ in terms of any two others. This is because many familiar laws, like (one of) the de Morgan laws or ¬A∨B ⇔ (A ⇒ B), fail to hold. There is actually a type of semantics for intuitionistic (propositional) logic given by bicartesian closed categories. From this standpoint, this "issue" is completely natural:

In these bicartesian closed categories we have binary products of propositions given by "∧", binary coproducts given by "∨", and exponentials given by "⇒". You might say that I'm still missing ¬. Bicartesian closed categories have an initial object; it corresponds to the symbol ⊥ for "falsehood" or "contradiction". The negation of a proposition A is then just A ⇒ ⊥ (this works in classical logic too). There is no reason to believe (and we know it's not possible) that these constructions can be reduced to each other (it doesn't work in familiar categories, like the category of sets and functions, either). So, if you care about category theory or intuitionistic logic, you would naturally introduce ⊥, ∧, ∨ and ⇒ separately (actually I'm missing the terminal object ⊤, which denotes "truth", but it's not really important here).
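
The definition ¬A := A ⇒ ⊥ can even be made concrete in a typed programming language, by modelling ⊥ as a type with no values. A sketch in Python's type-hint notation (the names are mine):

    from typing import Callable, NoReturn, TypeVar

    A = TypeVar("A")

    # Model "falsehood" ⊥ as NoReturn, a type with no values: there is
    # no proof of ⊥. A proof of ¬A is then a function sending any proof
    # of A to a proof of ⊥ -- it can only exist if A has no proofs.
    NegA = Callable[[A], NoReturn]

    def contradiction(p: A, not_p: Callable[[A], NoReturn]) -> NoReturn:
        # from A and ¬A, derive ⊥ (modus ponens once more)
        return not_p(p)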

Given this, it also doesn't make sense to define "all possible binary connectives" as there are no truth values in intuitionistic logic and so there are infinitely many possible connectives (and not just 16). However, the connectives ∧, ∨ and ⇒ are, you could say, special, since each one satisfies a rather basic universal mapping property (this can partly be seen by the familiar "elimination" and "introduction" rules in "Natural Deduction").

2
  • Hi Stefan, thanks for your answer. The question, however, is why we use only those 5 symbols in our language and stop there - why not define all 16? Commented Sep 8, 2015 at 15:10
  • @EthanAlvaree I extended my answer, don't know whether it helps though. The details are a little bit fuzzy, I'm no expert after all. Commented Sep 8, 2015 at 17:17
-2

Mathematicians are lazy... So lazy that some use computers... But computers need to be simple to be implemented and miniaturized electronically... Then, like CISC vs. RISC, only negation, conjunction (and) and disjunction (inclusive or), or better nand (not-and) and nor (not-or), are required to express any logical function (see Karnaugh maps, the Church–Turing thesis, ASIC programming).

Then,

use high-level presentations with many operators to make things look short and apparently simple...

use a reduced set of operators to run things easily and quickly.

There are plenty of 'compilers' to convert between the two...

1
  • 3
    I don't think it's likely that formal logic has its origins in processor architecture.
    – user2953
    Commented Sep 7, 2015 at 11:56
