
It's well-known that sets are "isomorphic" to logic: if we treat $\varphi(A_1, A_2)$ as a shorthand for $\forall x: \varphi(x \in A_1, x \in A_2)$ then $A \land B \equiv A \cap B$ and $A \rightarrow B \equiv A \subseteq B$ and so on.

I've noticed that a large number of true logical statements become events with probability 1 when interpreted probabilistically. For example, if $A \subseteq B$ ($\equiv A \rightarrow B$) then $\mathbb{P}(B|A) = 1$. If you squint hard enough you should see modus ponens there.
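Spelled out (and assuming $\mathbb{P}(A) > 0$ so the conditional probability is defined):
$$\mathbb{P}(B \mid A) = \frac{\mathbb{P}(A \cap B)}{\mathbb{P}(A)} = \frac{\mathbb{P}(A)}{\mathbb{P}(A)} = 1.$$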

To connect a Boolean algebra with a Boolean ring you set $x \lor y := x + y - xy$, and wouldn't you know it, $\mathbb{P}(A \cup B) = \mathbb{P}(A) + \mathbb{P}(B) - \mathbb{P}(A \cap B)$. That connection can't just (ahem) be a random event, can it? ;-)
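The mechanism behind this (a standard observation, spelled out here for concreteness) is that indicator functions turn the set operations into ordinary real arithmetic, and expectation is linear:
$$\mathbf{1}_{A \cup B} = \mathbf{1}_A + \mathbf{1}_B - \mathbf{1}_{A \cap B}, \qquad \mathbb{P}(A) = \mathbb{E}[\mathbf{1}_A],$$
so taking expectations of the first identity yields the union rule verbatim.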

If we combine some propositional calculus and/or a Boolean algebra with measure/probability theory, can we get some theorems for free? Is it e.g. the case that if $\varphi$ is some tautology then the set-theoretic interpretation of $\varphi$ always has probability 1? Is there something stronger that's also true?

I also notice that $\mathbf{0}$ and $\mathbf{1}$, by which I mean the empty set and the set of all outcomes, are independent of all other events, and that I run into problems with Huntington's equation when I set $\lnot x := 1 - x$ and try to make a Boolean algebra over $[0, 1] \subseteq \mathbb{R}$, particularly to do with higher-order terms.

What are the theorems I'm grasping at but not quite seeing?

  • @MaximilianJanisch IOW, there are non-empty null sets? Sure. I don't follow how we could have $ω ∈ Ω$ and $ω ∈ A$ and $A \subseteq B$ but not $ω ∈ B$? Or am I misunderstanding you? If your warning still stands, maybe we have $\mathbb{P}(\text{tautology}) = 1$ but not $\mathbb{P}(φ) = 1 \implies φ \text{ is a tautology}$ (or something like it)? Commented Mar 1, 2019 at 22:36
  • Of course (by definition) $(\omega \in A)\land (A\subset B) \implies \omega \in B$. I was more concerned (as you mentioned) about $\mathbb{P}(X) = 1 \nRightarrow X \text{ is a tautology}$ Commented Mar 1, 2019 at 22:39
  • I guess the counterexample writes itself, or else you did :P — let $\mathbb{P}(A) = 1$ with $B := Ω \setminus A ≠ ∅$; then $A$ is refuted by every member of $B$. But can I e.g. plug-and-chug the Russell-Bernays axioms and get endless theorems for free? (en.wikipedia.org/wiki/…) Commented Mar 1, 2019 at 22:50
  • Two comments: the logical formula $A\to B$, as $\lnot A\lor B$, corresponds to the set $A^\complement\cup B$. However, for the model- or proof-theoretic consequence, the statement $A\models B$ (or $A\vdash B$) indeed corresponds to $A\subseteq B$. Second, the addition in Boolean rings is totally different from that of real numbers; e.g. $x-y=x+y$ holds. In my opinion, it's hard to find a link between your two equations. – Berci Commented Mar 2, 2019 at 1:58

3 Answers


Every $\sigma$-algebra over some $\Omega$ is a Boolean algebra with $\Omega = 1$. The 1-element of every Boolean algebra is unique. Thus, if $φ$ is provable using e.g. the sound and complete Russell-Bernays axioms with the associated deduction rule, and we uniformly substitute members of our $\sigma$-algebra for the variables in $φ$ and replace disjunction/negation with union/complement, the result must be the unique 1-element of our $\sigma$-algebra, i.e. $\Omega$. But $P(\Omega) = 1$ for every probability measure $P$ by definition, so every tautology has probability 1 (for this value of tautology).
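As a sanity check, here is a small script (my own illustration; the finite outcome space and the particular tautology are made up) that interprets a classical tautology over every pair of events in a finite $\sigma$-algebra and confirms the result is always $\Omega$:

```python
# A small sanity check (illustrative): interpret a classical tautology
# over every pair of events in a finite sigma-algebra and confirm the
# result is always Omega, hence has probability 1.
from itertools import combinations

Omega = frozenset(range(6))  # outcomes of a die roll, say

def neg(a):       # logical negation    -> set complement in Omega
    return Omega - a

def lor(a, b):    # logical disjunction -> set union
    return a | b

def powerset(s):  # the full sigma-algebra on a finite Omega
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def prob(a):      # the uniform probability measure on Omega
    return len(a) / len(Omega)

# phi(A, B) = (A -> B) or (B -> A), a classical tautology,
# with X -> Y rewritten as (not X) or Y.
def phi(a, b):
    return lor(lor(neg(a), b), lor(neg(b), a))

events = powerset(Omega)
assert all(phi(a, b) == Omega and prob(phi(a, b)) == 1.0
           for a in events for b in events)
print("every substitution instance evaluates to Omega, probability 1")
```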

It's easy to show that every $\sigma$-algebra is a Boolean algebra using the Huntington axiomatization (the "fourth set" on page 7 of Huntington's paper).

I assume for now that my failure to make a (Boolean) ring homomorphism out of a probability measure is because it doesn't work in general. I'm not sure what to make of the similarity between the union rule for probability and disjunction in Boolean rings. Maybe "$\mathbb{P}$ preserves some of the structure, but at the expense of other parts".
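One concrete obstruction (a sketch, not a full proof): in the Boolean ring of a $\sigma$-algebra, addition is symmetric difference, and
$$\mathbb{P}(A \mathbin{\triangle} B) = \mathbb{P}(A) + \mathbb{P}(B) - 2\,\mathbb{P}(A \cap B),$$
so $\mathbb{P}$ is only additive with respect to ring-addition when $\mathbb{P}(A \cap B) = 0$. Worse, the codomain would itself have to be a Boolean ring, i.e. satisfy $x + x = 0$, which $[0,1] \subseteq \mathbb{R}$ with ordinary addition does not.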

  • I've recently become aware of Cox's theorem (en.wikipedia.org/wiki/Cox%27s_theorem), which I interpret as saying this: if you want to reason with uncertainty in a way that is consistent with the rules of classical Aristotelian logic, your reasoning must obey the laws of probability. (Or rather, consistent with a natural extension of logic, and a few other reasonable criteria.) — Cox's theorem might also be what I was after. Commented Jan 26, 2020 at 15:07

"To connect a Boolean algebra with a Boolean ring you set x∨y:=x+y−xy, and wouldn't you know it, P(A∪B)=P(A)+P(B)−P(A∩B). That connection can't just (ahem) be a random event, can it?"

One can maintain that it kind of is. Setting $x \lor y := \max(x, y)$ would be simpler, but $\mathbb{P}(A \cup B)$ does not equal $\max(\mathbb{P}(A), \mathbb{P}(B))$ in general. This dovetails with possibility theory, which uses possibility measures (for which the union rule is exactly the max rule) instead of probability measures.
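For a concrete failure (an illustrative example): let $A$ and $B$ be independent events with $\mathbb{P}(A) = \mathbb{P}(B) = \tfrac{1}{2}$. Then
$$\mathbb{P}(A \cup B) = \tfrac{1}{2} + \tfrac{1}{2} - \tfrac{1}{4} = \tfrac{3}{4} \neq \tfrac{1}{2} = \max(\mathbb{P}(A), \mathbb{P}(B)).$$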

"If we combine some propositional calculus and/or a Boolean algebra with measure/probability theory, can we get some theorems for free?"

Plenty of propositional calculi cannot be combined with a Boolean algebra while maintaining consistency. The propositional calculus has to be a classical one for that to work, and there exist plenty of non-classical propositional calculi. So it can't be an arbitrary propositional calculus that you select for this sort of thing.

"Is it e.g. the case that if φ is some tautology then the set-theoretic interpretation of φ always has probability 1?"

Are you asking about Cantorian/classical set theory? Because there exist non-Cantorian/non-classical set theories as well.


I see that you found out about Cox's theorem. Have you also discovered Jaynes's "Probability Theory: The Logic of Science"? Inspired by Cox, Keynes and others, Jaynes goes straight from logic to probability without going through set theory à la Kolmogorov, but he arrives at the same abstract theory. For Jaynes, probability is the rational plausibility of a proposition based on all available information.

Just as expectation $E(\cdot)$ is linear, you can consider probability $P(\cdot)$ as linear and go from a logical identity about propositions $A$ and $B$ to a property of probability.

For example, start with this logical identity $$\bar{A} = 1 - A$$ and get to this property of probability $$P(\bar{A}) = 1 - P(A)$$

Start with $$A \cup B = A + B - AB$$ and get $$P(A \cup B) = P(A) + P(B) - P(AB)$$

Start with \begin{align*} A \cup B \cup C &= 1 - \bar{A}\bar{B}\bar{C}\\ &= 1 - (1-A)(1-B)(1-C)\\ &= A + B + C - AB - AC - BC + ABC \end{align*} and get inclusion/exclusion: $P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(AB) - P(AC) - P(BC) + P(ABC)$.
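These identities are easy to machine-check over the truth values $\{0, 1\}$; here is a small script (an illustration, not from the original answer) that does exactly that:

```python
# A quick machine check (illustrative) that the indicator identities
# above hold when A, B, C range over the truth values 0 and 1.
from itertools import product

for A, B, C in product((0, 1), repeat=3):
    assert 1 - A == (not A)                  # complement rule
    assert A + B - A * B == (A or B)         # two-event union
    inclusion_exclusion = (A + B + C
                           - A * B - A * C - B * C
                           + A * B * C)
    assert 1 - (1 - A) * (1 - B) * (1 - C) == inclusion_exclusion == (A or B or C)
print("identities hold for all 0/1 assignments")
```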

In order to conform to standard textbook notation for these properties of probability, I use $\cup$ instead of $\lor$. As you already pointed out, they mean the same thing.

  • What does “$A\cup B=A+B-AB$” mean? What is ‘$+$’ here? – MJD Commented Apr 14, 2023 at 4:40
  • This is ordinary algebra, not Boolean algebra; "+" means arithmetic "+". See my other post math.stackexchange.com/q/4678586/1140344. $A \cup B$ means $A \lor B$ means A or B. – Mkanders Commented Apr 14, 2023 at 5:59
  • I think I understand now: $A$ and $B$ are numbers between 0 and 1, and you are defining new arithmetic operations on them which you write as “$A\cup B$” etc. The thing you call a "logical identity" is actually a definition. Is that right? – MJD Commented Apr 14, 2023 at 11:11
  • But if that is correct then your $P$ function isn't merely “linear”, it's the identity function $P(x)=x$, which leaves me wondering what the point of the exercise was. – MJD Commented Apr 14, 2023 at 11:14
  • $A$ and $B$ are logical propositions that can take on the values 0 or 1. $P(A)$ and $P(B)$ are probabilities that can take on a value between 0 and 1, such as 0.33. This is similar to Blitzstein's "Fundamental Bridge" between expected value and probability, in which the expected value of an indicator variable is the probability of the "event". But, like Jaynes, Keynes, and Boole, my $A$, $B$, $C$, ... represent propositions that are either true or false, not events that either happen or don't. The abstract structure of probability theory is the same either way. – Mkanders Commented Apr 14, 2023 at 16:09
