Suppose we have a Boolean algebra over $n$ bits, meaning that, when viewed as a vector space over $F_2$ (with symmetric difference as addition and meet as scalar-compatible multiplication), it has dimension $n$.
Then we can define $n$ unary predicates $P_1, \dots, P_n$, where $P_i$ returns the value of the $i$th bit of the corresponding bitvector.
In other words, for an arbitrary element $v$ of the algebra, $P_i(v) = 1$ if the $i$th bit of $v$ is $1$, and $P_i(v) = 0$ if it is $0$. Here we take $0$ and $1$ to be the minimal and maximal elements of the Boolean algebra, respectively.
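As a concrete sketch, the finite case can be modelled in Python with integers as bitmasks (the width `N` and the 0-indexing of bits are modelling choices here, not part of the question):

```python
# Model the n-bit Boolean algebra F_2^n as integers of width N.
# Meet is bitwise AND, join is bitwise OR, and the complement of v is TOP ^ v.
N = 4                # dimension over F_2 (an arbitrary small example size)
TOP = (1 << N) - 1   # maximal element (all ones); 0 is the minimal element

def P(i, v):
    """The predicate P_i: return the i-th bit of v (0-indexed)."""
    return (v >> i) & 1
```

For example, `P(0, 0b1011)` is `1` while `P(2, 0b1011)` is `0`.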
It is clear that, given a Boolean algebra and a set of unary predicates defined from it in this fashion, we get a quantifier-free monadic predicate logic.
Question: Does the converse hold? In other words, given a quantifier-free monadic predicate logic, is it always equivalent to some Boolean algebra equipped with predicates in the above fashion, or easily derivable from the construction I've described?
The tricky thing here is that, defined as above, the predicates distribute over $\land$, $\lor$, and $\lnot$. In other words, we have:
- $P_i(a \land b) = P_i(a) \land P_i(b)$
- $P_i(a \lor b) = P_i(a) \lor P_i(b)$
- $P_i(\lnot a) = \lnot P_i(a)$
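These three laws can be checked exhaustively for a small instance. Here is a quick sanity check in Python, again modelling elements as integer bitmasks (the width `N = 3` is an arbitrary choice):

```python
N = 3
TOP = (1 << N) - 1   # maximal element; the complement of a is TOP ^ a

def P(i, v):
    """The predicate P_i: the i-th bit of v (0-indexed)."""
    return (v >> i) & 1

# Check that every bit predicate distributes over meet, join, and complement.
for i in range(N):
    for a in range(TOP + 1):
        assert P(i, TOP ^ a) == 1 - P(i, a)          # P_i(not a) = not P_i(a)
        for b in range(TOP + 1):
            assert P(i, a & b) == P(i, a) & P(i, b)  # meet
            assert P(i, a | b) == P(i, a) | P(i, b)  # join
```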
Arbitrary unary predicates need not obey these laws, but they do hold for the bit predicates defined above. However, it is relatively easy to define "compound predicates" from these primitive ones which exhibit more complex behavior, e.g. the predicate $P_x(a) = (\lnot P_1(a) \lor P_2(a)) \land (P_1(a) \lor \lnot P_2(a))$.
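For instance, a compound predicate like the $P_x$ above (an XNOR of two bit predicates; treating the subscripts as 0-indexed bit positions is a modelling choice) already fails the distribution laws:

```python
def P(i, v):
    """The predicate P_i: the i-th bit of v (0-indexed)."""
    return (v >> i) & 1

def P_x(a):
    # (not P_1(a) or P_2(a)) and (P_1(a) or not P_2(a)):
    # true exactly when bits 1 and 2 of a agree (XNOR).
    p1, p2 = P(1, a), P(2, a)
    return ((1 - p1) | p2) & (p1 | (1 - p2))

# P_x does not distribute over meet: with a = 0b010 and b = 0b100,
# P_x(a) = P_x(b) = 0 (the two bits disagree), yet P_x(a & b) = P_x(0) = 1.
a, b = 0b010, 0b100
assert P_x(a) == 0 and P_x(b) == 0
assert P_x(a & b) == 1
```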