
Let $(A,<)$ be a nonempty linearly ordered set, and let $\operatorname{Seq}(A)$ denote the set of finite sequences of elements of $A$. That is, $f\in\operatorname{Seq}(A)$ is a function $f:n\to A$, where $n=\{0,1,\ldots,n-1\}$, $n\in\mathbb{N}$.

The shortlex (also called length-lexicographic) ordering $\prec$ on $\operatorname{Seq}(A)$ is like the lexicographic ordering, except that sequences are compared by length first: every shorter sequence precedes every longer one, and sequences of equal length are compared lexicographically. That is, let $f=\langle f_{0},\ldots,f_{m-1}\rangle$ and $g=\langle g_{0},\ldots,g_{n-1}\rangle$. Then,

$f\prec g$ if and only if either $m<n$, or $m=n$ and there exists $k<n$ such that $f_{i}=g_{i}$ for all $i<k$ and $f_{k}<g_{k}$.
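
To make the ordering concrete, here is a minimal Python sketch of the comparison just defined (the function name `shortlex_less` and the choice of tuples for finite sequences are illustrative, not part of the problem):

```python
def shortlex_less(f, g):
    """Return True iff the finite sequence f strictly precedes g in the shortlex order.

    f and g are tuples over a linearly ordered set; Python's built-in < on the
    entries plays the role of the ordering < on A.
    """
    if len(f) != len(g):
        # Shorter sequences precede all longer ones.
        return len(f) < len(g)
    for fk, gk in zip(f, g):
        if fk != gk:
            # Equal lengths: the first position where they differ decides,
            # exactly as in the lexicographic ordering.
            return fk < gk
    return False  # f == g, so f does not strictly precede g


# For A = N with the usual order: <> before <1>, <1> before <0,0>, <0,0> before <0,1>.
assert shortlex_less((), (1,))
assert shortlex_less((1,), (0, 0))
assert shortlex_less((0, 0), (0, 1))
```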

I can show that $(\operatorname{Seq}(A),\prec)$ is a linearly ordered set.

If $(A,<)$ is well-ordered, I am trying to show that $(\operatorname{Seq}(A),\prec)$ is also well-ordered. But I cannot complete my argument, which proceeds as follows:

Let $X\subseteq\operatorname{Seq}(A)$ be nonempty. Then, it is reasonable to try to find the least element of $X$ among the shortest sequences it contains. The set $$\{n\in\mathbb{N}:\text{there exists $f\in X$ having length $n$}\}$$ is nonempty, and so it has a least element, say $l\in\mathbb{N}$.

Let $X_{l}$ denote the set of all sequences in $X$ having length $l$ (which is nonempty by the choice of $l$). It is clear from the definition of $\prec$ that the least element of $X_{l}$ (if it exists) will be the least element of $X$.

Define a sequence $s$ recursively as follows:

  1. $s_{0}=\text{least $a\in A$ such that there exists $f\in X_{l}$ with $f_{0}=a$}$. This is well-defined, since we know the set $B_{0}=\{a\in A:\text{there exists $f\in X_{l}$ with $f_{0}=a$}\}$ is a nonempty subset of $A$ (since $X_{l}$ is nonempty), so it has a least element.
  2. $s_{i+1}=\text{least $a\in A$ such that there exists $f\in X_{l}$ with $f_{i+1}=a$ and $f_{j}=s_{j}$ for all $j\leq i$}$ (if such $a$ exists)

The sequence $s$ thus obtained is undefined for $i\geq l$ (every $f\in X_{l}$ has length $l$, so there is no candidate value for $s_{l}$), so it is a finite sequence $\langle s_{0},\ldots,s_{l-1}\rangle$ of length $l$.
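
When $X$ is finite the construction can be carried out literally; the following Python sketch (the names `least_shortlex` and `candidates` are mine, and finiteness of $X$ is an extra assumption made only so the minima can be computed) mirrors the definition of $s$: restrict to the shortest length $l$, then pick each coordinate as small as possible among the sequences that still agree with the choices made so far.

```python
def least_shortlex(X):
    """Shortlex-least element of a finite nonempty set X of tuples, found as above:
    restrict to the least length l attained in X, then choose s_0, s_1, ...
    greedily, each time keeping only the sequences that still agree with s."""
    l = min(len(f) for f in X)                      # least length attained in X
    candidates = {f for f in X if len(f) == l}      # this is X_l
    s = []
    for i in range(l):
        s_i = min(f[i] for f in candidates)         # least element of the set B_i
        candidates = {f for f in candidates if f[i] == s_i}
        s.append(s_i)
    return tuple(s)


X = {(2, 0, 1), (1, 3), (1, 2, 0), (1, 4), (3, 3, 3, 3, 3)}
assert least_shortlex(X) == (1, 3)
# Sanity check against the shortlex order directly (compare by length, then lexicographically).
assert least_shortlex(X) == min(X, key=lambda f: (len(f), f))
```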

Question 1: How do we know $s\in X_{l}$? The set $B_{0}$ is nonempty, so there exists $f\in X_{l}$ with $f_{0}=s_{0}$. This seems like the base case of a proof by induction (on $l$) that there will be an $f\in X_{l}$ with $f_{i}=s_{i}$ for all $i<l$. Am I heading in the right direction?

Question 2: What version of the Recursion Theorem fits the pattern of the recursive definition of $s$ I gave? The two versions of the Recursion Theorem I am aware of are:

First version:

For any set $A$, any $a\in A$ and any function $g:A\times\mathbb{N}\to A$, there exists a unique infinite sequence $f:\mathbb{N}\to A$ such that

  1. $f_{0}=a$;
  2. $f_{i+1}=g(f_{i},i)$ for all $i\in\mathbb{N}$.

Second version:

For any set $A$ and any function $g:\operatorname{Seq}(A)\to A$ there exists a unique sequence $f:\mathbb{N}\to A$ such that $$f_{n}=g(\langle f_{0},\ldots,f_{n-1}\rangle)\;\text{ for all $n\in\mathbb{N}$}$$ where $f_{0}=g(\langle\rangle)=g(\emptyset)$.
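
Both versions have the same computational shape; here is a rough Python illustration (the helper names `recurse_v1`, `recurse_v2` and the cutoff `n_max` are my own, and a program can of course only produce an initial segment of the infinite sequence $f$):

```python
def recurse_v1(a, g, n_max):
    """First version: f_0 = a and f_{i+1} = g(f_i, i).
    Returns <f_0, ..., f_{n_max-1}> (assumes n_max >= 1)."""
    f = [a]
    for i in range(n_max - 1):
        f.append(g(f[i], i))
    return f


def recurse_v2(g, n_max):
    """Second version: f_n = g(<f_0, ..., f_{n-1}>).  Returns <f_0, ..., f_{n_max-1}>."""
    f = []
    for _ in range(n_max):
        f.append(g(tuple(f)))   # g sees the whole initial segment, not just the last term
    return f


# f_{i+1} = f_i * (i+1) with f_0 = 1 gives the factorials.
assert recurse_v1(1, lambda x, i: x * (i + 1), 6) == [1, 1, 2, 6, 24, 120]
# f_n = sum of all earlier terms, with f_0 = g(<>) = 1: the sequence 1, 1, 2, 4, 8, ...
assert recurse_v2(lambda t: sum(t) if t else 1, 5) == [1, 1, 2, 4, 8]
```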

This is related.


2 Answers


I have decided to post a more elaborate version of the edits I made to my question trying to answer it.

Regarding Question 1: Consider the property $\mathbf{P}(r)$: "there exists $f\in X_{l}$ such that $f_{j}=s_{j}$ for all $j\leq r$".

  1. $\mathbf{P}(0)$ holds because $B_{0}$ is nonempty.
  2. Suppose $r+1<l$ and $\mathbf{P}(r)$ holds. We have to show $\mathbf{P}(r+1)$ holds. Indeed, the induction hypothesis tells us there is $f\in X_{l}$ with $$f=\langle s_{0},\ldots,s_{r},a_{r+1},\ldots,a_{l-1}\rangle$$ for some $a_{r+1},\ldots,a_{l-1}\in A$. Therefore, the set $$B_{r+1}=\{a\in A:\text{there exists $f\in X_{l}$ with $f_{r+1}=a$ and $f_{j}=s_{j}$ for all $j\leq r$}\}$$ is nonempty, so it has a least element $s_{r+1}$. Therefore, there is $f\in X_{l}$ with $f_{j}=s_{j}$ for all $j\leq r+1$. So $\mathbf{P}(r+1)$ holds.

The Finite Induction Principle now asserts that $\mathbf{P}(r)$ holds for all $r<l$. In particular $\mathbf{P}(l-1)$ holds: there is $f\in X_{l}$ with $f_{j}=s_{j}$ for all $j\leq l-1$, and since $f$ has length $l$ this means $f=s$. (If $l=0$, then $X_{l}=\{\langle\rangle\}$ and $s=\langle\rangle\in X_{l}$ trivially.)

Is my proof answering question 1 correct? Is there a better way to show that $s\in X_{l}$?


Regarding Question 2: As I mention in the comments to Darsen's answer, the recursive definition of $s$ is justified only if we can show that such a function exists and that no two distinct such functions exist. We can do so by induction on $l$, as Darsen's answer and comments seem to indicate, or we can have at hand a more general Recursion Theorem which does it for us. A modification of the second version of the Recursion Theorem I stated in the question does this:

For any set $A$ and any function $g$ on a subset of $\operatorname{Seq}(A)$ into $A$, there exists a unique sequence $f$ of elements of $A$ such that

  1. $f_{n}=g(\langle f_{0},\ldots,f_{n-1}\rangle)$ for all $n\in\mathbb{N}$ such that $n\in\operatorname{dom} f$; where $f_{0}=g(\langle\rangle)=g(\emptyset)$.
  2. $f$ is either an infinite sequence (i.e. with domain $\mathbb{N}$) or $f$ is a finite sequence of length $p$ and $g(\langle f_{0},\ldots,f_{p-1}\rangle)$ is undefined.

Proof: Let $\overline{A}=A\cup\{\overline{a}\}$, where $\overline{a}\not\in A$. Define $\overline{g}:\operatorname{Seq}(\overline{A})\to\overline{A}$ by $$\overline{g}(\langle t_{0},\ldots,t_{n-1}\rangle)=\begin{cases} g(\langle t_{0},\ldots,t_{n-1}\rangle)& \text{if defined}\\ \overline{a}& \text{otherwise}. \end{cases}$$ Then we use the second version of the Recursion Theorem I stated in the question to obtain the corresponding infinite sequence $\overline{f}:\mathbb{N}\to\overline{A}$. If $\overline{f}_{l}=\overline{a}$ for some $l\in\mathbb{N}$, we take $f=\overline{f}\upharpoonright l$ for the least such $l$; otherwise $\overline{f}$ never takes the value $\overline{a}$ and we take $f=\overline{f}$.
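
The same trick can be mimicked computationally; in the Python sketch below (all names are mine, and `None` stands in for the added element $\overline{a}$), `g` signals "undefined" by returning `None`, and the recursion is cut off at the first such place:

```python
def recurse_partial(g, n_max):
    """Recursion with a partial g: f_n = g(<f_0, ..., f_{n-1}>) as long as g is defined.

    Here g returns None where it is undefined; None plays the role of the added
    element that the totalized function returns.  The result is the totalized
    recursion cut off at the first None, or the first n_max terms if g never
    becomes undefined (a program cannot return the whole infinite sequence)."""
    f = []
    for _ in range(n_max):
        value = g(tuple(f))
        if value is None:        # g(<f_0,...,f_{p-1}>) undefined: f has length p
            break
        f.append(value)
    return tuple(f)


# Example: rebuilding the sequence s from the question for a concrete X_l.
X_l = {(1, 3), (1, 4), (2, 0)}

def g(t):
    # B is the set called B_{len(t)} in the question: candidate next values.
    B = {f[len(t)] for f in X_l if len(f) > len(t) and f[:len(t)] == t}
    return min(B) if B else None   # least candidate; undefined (None) if there is none

assert recurse_partial(g, 10) == (1, 3)
```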

The definition of the sequence $s$ in the original question fits this version of the Recursion Theorem. The definition of the sequence $\langle y_{i}\rangle$ in the first comment to Darsen's answer (used to show that a subset of a finite set is finite) also fits the pattern of this theorem.


Your proof of Question 1 is correct, but you can just use $B_{r+1}$ when defining $s$ instead.

The thing is you don't need the Recursion Theorem. You're trying to define $s$ on $l$, not on $\mathbb N$, so you can do it step by step, since $l$ is finite. Then, if you define $s$ using $B_0$ and $B_{r+1}$ to justify each term, it is clear that $s\in X_l$.

  • Yes, but there is a version of recursion underlying the process. Much like when we prove that if $X$ is a finite set, $X=\{x_{0},\ldots,x_{n-1}\}$ and $Y\subseteq X$, then $Y$ is finite: let $k_{0}=\text{least $k<n$ such that $x_{k}\in Y$}$ and $k_{i+1}=\text{least $k>k_{i}$, $k<n$ such that $x_{k}\in Y$}$ (if such $k$ exists). Set $y_{i}=x_{k_{i}}$. This recursive definition fits the pattern of a modified version of the first Recursion Theorem I stated above. The recursive definition of $s$ in the question seems to fit a modified version of the second version of the Recursion Theorem above. (A small sketch of this index-picking construction appears after these comments.)
    – John
    Commented Jun 26 at 20:34
  • The recursive definition I stated in the previous comment fits the pattern of the following version of the Recursion Theorem: For any set $A$, any function $g$ on a subset of $A\times\mathbb{N}$ into $A$, and any $a\in A$, there is a unique sequence $f$ of elements of $A$ such that: $f_{0}=a$; $f_{n+1}=g(f_{n},n)$ for all $n\in\mathbb{N}$ with $n+1\in\operatorname{dom} f$; $f$ is either infinite (i.e. with domain $\mathbb{N}$) or $f$ is finite of length $p+1$ with $g(f_{p},p)$ undefined.
    – John
    Commented Jun 26 at 20:50
  • You do construct $s$ recursively, but you don't need the full power of a recursion theorem to do it, since you want to define $s$ on $l$, so the process is finite. If you weren't able to construct $s$ and define $s_k$ for all $k<l$, then taking the least such $k$ would give a contradiction (this is basically induction on $l$).
    – Darsen
    Commented Jun 27 at 23:20
  • I see. But $s$ is defined implicitly on $l$ as satisfying some required conditions, and such a definition is justified only if we can show that such a function exists and that no two distinct such functions exist. We can do so by induction on $l$ (I think this is the approach you are pointing to), or we can have at hand a more general Recursion Theorem which does it for us. The same two options apply as well to the example of proving that a subset of a finite set is finite (first two comments above). Are my perceptions of the matters at hand correct?
    – John
    Commented Jun 28 at 10:15
  • Without a Recursion Theorem we would have to show that $s$ is a well-defined unique sequence for all $l\in\mathbb{N}$, for example, by induction on $l$. Then we would have to show $s\in X_{l}$ for an arbitrary $l$, by finite induction less than $l$, for example. But with a Recursion Theorem at hand, we know that $s$ is already well-defined and unique. We only need to show the part $s\in X_{l}$.
    – John
    Commented Jun 28 at 10:23
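
As a rough illustration of the construction from the first comment above (a sketch with my own names; `finite_subset_indices` and the list-based enumeration are not from the discussion), the indices $k_{0}<k_{1}<\cdots$ can be extracted as follows:

```python
def finite_subset_indices(x, Y):
    """Given a finite enumeration x = [x_0, ..., x_{n-1}] of X and a subset Y of X,
    produce k_0 < k_1 < ... as in the comment: k_0 is the least k with x_k in Y,
    and k_{i+1} is the least k > k_i with x_k in Y (if such k exists).
    The sequence y_i = x_{k_i} then enumerates Y, witnessing that Y is finite."""
    ks = []
    previous = -1                    # so the first step searches over all k < n
    while True:
        later = [k for k in range(previous + 1, len(x)) if x[k] in Y]
        if not later:                # no k > k_i with x_k in Y: the sequence stops here
            break
        previous = later[0]          # the least such k
        ks.append(previous)
    return ks


x = ['a', 'b', 'c', 'b', 'd']
Y = {'b', 'd'}
assert finite_subset_indices(x, Y) == [1, 3, 4]
assert [x[k] for k in finite_subset_indices(x, Y)] == ['b', 'b', 'd']
```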

