
I am not a mathematician but an engineer, so I can read some of the basics of the language proofs are written in. Second, I am bad at probability and at infinity, and my question covers both. So I would like to apologize for not using mathematical language here:

Suppose you have an urn containing infinitely many balls, with as many black balls as white ones. Could such a set be defined without any logical conflicts?

Now draw a random number of balls, anywhere between 0 and infinity. Put the balls in order as you draw them and read white as a 0 and black as a 1. The ordered colours then represent a binary number.

If I repeat this process an infinite number of times, will it produce the set of natural numbers? Could this be proved?

  • I'm not giving an answer, but there's something that should be clarified: what does it mean for the urn to have as many white balls as black, given that there isn't a finite number of them? The rigorous way of defining that is to say there is a bijection between the set of white balls and the set of black balls. You can read about it under case 1 of the "comparing sets" section of this Wikipedia page.
    – Git Gud
    Commented Mar 29, 2013 at 11:04
  • Perhaps you can avoid the complications of infinity by assuming you are tossing a fair coin as many times as you want. Would this be different from the outcome you're trying to simulate?
    – Macavity
    Commented Mar 29, 2013 at 11:15
  • @Macavity: An excellent suggestion! Commented Mar 29, 2013 at 14:58
  • I don't know what you mean by "creating" the set of natural numbers, but you can extract a subset identical in structure to the natural numbers (as given by the Peano Axioms) from any set $s$ on which an injective function $f$ is defined, and in which at least one element $x_0$ has no pre-image under $f$. The $f$ serves as a successor function, and the $x_0$ as the first element. Commented Mar 31, 2013 at 2:21
  • Sorry for not expressing this more clearly. I had something in mind like Bertrand Russell did: take nothing and create a set from nothing. How many elements does this set have? 0. So put 0 into your set. How many elements do we have now? 1, and so on. My idea was to do this from randomness, which could be represented by white noise, for example. So my urn (or the coin) would amount to looking at the voltage level of the noise at random times and creating the set $\Bbb N$ from that. Commented Apr 2, 2013 at 5:39

2 Answers


Firstly, let me address your first question. Consider the following experiment:

There is an urn with one black and one white ball. We draw a ball, write down $0$ if it's white, $1$ if it's black, then put the ball back.

Repeating this experiment is clearly equivalent to taking balls from the urn in your experiment. This hopefully settles your concerns; otherwise, the concept of a multiset may interest you (see Wikipedia; it can be generalised to infinitely many copies of the same element).
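If a concrete picture helps, here is a minimal Python sketch of a single draw (the function name and the 0/1 encoding are just my illustrative choices, not part of the argument):

```python
import random

def draw_bit() -> int:
    """One draw from the urn, with replacement: 0 for a white ball,
    1 for a black ball. With equally many balls of each colour this is
    nothing more than a fair coin flip."""
    return random.randint(0, 1)
```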


The next concern is a means of deciding how many balls we're going to take from the urn (or, how many times the experiment above is carried out). We can't simply say "We'll take any natural number, all with equal probability": if each number were selected with the same probability $a > 0$, the total probability wouldn't be $1$ but $\sum\limits_{n \in \Bbb N} a = +\infty$, and if $a = 0$, the total would be $0$ rather than $1$.

Therefore a different probability distribution has to be used, e.g. a geometric distribution (cf. Wikipedia).

For reasonable choices of distribution, there will be a nonzero chance of generating any given number: each number $n$ has a binary expansion of some length $k$, so if $k$ is selected by our distribution with positive probability, we can obtain $n$ by a suitable selection of black and white balls.
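To make the procedure concrete, here is a rough Python sketch (the parameter $p = \frac12$ for the geometric length and the function names are arbitrary choices of mine, not part of the argument):

```python
import random

def sample_length(p: float = 0.5) -> int:
    """Geometric number of balls to draw: k = 1, 2, 3, ... occurs with
    probability (1 - p)**(k - 1) * p, so every length is possible."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

def draw_number(p: float = 0.5) -> int:
    """Draw sample_length(p) balls and read the colours
    (0 = white, 1 = black) as a binary numeral."""
    k = sample_length(p)
    bits = "".join(str(random.randint(0, 1)) for _ in range(k))
    return int(bits, 2)
```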

Presuming that by "repeating an infinite number of times" you mean the limit of drawing finitely many times (otherwise it's rather hard to make mathematical sense of the statement), we will in general end up with the notion of almost surely, meaning that something occurs with probability $1$ (which isn't the same as saying that it necessarily occurs; see again Wikipedia).

The only result I could presently come up with (assuming a reasonable distribution in the sense above) is:

Let $n$ be a natural number. Then $n$ will be selected almost surely.

This is, however, not equivalent to saying that we will almost surely obtain all of $\Bbb N$. An analogy: if we were to generate an infinite bit-string, with $0$ and $1$ equiprobable (both occurring with $p = \frac12$), then we would almost surely not end up with $000\ldots$, or for that matter with any other specific bit-string. But we would end up with some bit-string.

A more rigorous development of these ideas leads into the realms of measure theory. I hope this addresses most of your questions.


Addendum: As Henning Makholm pointed out in a comment, we actually have:

All of $\Bbb N$ will be generated almost surely.

This is because "All of $\Bbb N$ is generated" is the complement of the event:

Some $n$ is not generated.

whose probability is bounded above (by countable subadditivity, i.e. the union bound) by $\sum\limits_{n \in \Bbb N} \mathsf{Pr}(\text{$n$ is not generated}) = \sum\limits_{n \in \Bbb N} 0 = 0$.
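For what it's worth, a quick simulation makes this plausible (the cut-off at the numbers $0,\dots,19$ and the draw counts are arbitrary; the experiment is the same as in the sketch above, repeated here so the snippet runs on its own): the coverage creeps up towards 100% as the number of draws grows.

```python
import random

def draw_number(p: float = 0.5) -> int:
    """Same experiment as in the earlier sketch: geometric length,
    then random bits read as a binary numeral."""
    k = 1
    while random.random() >= p:
        k += 1
    return int("".join(str(random.randint(0, 1)) for _ in range(k)), 2)

seen = set()
total = 0
for target in (100, 1_000, 10_000, 100_000):
    while total < target:
        seen.add(draw_number())
        total += 1
    covered = len(seen & set(range(20)))
    print(f"after {target} draws: {covered}/20 of the numbers 0..19 generated")
```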

  • Thank you for pointing out to me that the numbers won't be evenly distributed. That is equivalent to what I learned below: no uniform distribution can be defined on a countably infinite set. It's hard to imagine that 0, 00, 000, 00000, ... all lead to 0; thus 0 seems clearly more probable if the sequence is finite, while if it is infinite the opposite seems true. But at least I now have a feeling for the numerous wrong assumptions I made. Commented Mar 29, 2013 at 12:26
  • Actually, all of $\mathbb N$ will be produced almost surely. In order not to produce all of $\mathbb N$ we would need a sequence where either 0 is never produced, or 1 is never produced, or 2 is never produced, ..., and the probability of this alternative happening is at most $\sum_{n\in\mathbb N} P(n\text{ is never produced})$. But that is a countable sum of zeroes, which is zero. Commented Mar 29, 2013 at 22:28
  • @HenningMakholm I strongly suspected that to be the case, but as I didn't see the argument at the time, I was reluctant to state it. Thanks; I've amended my answer to include this observation.
    – Lord_Farin
    Commented Mar 29, 2013 at 22:49
  • I am now a little bit uncertain. I have two statements: "Let $n$ be a natural number. Then $n$ will be selected almost surely." and "All of $\Bbb N$ will be generated almost surely." So I could say: my set $S$ is almost surely equal to $\Bbb N$. I've learned that equality is something mathematicians are very precise about, so could I also say $S = \Bbb N$? Sorry again for asking such basic questions. Commented Apr 1, 2013 at 12:02
  • Since these basic questions commonly lead to misconceptions, I'm happy to answer them. For this one, it really depends on the context: for practical purposes it likely won't be harmful, but formally it should be avoided. Many "nice" properties of equality carry over to $S = \Bbb N \text{ a.s.}$ ("$S$ equals $\Bbb N$ almost surely"), but not all. IMHO it's best to stay on the safe side here; an analogous statement would be $\Bbb R = \Bbb R \setminus \Bbb Q$, which you hopefully disagree with.
    – Lord_Farin
    Commented Apr 1, 2013 at 16:38

Unfortunately the process goes wrong in a couple of places.

The first place it goes wrong is when you say 'draw randomly'. From your knowledge of the uniform distribution on finite sets (i.e. you pick any one ball with the same probability as any other), it should seem intuitively obvious what this means in this context... but unfortunately it breaks down: you can't define a uniform probability distribution on a countably infinite set. If you could, and any given ball were drawn with probability $p$, then you'd need $$p+p+p+\cdots = 1,$$ but if $p=0$ then $p+p+p+\cdots = 0$, and if $p>0$ then $p+p+p+\cdots = \infty$, so this can't happen.

The second place it goes wrong is that an infinite binary string usually won't represent a natural number: for example, the infinite sequence of $1$s, $11111\dots$, isn't a natural number.
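A throwaway Python check makes the point, if you like: the finite truncations $1, 11, 111, \ldots$ have values $2^n - 1$, which grow without bound instead of settling on any natural number, so there is nothing left for the infinite string to denote.

```python
# Values of the finite truncations 1, 11, 111, ...: each equals 2**n - 1,
# and they diverge rather than converge to any natural number.
for n in (1, 2, 5, 10, 20, 50):
    print(n, int("1" * n, 2))
```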

  • Thank you for pointing out to me that this process will create non-natural numbers. Oh, infinity is such an ugly thing. Further, I'll keep in mind for the future that a uniform distribution can't be defined on a countably infinite set. Commented Mar 29, 2013 at 12:19
  • My intuition still clings to your answer, but the argument above seems to be strong. I've learned that in many cases the summed probability can go above one, and that in those cases you have to look at the complementary probability $1-p$. Is it possible to just add the probabilities? I draw between 1 and infinitely many balls, infinitely many times, so this gives me some kind of area, surface, plane or field (pick whichever is appropriate). This field is clearly countable, but does it cover the whole of $\Bbb N$? Commented Apr 1, 2013 at 12:16
  • After thinking about 1111... I concluded that it is a natural number. Let $x = 111\ldots111$. Then $10 \cdot x = 111\ldots1110$. Subtracting the two leaves you with $1000\ldots0000$ (one less; I don't know any notation for this). Now this is clearly a power of two and thus a natural number. Commented Apr 2, 2013 at 5:32
  • @JohannesFrank: That's false. A finite string of digits $d_1d_2 \dots d_n$ denotes $$d_1 \cdot 2^{n-1} + d_2 \cdot 2^{n-2} + \cdots + d_{n-1} \cdot 2 + d_n$$ But here you don't have a finite string and this interpretation of the notation doesn't make sense (because you have a 'starting point' and a 'stopping point' with infinitely many things in-between... if you really had an infinite string then there'd be no stopping point). But even supposing the notation made sense, you'd be left with $2^{\infty}$. Commented Apr 2, 2013 at 8:56
  • Regarding your previous post, you need to make precise what you're saying. If you have a countably infinite anything, then you can't put a uniform distribution on it. The 'area' of $\mathbb{Q}^2$ is zero, for example. Commented Apr 2, 2013 at 8:58

