I've always assumed, on faulty intuition, that if an event has a 1-in-n chance of occurring, it is very likely to happen at least once over n trials. After some analysis, however, it turns out not to be all that likely, and the probability seems to converge to a particular value as n grows. That value is about 0.63212.
Is this correct? If so, is there a name for this value, and is it considered significant in probability theory?
Below is the Python code that I used to arrive at this value.
>>> def p(x, r):
...     # One more trial: either it has already happened (x),
...     # or it happens now (r * (1 - x))
...     return x + r * (1.0 - x)
...
>>> def p_of_1(r):
...     # Yield the running probability of at least one occurrence, trial by trial
...     x = r
...     while True:
...         yield x
...         x = p(x, r)
...
>>> def p_of_n(n):
...     # Probabilities after 1..n trials of an event with a 1-in-n chance
...     g = p_of_1(1.0 / n)
...     return [next(g) for _ in range(n)]
...
>>> p_of_n(1)
[1.0]
>>> p_of_n(2)
[0.5, 0.75]
>>> p_of_n(3)
[0.3333333333333333, 0.5555555555555556, 0.7037037037037037]
>>> p_of_n(4)
[0.25, 0.4375, 0.578125, 0.68359375]
>>> p_of_n(5)
[0.2, 0.36000000000000004, 0.488, 0.5904, 0.67232]
>>> p_of_n(6)[-1]
0.6651020233196159
>>> p_of_n(10)[-1]
0.6513215599000001
>>> p_of_n(100)[-1]
0.6339676587267709
>>> p_of_n(10000)[-1]
0.6321389535670703
>>> p_of_n(10000000)[-1]
0.6321205772225762
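For what it's worth, the iteration can be collapsed: each step multiplies the failure probability by (1 - r), so p_of_n(n)[-1] should equal the closed form 1 - (1 - 1/n)^n. A quick sanity check (closed is just a throwaway helper name of mine):

>>> def closed(n):
...     # Closed form: the failure probability after n trials is (1 - 1/n)**n,
...     # so the success probability is its complement
...     return 1.0 - (1.0 - 1.0 / n) ** n
...
>>> abs(closed(10000) - p_of_n(10000)[-1]) < 1e-9
True

So the question reduces to whether the limit of 1 - (1 - 1/n)^n as n grows has a name.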