Start with an integer like $n = 100$ and replace it with a uniformly random integer in $[0, n]$ inclusive. Keep cutting it this way until $n = 0$. What is the expected number of cuts needed?
My intuition (halving on average each step) suggests an expected value of $\log_2 100 \approx 6.64$, but an empirical simulation in Python:
```python
import random

cuts = 0
trials = 100000
for i in range(trials):
    startingValue = 100
    while startingValue > 0:
        # randint is inclusive on both ends, so 0 is reachable
        startingValue = random.randint(0, startingValue)
        cuts += 1
expectedValue = cuts / trials
print(expectedValue)
```
results in $\approx 6.18$.
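As a sanity check on the simulation, the exact expectation can be computed numerically. A sketch, assuming the conditioning identity $E(n) = 1 + \frac{1}{n+1}\sum_{k=0}^{n} E(k)$ with $E(0) = 0$ (one cut, then the process restarts from the uniformly chosen value); solving that equation for $E(n)$ gives $n\,E(n) = (n+1) + \sum_{k=0}^{n-1} E(k)$:

```python
# Exact expected number of cuts via the recurrence
#   n * E[n] = (n + 1) + sum_{k=0}^{n-1} E[k],  E[0] = 0
E = [0.0]
running_sum = 0.0  # sum of E[0..n-1]
for n in range(1, 101):
    e = (n + 1 + running_sum) / n
    E.append(e)
    running_sum += e
print(E[100])  # close to the simulated 6.18
```

This agrees with the simulated value to within sampling noise, which suggests the simulation is right and my $\log_2 n$ intuition is wrong.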
Does there exist an explicit solution for $n = 100$, or more generally for an arbitrary integer $n$?