I have a list (let's call it $ \{L_N\} $) of N random numbers $R\in(0,1)$ (drawn from a uniform distribution). Next, I draw another random number from the same distribution (call this number "b"). Now I find the element of the list $ \{L_N\} $ that is closest to "b" and record this distance.
If I repeat this process, I can plot the distribution of distances that are obtained through this process.
When $N\to \infty$, what does this distribution approach?
When I simulate this in Mathematica, the distribution appears to approach an exponential function. And if the list were 1 element long, then I believe this would exactly follow an exponential distribution.
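As a sanity check on the N = 1 case: the distance $|b - L_1|$ between two independent uniform draws has the triangular density $2(1-d)$ on $(0,1)$, with mean $1/3$, which is easy to verify numerically (a quick sketch; the sample count is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200_000

# Two independent U(0,1) draws per trial: the single list element and b.
L1 = rng.random(n_trials)
b = rng.random(n_trials)
d = np.abs(b - L1)

# Triangular density 2(1 - d) predicts E|b - L1| = 1/3
print(d.mean())
```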
Looking at the Wikipedia article on the exponential distribution, I can see there is some discussion of the topic:
But I'm having trouble interpreting what they are saying. What is "k" here? Is my case what they are describing, in the limit where $n\to \infty$?
EDIT: After a very helpful intuitive answer by Bayequentist, I understand now that the behavior as $N \to \infty$ should approach a Dirac delta function. But I'd still like to understand why my data (which is like the minimum of a bunch of exponential distributions) also appears to be exponential. And is there a way to figure out what this distribution is exactly (for large but finite N)?
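For the unsigned distance, the finite-N form can be worked out directly if boundary effects near 0 and 1 are ignored: the nearest list element is farther than $d$ from $b$ exactly when none of the N points lands in $(b-d, b+d)$, so $P(D > d) \approx (1-2d)^N$, with density $\approx 2N(1-2d)^{N-1}$. For $d$ of order $1/N$ this tends to $2N e^{-2Nd}$, i.e. an Exponential distribution with rate $2N$ — consistent with a histogram that looks exponential even though the limit concentrates at zero. A sketch comparing simulation against this prediction (N and the trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000          # list length; assumed large enough that boundary effects are small
n_trials = 20_000

dists = np.empty(n_trials)
for t in range(n_trials):
    L = rng.random(N)                # the list {L_N}
    b = rng.random()                 # the extra draw
    dists[t] = np.abs(L - b).min()   # unsigned distance to the nearest element

# Exponential(rate 2N) predicts mean 1/(2N)
print(dists.mean(), 1 / (2 * N))
```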
Here is a picture of what such a distribution looks like for large but finite N:
EDIT2: Here's some python code to simulate these distributions:
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt

numpoints = 10000
NBINS = 1000

t1 = np.random.random_sample(numpoints)  # the "b" draws
t2 = np.random.random_sample(numpoints)  # the list {L_N}

dtbin = []
for i in range(len(t1)):
    dt = 10000000  # larger than any possible distance on (0, 1)
    for j in range(len(t2)):
        delta = t1[i] - t2[j]
        if abs(delta) < abs(dt):
            dt = delta
    dtbin.append(dt)  # signed distance to the nearest element

plt.figure()
plt.hist(dtbin, bins=NBINS)
plt.show()
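The double loop above is O(numpoints²). The same signed nearest-neighbour differences can be computed much faster by sorting one array and using `np.searchsorted` to find each draw's two neighbouring candidates (a sketch; equivalent up to tie-breaking, and ties have probability zero for continuous draws):

```python
import numpy as np

def signed_nearest_deltas(t1, t2):
    """For each x in t1, return x minus the closest element of t2 (signed)."""
    t2s = np.sort(t2)
    idx = np.searchsorted(t2s, t1)             # insertion positions in sorted t2
    lo = np.clip(idx - 1, 0, len(t2s) - 1)     # candidate neighbour below
    hi = np.clip(idx, 0, len(t2s) - 1)         # candidate neighbour above
    d_lo = t1 - t2s[lo]
    d_hi = t1 - t2s[hi]
    return np.where(np.abs(d_lo) < np.abs(d_hi), d_lo, d_hi)

rng = np.random.default_rng(2)
t1 = rng.random(10_000)
t2 = rng.random(10_000)
dtbin = signed_nearest_deltas(t1, t2)
```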