
The Cramér random model for the primes is a random subset ${{\mathcal P}}$ of the natural numbers with ${1 \not \in {\mathcal P}}, {2 \in {\mathcal P}}$, and the events ${n \in {\mathcal P}}$ for ${n=3,4,\dots}$ being jointly independent with ${{\bf P}(n \in {\mathcal P}) = \frac{1}{\log n}}$.
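For intuition only, here is a small Python sketch of this model (not needed for the problem; the function names, the random seed and the cutoff `N` are my own illustrative choices):

```python
# A minimal simulation of the Cramér random model (illustrative only).
import math
import random

def sample_cramer(N, seed=0):
    """Sample P ∩ {1, ..., N}: 1 is excluded, 2 is included, and each n >= 3
    is kept independently with probability 1/log n."""
    rng = random.Random(seed)
    P = {2}
    for n in range(3, N + 1):
        if rng.random() < 1.0 / math.log(n):
            P.add(n)
    return P

def S(x, P):
    """S_x = number of 3 <= n <= x with both n and x - n in P."""
    return sum(1 for n in range(3, x + 1) if n in P and (x - n) in P)

if __name__ == "__main__":
    P = sample_cramer(10_000)
    # Compare a few values of S_x with the heuristic mean x / log^2 x.
    for x in (1_000, 5_000, 10_000):
        print(x, S(x, P), round(x / math.log(x) ** 2, 1))
```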

(Probabilistic Goldbach conjecture) Show that almost surely, all but finitely many natural numbers ${n}$ are expressible as the sum of two elements of ${{\mathcal P}}$.

Question: Consider the random variable $S_x := \sum_{3 \leq n \leq x} 1_{\mathcal{P}}(n)1_{\mathcal{P}}(x-n)$, which counts the ordered representations of $x$ (taken to be a natural number) as a sum of two Cramér random model primes, i.e. the number of such representations scaled by a factor of two. One can deduce, as in this paper https://arxiv.org/pdf/1508.05702, that ${\bf E}S_x \sim x / \log^2 x$, so it seems natural to show a strong concentration of $S_x$ around its mean (which tends to infinity). However, without using Chernoff bounds, I am not sure how to establish this (I have tried to calculate the $k$th moment ${\bf E}(S_x - {\bf E}S_x)^k$ for $k = 2, 4$, etc.). Or should one use the Borel-Cantelli lemma, showing that ${\bf P}(S_x = 0)$ is summable in $x$?

Edit: An attempted solution by the moment method is posted below; verification is welcome.


2 Answers


For the concentration of $S_x$ around its expected value ${\bf E}[S_x] \sim \frac{x}{\log^2 x}$ in the Cramér random model for primes, the simplest route is Chebyshev's inequality, once one has an estimate for the variance of $S_x$. Alternatively, if $S_x$ is well approximated by a Poisson distribution with parameter $\lambda_x = \frac{x}{\log^2 x}$, then the Borel-Cantelli lemma can be invoked by analysing ${\bf P}(S_x = 0) \approx e^{-\lambda_x}$: if $\sum_{x=3}^\infty e^{-\lambda_x}$ converges, then almost surely all but finitely many natural numbers are expressible as the sum of two elements of ${\mathcal P}$.
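To make the summability explicit (still under the Poisson heuristic, so this is only a sketch): $\lambda_x = x / \log^2 x \geq 2 \log x$ for all sufficiently large $x$, hence

$\displaystyle e^{-\lambda_x} \leq e^{-2 \log x} = \frac{1}{x^2}$

for such $x$, and $\sum_{x \geq 3} e^{-\lambda_x}$ converges by comparison with $\sum 1/x^2$.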

I hope this gives a strong enough argument for the concentration of the quantity $S_x$.


An attempted solution by the moment method: take the random variable $S_x := \sum_{3 \leq n \leq x} 1_{\mathcal{P}}(n)1_{\mathcal{P}}(x-n)$ from the question, which counts the ordered representations of $x$ (taken to be a natural number) as a sum of two Cramér random model primes, i.e. the number of such representations scaled by a factor of two. Restricting the sum to $3 \leq n \leq x-3$ (so that both summands exceed $2$) changes $S_x$ by at most $1$, and the diagonal term $n = x/2$, for which the two indicators coincide, contributes only $O(1/\log x)$ to the mean and is harmless throughout. By the joint independence of the events $n \in {\mathcal P}$:

$\displaystyle {\bf E}S_x = \sum_{3 \leq n \leq x-3}\frac{1}{\log n \log (x-n)} \sim 2 \int_{3}^{x/2} \frac{dt}{\log t \log (x-t)} =: 2I_x$.

Moreover, since $\displaystyle \frac{2}{\log (x-3)} \int_{3}^{x/2} \frac{dt}{\log t} \leq 2I_x \leq \frac{2}{\log x - \log 2} \int_{3}^{x/2} \frac{dt}{\log t}$,

one may deduce from $\displaystyle \int_{3}^x \frac{dt}{\log t} \sim x / \log x$ that $\displaystyle {\bf E}S_x \sim x / \log^2 x$.
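As a quick numerical sanity check (not part of the proof; the test values of $x$ are arbitrary), one can compare the sum defining ${\bf E}S_x$ with $x / \log^2 x$; the ratio approaches $1$ quite slowly, with a relative error roughly of size $1/\log x$:

```python
# Compare E[S_x] = sum_{3 <= n <= x-3} 1/(log n * log(x - n)) with x / log^2 x.
import math

def expected_Sx(x):
    return sum(1.0 / (math.log(n) * math.log(x - n)) for n in range(3, x - 2))

for x in (10**4, 10**5, 10**6):
    print(x, round(expected_Sx(x), 1), round(x / math.log(x) ** 2, 1))
```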

We normalise each $X_n = 1_{\mathcal P}(n)1_{\mathcal P}(x - n)$ to have mean zero by replacing it with $Y_n := \displaystyle 1_{\mathcal P}(n)1_{\mathcal P}(x - n) - \frac{1}{\log n \log (x-n)}$, so that $S_x$ gets replaced by $\displaystyle S_x' = \sum_{3 \leq n \leq x-3} Y_n$. The $Y_n$ with $3 \leq n \leq x/2$ are jointly independent, have mean zero and are of size $O(1)$, while $Y_{x-n} = Y_n$; one can then calculate the fourth moment ${\bf E}(\sum_{3 \leq n \leq x-3} Y_n)^4 = O(x^2)$ (a short expansion of this bound is sketched at the end of this answer). By Markov's inequality applied to $(S_x')^4$, this implies, for every $\varepsilon > 0$,

$\displaystyle {\bf P}\left(\left|\frac{S_x'}{x / \log^2 x}\right| > \varepsilon\right) \leq \frac{O(x^2)\,\log^8 x}{\varepsilon^4 x^4} = O_\varepsilon\!\left(\frac{\log^8 x}{x^2}\right)$,

which is summable in $x$ by the comparison test. Applying the Borel-Cantelli lemma for each $\varepsilon = 1/k$, $k \geq 1$, and intersecting the resulting almost sure events, $\displaystyle \frac{S_x'}{x / \log^2 x}$ thus converges almost surely to zero. Since $S_x' = S_x - {\bf E}S_x$ and ${\bf E}S_x \sim x / \log^2 x$, this is equivalent to

$\displaystyle {\bf P}\Big(\lim_{x \to \infty} \frac{S_x}{x / \log^2 x} = 1\Big) = 1$, i.e. $S_x \stackrel{\text{a.s.}}{\sim} x / \log^2 x$. In particular, almost surely $S_x > 0$ for all but finitely many $x$, and the desired claim follows.
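For completeness, here is the expansion behind the fourth moment bound claimed above (only a sketch of the standard counting argument, using the observation that $Y_{x-n} = Y_n$ and ignoring the negligible diagonal term): expanding,

$\displaystyle {\bf E}(S_x')^4 = \sum_{3 \leq n_1, n_2, n_3, n_4 \leq x-3} {\bf E}[Y_{n_1} Y_{n_2} Y_{n_3} Y_{n_4}]$,

and a summand vanishes unless each index $n_i$ is equal or complementary (i.e. summing to $x$) to some other index $n_j$, since otherwise $Y_{n_i}$ is independent of the product of the remaining three factors and has mean zero. There are only $O(x^2)$ such quadruples, and each summand is $O(1)$ since $|Y_n| \leq 1$, whence ${\bf E}(S_x')^4 = O(x^2)$.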
