
Consider the series $$\sum_{n=1}^\infty n e^{-n \varepsilon}$$ For $\varepsilon \leq 0$, it diverges. For $\varepsilon > 0$, it converges and equals $$\frac{e^\varepsilon}{(e^\varepsilon - 1)^2}$$ which has the asymptotic expansion $$\frac{1}{\varepsilon^2} - \frac{1}{12} + \frac{\varepsilon^2}{240} - \frac{\varepsilon^4}{6048} + O(\varepsilon^6)$$ which has a constant term of $-\frac{1}{12}$.
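These identities are easy to test numerically. The sketch below (the truncation point and the test value of $\varepsilon$ are arbitrary choices) compares the partial sums, the closed form, and the truncated expansion:

```python
import math

def partial_sum(eps, terms=2000):
    """Truncated sum_{n>=1} n*e^(-n*eps); the tail is negligible once terms*eps >> 1."""
    return math.fsum(n * math.exp(-n * eps) for n in range(1, terms + 1))

def closed_form(eps):
    """e^eps / (e^eps - 1)^2, the value of the series for eps > 0."""
    return math.exp(eps) / math.expm1(eps) ** 2

def expansion(eps):
    """Truncated asymptotic expansion with constant term -1/12."""
    return 1 / eps**2 - 1 / 12 + eps**2 / 240 - eps**4 / 6048

eps = 0.1
print(partial_sum(eps), closed_form(eps), expansion(eps))
```

For $\varepsilon = 0.1$ the three values already agree to many digits; the next neglected term of the expansion is of order $\varepsilon^6$.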


Let $p_n$ denote the $n$th prime number; by the prime number theorem, $p_n \sim n \log n$. Let \begin{align} f_1(\varepsilon) &= \sum_{n=1}^\infty p_n e^{-n \varepsilon} \\ f_2(\varepsilon) &= \sum_{n=1}^\infty p_n e^{-p_n \varepsilon}. \end{align} Both series converge for all $\varepsilon > 0$. Do they have asymptotic expansions as $\varepsilon \to 0^+$? If so, is it possible to extract their constant terms?
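Both series can be explored numerically by truncation. In the sketch below, the sieve bound and the sample value of $\varepsilon$ are arbitrary choices:

```python
import math

def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i, flag in enumerate(sieve) if flag]

def f1(eps, primes):
    """Truncation of sum_n p_n e^{-n eps}."""
    return math.fsum(p * math.exp(-(k + 1) * eps) for k, p in enumerate(primes))

def f2(eps, primes):
    """Truncation of sum_n p_n e^{-p_n eps}."""
    return math.fsum(p * math.exp(-p * eps) for p in primes)

primes = primes_up_to(10_000)  # ample for eps = 0.01, where e^{-100} is negligible
print(f1(0.01, primes), f2(0.01, primes))
```

Since $p_n > n$, every term of $f_1$ dominates the corresponding term of $f_2$, so $f_1(\varepsilon) > f_2(\varepsilon)$ for every $\varepsilon > 0$.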

[Related question on MO. For more on smoothed sums, see reference 1.]


References:

  1. Terence Tao, "The Euler–Maclaurin formula, Bernoulli numbers, the zeta function, and real-variable analytic continuation", blog post, 10 April 2010.
  • What do you mean by asymptotic expansion? Do you mean $\varepsilon \to 0^+$, or some other limit? And what do you mean by the constant term of an asymptotic expansion? Are you looking for the power series around $0$? (Spoiler: these have no Maclaurin series, since they do not converge for negative $\varepsilon$.)
    – Crostul
    Commented Oct 9, 2022 at 21:21
  • @Crostul Edited.
    – user76284
    Commented Oct 9, 2022 at 21:36
  • Isn't $f_3$ the same as $f_2$? Both can be written as $\sum_{p\text{ prime}} pe^{-p\varepsilon}$, which would be the most natural way to write it (and a more natural object than $f_1$). To your question, $f_2$ is approximately $\sum_{p\le 1/\varepsilon} p$, which will have leading term $1/2\varepsilon^2$ but (even under RH) no better error term than $\varepsilon^{-3/2}$, and the smoothing factor won't really change that. Commented Oct 9, 2022 at 22:05
  • @GregMartin Your first point is right. Edited.
    – user76284
    Commented Oct 9, 2022 at 22:10
  • @GregMartin I got a different asymptotic form; how did you obtain yours?
    – Gary
    Commented Oct 12, 2022 at 2:37

1 Answer


Heuristically, using the prime number theorem, \begin{align*} & f_2 (\varepsilon ) = \sum\limits_{n = 1}^\infty {p_n \,\mathrm{e}^{ - p_n \varepsilon } } = \int_2^{ + \infty } {t\,\mathrm{e}^{ - t\varepsilon } \mathrm{d}\pi (t)} \sim \int_2^{ + \infty } {\frac{{t\,\mathrm{e}^{ - t\varepsilon } }}{{\log (t)}}\mathrm{d}t} \\ & = \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\int_{2\varepsilon} ^{ + \infty } {\frac{{s\,\mathrm{e}^{ - s} }}{{1 + \log (s)/\log (1/\varepsilon )}}\mathrm{d}s} \\ & \sim \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\sum\limits_{n = 0}^\infty {\frac{{( - 1)^n }}{{\log ^n (1/\varepsilon )}}\int_{2\varepsilon} ^{ + \infty } {s\,\mathrm{e}^{ - s} \log ^n (s)\,\mathrm{d}s} } \\ & \sim \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\sum\limits_{n = 0}^\infty {\frac{{( - 1)^n }}{{\log ^n (1/\varepsilon )}}\int_0^{ + \infty } {s\,\mathrm{e}^{ - s} \log ^n (s)\,\mathrm{d}s} } \\ & = \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\left( {1 + \frac{{\gamma - 1}}{{\log (1/\varepsilon )}} + \frac{{\frac{{\pi ^2 }}{6} + \gamma ^2 - 2\gamma }}{{\log ^2 (1/\varepsilon )}} + \frac{{\frac{{\pi ^2 }}{2}(\gamma - 1) + \gamma ^2 (\gamma - 3) + 2\zeta (3)}}{{\log ^3 (1/\varepsilon )}} + \ldots } \right) \end{align*} as $\varepsilon \to 0^+$. The error should be exponentially small compared to any of the terms in the series. Here $\pi(t)$ is the prime counting function, $\gamma$ is the Euler–Mascheroni constant, and $\zeta(3)$ is Apéry's constant. For a numerical example, consider $\varepsilon = 2 \cdot 10^{-6}$. With this value, $f_2(\varepsilon)\approx 1.8512832 \cdot 10^{10}$, whereas the approximation, truncated after the order-$\log ^{-3} (1/\varepsilon )$ term, gives $\approx 1.8524638\cdot 10^{10}$.
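The heuristic can be probed numerically. In the sketch below, the sieve bound and the sample $\varepsilon$ are implementation choices (a larger sieve, say up to $2\cdot10^7$, handles $\varepsilon$ down to about $2\cdot10^{-6}$), and $\gamma$ and $\zeta(3)$ are hardcoded:

```python
import math

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant (hardcoded)
ZETA3 = 1.2020569031595943  # Apery's constant (hardcoded)

def f2_direct(eps, limit):
    """Evaluate sum_p p * e^(-p*eps) over primes p <= limit by sieving."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit**0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, limit + 1, i)))
    return math.fsum(p * math.exp(-p * eps) for p, flag in enumerate(sieve) if flag)

def f2_expansion(eps):
    """The expansion truncated after the log^(-3)(1/eps) term."""
    L = math.log(1 / eps)
    c1 = GAMMA - 1
    c2 = math.pi**2 / 6 + GAMMA**2 - 2 * GAMMA
    c3 = math.pi**2 / 2 * (GAMMA - 1) + GAMMA**2 * (GAMMA - 3) + 2 * ZETA3
    return (1 + c1 / L + c2 / L**2 + c3 / L**3) / (eps**2 * L)

eps = 1e-5  # modest value so the sieve stays small; terms with p*eps > 30 are negligible
print(f2_direct(eps, 3_000_000), f2_expansion(eps))
```

Agreement to within about a percent is all this comparison can promise at such a modest $\varepsilon$: the discrepancy reflects both the neglected terms of order $\log^{-4}(1/\varepsilon)$ and the error in the prime number theorem.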

Note that the coefficients in the asymptotic expansion may be obtained via the exponential generating function $$ \exp\! \bigg( {(\gamma - 1)z + \sum\limits_{n = 2}^\infty {\frac{{\zeta (n) - 1}}{n}z^n } } \bigg) = 1 + \frac{{\gamma - 1}}{{1!}}z + \frac{{\frac{{\pi ^2 }}{6} + \gamma ^2 - 2\gamma }}{{2!}}z^2 + \ldots , $$ with $|z|<2$. This relation may be proved using the observation $ \int_0^{ + \infty }s\,\mathrm{e}^{ - s} \log ^n (s)\,\mathrm{d}s = \Gamma ^{(n)} (2) $. The latter also leads to the simple asymptotic approximation $$ f_2 (\varepsilon ) \sim \int_0^1 {\frac{{\Gamma (1 + s)}}{{\varepsilon ^{1 + s} }}{\rm d}s} $$ as $\varepsilon \to 0^+$.
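The first few coefficients can be checked against the exponential generating function with a short power-series computation (a sketch: $\gamma$ and $\zeta(3)$ are hardcoded, and only coefficients up to $z^3$ are compared):

```python
import math

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant (hardcoded)
ZETA2 = math.pi**2 / 6
ZETA3 = 1.2020569031595943  # Apery's constant (hardcoded)

# Coefficients of A(z) = (gamma - 1) z + sum_{n>=2} (zeta(n) - 1)/n * z^n, up to z^3.
a = [0.0, GAMMA - 1, (ZETA2 - 1) / 2, (ZETA3 - 1) / 3]

# exp(A) via the standard recurrence k*c_k = sum_{j=1}^{k} j*a_j*c_{k-j}.
c = [1.0, 0.0, 0.0, 0.0]
for k in range(1, 4):
    c[k] = sum(j * a[j] * c[k - j] for j in range(1, k + 1)) / k

# Closed-form coefficients Gamma^{(n)}(2)/n! appearing in the expansion of f_2:
b1 = GAMMA - 1
b2 = (math.pi**2 / 6 + GAMMA**2 - 2 * GAMMA) / 2
b3 = (math.pi**2 / 2 * (GAMMA - 1) + GAMMA**2 * (GAMMA - 3) + 2 * ZETA3) / 6
print(c[1] - b1, c[2] - b2, c[3] - b3)  # differences vanish up to floating-point roundoff
```

The same recurrence extends to any order once further zeta values are supplied.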

Addendum. I shall show that the error coming from the first approximating step can be absorbed into any of the error terms of the final asymptotic expansion. We use the prime number theorem in the form $$ \pi (t) = \int_2^t {\frac{{{\rm d}s}}{{\log (s)}}} + R(t),\quad R(t) = \mathcal{O}\!\left( {\frac{t}{{\log ^{N + 2} (t)}}} \right) $$ where $N$ is an arbitrary fixed positive integer. Then \begin{align*} f_2 (\varepsilon ) & = \int_2^{ + \infty } {\frac{{t\,\mathrm{e}^{ - t\varepsilon } }}{{\log (t)}}{\rm d}t} + \int_2^{ + \infty } {t\,\mathrm{e}^{ - t\varepsilon } {\rm d}R(t)} \\ & = \int_2^{ + \infty } {\frac{{t\,\mathrm{e}^{ - t\varepsilon } }}{{\log (t)}}{\rm d}t} + \int_2^{ + \infty } {(\varepsilon t - 1)\,\mathrm{e}^{ - t\varepsilon } R(t)\,{\rm d}t} + \mathcal{O}(1). \end{align*} But \begin{align*} \int_2^{ + \infty } {(\varepsilon t - 1)\,\mathrm{e}^{ - t\varepsilon } R(t){\rm d}t} & = \mathcal{O}(1)\int_2^{ + \infty } {\frac{{t(\varepsilon t - 1)\,\mathrm{e}^{ - t\varepsilon } }}{{\log ^{N + 2} (t)}}{\rm d}t} \\ & = \mathcal{O}(1)\frac{1}{{\varepsilon ^2 \log ^{N + 2} (1/\varepsilon )}}\int_{2\varepsilon}^{ + \infty } {\frac{{s(s - 1)\,\mathrm{e}^{ - s} }}{{(1 + \log (s)/\log (1/\varepsilon ))^{N + 2} }}{\rm d}s} \\ & = \frac{1}{{\varepsilon ^2 \log ^2 (1/\varepsilon )}}\mathcal{O}\!\left( {\frac{1}{{\log ^N (1/\varepsilon )}}} \right) \end{align*} and the claim follows.

  • The constant term is absorbed into the error term of the expansion at any finite order. Perhaps using the Riemann explicit formula for $\pi(x)$ and employing least-term truncation would reveal the constant term, but that seems rather cumbersome. The precise distribution of the primes is not that simple (Riemann zeros), so you cannot expect a simple expansion like in your very first example.
    – Gary
    Commented Oct 11, 2022 at 23:24
  • "The error should be exponentially small compared to any of the terms in the series." This is probably false, but it depends on what you mean both by "exponentially small" (in what quantity) and "should" (provable or not?). Commented Oct 12, 2022 at 4:54
  • @GregMartin I meant compared to $\log(1/ \varepsilon)$, so something like $1/(\varepsilon\log(1/\varepsilon))$ in total compared to $1/(\varepsilon^2 \log(1/\varepsilon))$. The numerical tests are promising. A proper proof using the PNT with remainder should not be too difficult.
    – Gary
    Commented Oct 12, 2022 at 5:13
  • I believe that would require a power-savings error term in the PNT, but that's far from what we can prove at present. Numerical tests will certainly reflect the probable truth of the Riemann Hypothesis, but that's different from what's known. Commented Oct 12, 2022 at 6:34
  • @Rahul You mean $R(t)=\mathcal{O}\!\left(\frac{t}{\mathrm{e}^{c\sqrt{\log(t)}}}\right)$?
    – Gary
    Commented Oct 12, 2022 at 10:27
