
I'm supposed to calculate:

$$\lim_{n\to\infty} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!}$$

By using WolframAlpha, I might guess that the limit is $\frac{1}{2}$, which is a pretty interesting and nice result. I wonder in which ways we may approach it.
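A quick numerical check of this guess (summing each term in log space so that neither $n^k$ nor $e^{-n}$ overflows; the helper name `partial_exp_ratio` is mine):

```python
import math

def partial_exp_ratio(n):
    # e^{-n} * sum_{k=0}^{n} n^k / k!, with each term computed in log space
    # as exp(k*log(n) - log(k!) - n); math.lgamma(k+1) == log(k!).
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(n + 1))

for n in (10, 100, 1000, 10000):
    print(n, partial_exp_ratio(n))
```

The values drift down toward $0.5$ roughly like $\frac12+\frac{c}{\sqrt n}$, which is consistent with the guessed limit.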

9 Answers

The probabilistic way:

This is $P[N_n\leqslant n]$ where $N_n$ is a random variable with Poisson distribution of parameter $n$. Hence each $N_n$ is distributed like $X_1+\cdots+X_n$ where the random variables $(X_k)$ are independent and identically distributed with Poisson distribution of parameter $1$.

By the central limit theorem, $Y_n=\frac1{\sqrt{n}}(X_1+\cdots+X_n-n)$ converges in distribution to a standard normal random variable $Z$, in particular, $P[Y_n\leqslant 0]\to P[Z\leqslant0]$.

Finally, $P[Z\leqslant0]=\frac12$ and $[N_n\leqslant n]=[Y_n\leqslant 0]$ hence $P[N_n\leqslant n]\to\frac12$, QED.
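A minimal simulation of this argument (Poisson$(1)$ samples drawn via Knuth's product method; the sample sizes and seed are arbitrary choices of mine):

```python
import math
import random

def poisson1():
    # Knuth's product method for a Poisson(1) sample.
    L, k, p = math.exp(-1.0), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

random.seed(0)
n, trials = 200, 4000
# Empirical P[X_1 + ... + X_n <= n], i.e. P[Y_n <= 0].
hits = sum(sum(poisson1() for _ in range(n)) <= n for _ in range(trials))
print(hits / trials)
```

For finite $n$ the probability sits slightly above $\frac12$ (the correction is of order $n^{-1/2}$), so the estimate hovers a bit over $0.5$.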


The analytical way, completing your try:

By Taylor's formula with integral remainder, $e^{-n}\sum_{k=0}^{n}\frac{n^k}{k!}=1-I_n$, so it suffices to find $\lim\limits_{n\to\infty}I_n$, where $$ I_n=\frac{e^{-n}}{n!}\int_{0}^n (n-t)^ne^t\,dt.$$

To begin with, let $u(t)=(1-t)e^t$; then $I_n=\dfrac{e^{-n}n^n}{n!}nJ_n$ with $$ J_n=\int_{0}^1 u(t)^n\,\mathrm dt. $$

Now, $u(t)\leqslant\mathrm e^{-t^2/2}$, hence $$ J_n\leqslant\int_0^1\mathrm e^{-nt^2/2}\,\mathrm dt\leqslant\int_0^\infty\mathrm e^{-nt^2/2}\,\mathrm dt=\sqrt{\frac{\pi}{2n}}. $$

Likewise, the function $t\mapsto u(t)\mathrm e^{t^2/2}$ is decreasing on $t\geqslant0$, hence $u(t)\geqslant c_n\mathrm e^{-t^2/2}$ on $t\leqslant1/n^{1/4}$, with $c_n=u(1/n^{1/4})\mathrm e^{-1/(2\sqrt{n})}$, hence $$ J_n\geqslant c_n\int_0^{1/n^{1/4}}\mathrm e^{-nt^2/2}\,\mathrm dt=\frac{c_n}{\sqrt{n}}\int_0^{n^{1/4}}\mathrm e^{-t^2/2}\,\mathrm dt=\frac{c_n}{\sqrt{n}}\sqrt{\frac{\pi}{2}}(1+o(1)). $$

Since $c_n\to1$, all this proves that $\sqrt{n}J_n\to\sqrt{\frac\pi2}$. Stirling's formula shows that the prefactor $\frac{e^{-n}n^n}{n!}$ is equivalent to $\frac1{\sqrt{2\pi n}}$. Regrouping everything, one sees that $I_n\sim\frac1{\sqrt{2\pi n}}\,n\,\sqrt{\frac\pi{2n}}=\frac12$.
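One can check $\sqrt{n}\,J_n\to\sqrt{\pi/2}\approx1.2533$ numerically (a midpoint-rule sketch; $u(t)^n$ is evaluated as $\exp(n(\log(1-t)+t))$ to avoid underflow, and the step count is an arbitrary choice):

```python
import math

def J(n, steps=200_000):
    # Midpoint rule for J_n = ∫_0^1 ((1-t) e^t)^n dt, in log space.
    h = 1.0 / steps
    return h * sum(math.exp(n * (math.log(1.0 - (i + 0.5) * h) + (i + 0.5) * h))
                   for i in range(steps))

n = 10_000
print(math.sqrt(n) * J(n), math.sqrt(math.pi / 2))
```

The two printed values agree up to the expected $O(n^{-1/2})$ correction.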

Moral: The probabilistic way is shorter, easier, more illuminating, and more fun.

Caveat: My advice in these matters is, clearly, horribly biased.

  • One could simply say that for natural $n$ the median of the Poisson($n$) variable is $n$ (which, I believe, is nontrivial); since $P[N_n=n]\to0$, this gives $P[N_n\leqslant n]\to\frac12$ immediately.
    – user35953
    Commented May 18, 2020 at 18:48
  • @Did, you are the reason we don't have a Nobel prize in mathematics: they are too envious)
    – Alex
    Commented Jul 21, 2020 at 10:25

Edited. I justified the application of the dominated convergence theorem.

By a simple calculation,

$$ \begin{align*} e^{-n}\sum_{k=0}^{n} \frac{n^k}{k!} &= \frac{e^{-n}}{n!} \sum_{k=0}^{n}\binom{n}{k} n^k (n-k)! \\ (1) \cdots \quad &= \frac{e^{-n}}{n!} \sum_{k=0}^{n}\binom{n}{k} n^k \int_{0}^{\infty} t^{n-k}e^{-t} \, dt\\ &= \frac{e^{-n}}{n!} \int_{0}^{\infty} (n+t)^{n}e^{-t} \, dt \\ (2) \cdots \quad &= \frac{1}{n!} \int_{n}^{\infty} t^{n}e^{-t} \, dt \\ &= 1 - \frac{1}{n!} \int_{0}^{n} t^{n}e^{-t} \, dt \\ (3) \cdots \quad &= 1 - \frac{\sqrt{n} (n/e)^n}{n!} \int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du. \end{align*}$$

We remark that

  1. In $\text{(1)}$, we utilized the famous formula $ n! = \int_{0}^{\infty} t^n e^{-t} \, dt$.
  2. In $\text{(2)}$, the substitution $t + n \mapsto t$ is used.
  3. In $\text{(3)}$, the substitution $t = n - \sqrt{n}u$ is used.

Then, in view of Stirling's formula, it suffices to show that

$$\int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du \xrightarrow{n\to\infty} \sqrt{\frac{\pi}{2}}.$$

The idea is to introduce the function

$$ g_n (u) = \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \mathbf{1}_{(0, \sqrt{n})}(u) $$

and pass to the pointwise limit of the integrand as $n \to \infty$. This is justified once we find a dominating function for the sequence $(g_n)$. But notice that if $0 < u < \sqrt{n}$, then

$$ \log g_n (u) = n \log \left(1 - \frac{u}{\sqrt{n}} \right) + \sqrt{n} u = -\frac{u^2}{2} - \frac{u^3}{3\sqrt{n}} - \frac{u^4}{4n} - \cdots \leq -\frac{u^2}{2}. $$

From this we have $g_n (u) \leq e^{-u^2 /2}$ for all $n$, and $g_n (u) \to e^{-u^2 / 2}$ as $n \to \infty$. Therefore, by the dominated convergence theorem and the Gaussian integral,

$$ \int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du = \int_{0}^{\infty} g_n (u) \, du \xrightarrow{n\to\infty} \int_{0}^{\infty} e^{-u^2/2} \, du = \sqrt{\frac{\pi}{2}}. $$
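Both the dominating bound and the value of the limit can be checked numerically (a midpoint-rule sketch with log-space evaluation; the parameters are arbitrary choices of mine):

```python
import math

def g_integral(n, steps=100_000):
    # Midpoint rule for ∫_0^{√n} (1 - u/√n)^n e^{√n u} du.
    rn = math.sqrt(n)
    h = rn / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * h
        log_g = n * math.log(1.0 - u / rn) + rn * u
        assert log_g <= -u * u / 2 + 1e-9   # the bound g_n(u) <= e^{-u^2/2}
        total += math.exp(log_g)
    return h * total

print(g_integral(10_000), math.sqrt(math.pi / 2))
```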

  • Your second equation is closely related to this question (which I answered).
    – robjohn
    Commented Jun 19, 2012 at 15:47

Integration by parts yields $$ \frac{1}{k!}\int_x^\infty e^{-t}\,t^k\,\mathrm{d}t=\frac{1}{k!}x^ke^{-x}+\frac{1}{(k-1)!}\int_x^\infty e^{-t}\,t^{k-1}\,\mathrm{d}t\tag{1} $$ Iterating $(1)$ gives $$ \frac{1}{n!}\int_x^\infty e^{-t}\,t^n\,\mathrm{d}t=e^{-x}\sum_{k=0}^n\frac{x^k}{k!}\tag{2} $$ Thus, we get $$ e^{-n}\sum_{k=0}^n\frac{n^k}{k!}=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\tag{3} $$ Now, I will reproduce part of the argument I give here, which develops a full asymptotic expansion. Additionally, I include some error estimates that were previously missing. $$ \begin{align} \int_n^\infty e^{-t}\,t^n\,\mathrm{d}t &=n^{n+1}e^{-n}\int_0^\infty e^{-ns}\,(s+1)^n\,\mathrm{d}s\\ &=n^{n+1}e^{-n}\int_0^\infty e^{-n(s-\log(1+s))}\,\mathrm{d}s\\ &=n^{n+1}e^{-n}\int_0^\infty e^{-nu^2/2}\,s'\,\mathrm{d}u\tag{4} \end{align} $$ where $t=n(s+1)$ and $u^2/2=s-\log(1+s)$.

Note that $\frac{ss'}{1+s}=u$; thus, when $s\ge1$, $s'\le2u$. This leads to the bound $$ \begin{align} \int_{s\ge1} e^{-nu^2/2}\,s'\,\mathrm{d}u &\le\int_{3/4}^\infty e^{-nu^2/2}\,2u\,\mathrm{d}u\\ &=\frac2ne^{-\frac9{32}n}\tag{5} \end{align} $$ Since $2u\ge\frac32\ge1$ on this range, $(5)$ also shows that $$ \int_{s\ge1}e^{-nu^2/2}\,\mathrm{d}u\le\frac2ne^{-\frac9{32}n}\tag{6} $$

For $|s|<1$, we get $$ u^2/2=s-\log(1+s)=s^2/2-s^3/3+s^4/4-\dots\tag{7} $$ We can invert the series to get $s'=1+\frac23u+O(u^2)$. Therefore, $$ \begin{align} \int_0^\infty e^{-nu^2/2}\,s'\,\mathrm{d}u &=\int_{s\in[0,1]} e^{-nu^2/2}\,s'\,\mathrm{d}u+\color{red}{\int_{s>1} e^{-nu^2/2}\,s'\,\mathrm{d}u}\\ &=\int_0^\infty\left(1+\frac23u\right)e^{-nu^2/2}\,\mathrm{d}u-\color{darkorange}{\int_{s>1}\left(1+\frac23u\right)e^{-nu^2/2}\,\mathrm{d}u}\\ &+\int_0^\infty e^{-nu^2/2}\,O(u^2)\,\mathrm{d}u-\color{darkorange}{\int_{s>1} e^{-nu^2/2}\,O(u^2)\,\mathrm{d}u}\\ &+\color{red}{\int_{s>1} e^{-nu^2/2}\,s'\,\mathrm{d}u}\\ &=\sqrt{\frac{\pi}{2n}}+\frac2{3n}+O\left(n^{-3/2}\right)\tag{8} \end{align} $$ The red and orange integrals decrease exponentially by $(5)$ and $(6)$.

Plugging $(8)$ into $(4)$ yields $$ \int_n^\infty e^{-t}\,t^n\,\mathrm{d}t=\left(\sqrt{\frac{\pi n}{2}}+\frac23\right)\,n^ne^{-n}+O(n^{n-1/2}e^{-n})\tag{9} $$ The argument above can be used to prove Stirling's approximation, which says that $$ n!=\sqrt{2\pi n}\,n^ne^{-n}+O(n^{n-1/2}e^{-n})\tag{10} $$ Combining $(9)$ and $(10)$ yields $$ \begin{align} e^{-n}\sum_{k=0}^n\frac{n^k}{k!} &=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\\ &=\frac12+\frac{2/3}{\sqrt{2\pi n}}+O(n^{-1})\tag{11} \end{align} $$
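The refined asymptotics in $(11)$ can be tested directly (log-space summation as before; the tolerance reflects the fact that the error term is actually $O(n^{-3/2})$):

```python
import math

def S(n):
    # e^{-n} * sum_{k=0}^{n} n^k / k!, term by term in log space.
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(n + 1))

n = 2000
approx = 0.5 + (2.0 / 3.0) / math.sqrt(2 * math.pi * n)
print(S(n), approx)
```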

  • Would the downvoter care to comment (he asked, expecting the answer "no")?
    – robjohn
    Commented May 12, 2018 at 14:01
  • Hey rob, sadly the link in your answer is dead :(
    – tired
    Commented Jun 10, 2018 at 14:17
  • Yeah, too many downvoters here :/ I upvoted this. Very nice answer.
    – mick
    Commented Jun 29, 2018 at 21:49
  • The asymptotic approximation in $(11)$ can be improved by extending the approximations to get $$ \frac12+\frac1{\sqrt{2\pi n}}\left(\frac23-\frac{23}{270n}+\frac{23}{3024n^2}+\frac{259}{77760n^3}+O\!\left(\frac1{n^4}\right)\right) $$ and the big-$O$ term is close to $-\frac1{900n^4}$.
    – robjohn
    Commented Mar 17, 2020 at 18:42
$\newcommand{\bracks}[1]{\left\lbrack #1 \right\rbrack} \newcommand{\braces}[1]{\left\lbrace #1 \right\rbrace} \newcommand{\dd}{{\rm d}} \newcommand{\expo}[1]{\,{\rm e}^{#1}\,} \newcommand{\half}{{1 \over 2}} \newcommand{\pars}[1]{\left( #1 \right)} \newcommand{\root}[2][]{\,\sqrt[#1]{\,#2\,}\,}$ \begin{align}&\color{#00f}{ \lim_{n \to \infty}\bracks{\expo{-n}\sum_{k = 0}^{n}{n^{k} \over k!}}} \\[3mm]&=\lim_{n \to \infty}\bracks{\expo{-n}\sum_{k = 0}^{n} \exp\pars{k\ln\pars{n} - \ln\pars{k!}}} \\[3mm]&= \lim_{n \to \infty}\braces{\expo{-n}\sum_{k = 0}^{n} \exp\pars{n\ln\pars{n} - \ln\pars{n!} - {1 \over 2n}\bracks{k - n}^{2}}} \\[3mm]&= \lim_{n \to \infty}\braces{\expo{-n}\,{n^{n} \over n!}\sum_{k = 0}^{n} \exp\pars{-{1 \over 2n}\bracks{k - n}^{2}}} \\[3mm]&= \lim_{n \to \infty}\braces{{\expo{-n}n^{n} \over n!}\int_{0}^{n} \exp\pars{-{1 \over 2n}\bracks{k - n}^{2}}\,\dd k} \\[3mm]&= \lim_{n \to \infty}\bracks{{\expo{-n}n^{n} \over n!}\int_{-n}^{0} \exp\pars{-\,{k^{2} \over 2n}}\,\dd k} = \lim_{n \to \infty}\bracks{{\expo{-n}n^{n} \over n!}\,\root{2n} \int_{-\root{n/2}}^{0}\exp\pars{-k^{2}}\,\dd k} \\[3mm]&= \lim_{n \to \infty}\bracks{{\root{2}n^{n + 1/2}\expo{-n} \over n!} \int_{-\infty}^{0}\exp\pars{-k^{2}}\,\dd k} = \lim_{n \to \infty}\bracks{{\root{2}n^{n + 1/2}\expo{-n} \over n!} \,{\root{\pi} \over 2}} \\[3mm]&= \half\,\lim_{n \to \infty}\bracks{{\root{2\pi}n^{n + 1/2}\expo{-n} \over n!}} =\color{#00f}{\Large\half} \end{align}

  • The second equal sign is a complete mystery. The passage from a sum to an integral also needs justification, but it probably holds.
    – Did
    Commented Mar 31, 2014 at 16:25
  • Would you care to answer @Did's question regarding the second equal sign? I am miffed as well. Thank you, Felix Marin.
    – Hans
    Commented May 11, 2015 at 17:35
  • @Did's question has to be answered; otherwise this hardly seems a suitable answer.
    – tired
    Commented Oct 4, 2015 at 12:12
  • Yeah, for me too; can you please answer @Did's question?
    – user153330
    Commented Mar 1, 2016 at 20:40
  • Is this supposed to justify the various unsubstantiated claims this post relies on? It does not.
    – Did
    Commented Feb 8, 2018 at 17:56

The sum is related to the partial exponential sum, and thus to the incomplete gamma function, $$\begin{eqnarray*} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} &=& e^{-n} e_n(n) \\ &=& \frac{\Gamma(n+1,n)}{\Gamma(n+1)}, \end{eqnarray*}$$ since $e_n(x) = \sum_{k=0}^n x^k/k! = e^x \Gamma(n+1,x)/\Gamma(n+1)$. But $$\begin{eqnarray*} \Gamma(n+1,n) &=& \sqrt{2\pi}\, n^{n+1/2}e^{-n}\left(\frac{1}{2} + \frac{1}{3}\sqrt{\frac{2}{n\pi}} + O\left(\frac{1}{n}\right) \right). \end{eqnarray*}$$ The first term in the asymptotic expansion for $\Gamma(n+1,n)$ can be found by applying the saddle point method to $$\Gamma(n+1,n) = \int_n^\infty dt\, t^n e^{-t}.$$ The higher order terms are in principle straightforward to compute. Using Stirling's approximation, we find $$e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} = \frac{1}{2} + \frac{1}{3}\sqrt{\frac{2}{n\pi}} + O\left(\frac{1}{n}\right).$$ Thus, the limit is $1/2$, as found by @sos440 and @robjohn. This limit is a special case of DLMF 8.11.13.

I just noticed a comment that suggests this be done using high school level math. If this is a standard exercise at your high school, maybe they covered the incomplete gamma function! ;-)
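The identity $e^{-x}e_n(x)=\Gamma(n+1,x)/\Gamma(n+1)$ behind the first display can be spot-checked by brute-force quadrature (the helper name, cutoff, and step count are ad hoc choices of mine):

```python
import math

def upper_gamma(s, x, upper=80.0, steps=200_000):
    # Midpoint rule for Γ(s, x) = ∫_x^∞ t^{s-1} e^{-t} dt;
    # the integrand is negligible beyond `upper` for these parameters.
    h = (upper - x) / steps
    total = 0.0
    for i in range(steps):
        t = x + (i + 0.5) * h
        total += t ** (s - 1) * math.exp(-t)
    return h * total

n, x = 5, 3.0
lhs = math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(n + 1))
rhs = upper_gamma(n + 1, x) / math.factorial(n)
print(lhs, rhs)
```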

  • (+1) I derived this asymptotic expansion here in answer to a question on sci.math. I computed a few terms past $\frac12$: $$ \frac12+\frac{1}{\sqrt{2\pi n}}\left(\frac23-\frac{23}{270n}+\frac{23}{3024n^2}+\dots\right) $$
    – robjohn
    Commented Jun 20, 2012 at 3:14
  • @robjohn: Thanks for the link, I'll have a look. By the way, I voted up your nice solution a couple of hours ago. I like your short and sweet derivation of (3).
    – user26872
    Commented Jun 20, 2012 at 3:47
  • @robjohn As steven gregory says above, "I wish you had copied the article here. It seemed small enough. References have a habit of pointing to NULL over time." Please, could you upload it somewhere again? I'm curious about it.
    – vesszabo
    Commented Jul 4, 2017 at 21:13

If you'd like to see a formal solution using calculus methods, see this article: http://www.emis.de/journals/AMAPN/vol15/voros.pdf

  • I wish you had copied the article here. It seemed small enough. References have a habit of pointing to NULL over time.
    Commented Aug 22, 2015 at 5:18
  • Please try to describe as much here as possible in order to make the answer self-contained. Links are fine as support, but they can go stale, and then an answer which is nothing more than a link loses its value. Please read this post.
    – robjohn
    Commented Jun 20, 2019 at 3:14

I do not know how much this will help you.

For a given $n$, the result is $\dfrac{\Gamma(n+1,n)}{n\ \Gamma(n)}$ which has a limit equal to $\dfrac12$ as $n\to\infty$.


On this page there is already a nice collection of proofs.

I will add another one, which also uses Stirling's formula.

$\displaystyle e^{-n}\sum\limits_{k=0}^n\frac{n^k}{k!} = e^{-n}\sum\limits_{k=0}^n\frac{k^k (n-k)^{n-k}}{k!(n-k)!} \hspace{4cm}$ e.g. here

Applying Stirling's formula to $k!$ and $(n-k)!$ (the $k=0$ and $k=n$ terms each contribute $e^{-n}\frac{n^n}{n!}=\mathcal{O}(1/\sqrt{n})\to0$, so they may be dropped), the limit of interest equals

$\displaystyle \lim\limits_{n\to\infty} e^{-n}\sum\limits_{k=1}^{n-1}\frac{e^k e^{n-k}}{\sqrt{2\pi k (1+\mathcal{O}(1/k))}\sqrt{2\pi (n-k)(1+\mathcal{O}(1/(n-k)))}} $

$\displaystyle = \lim\limits_{n\to\infty} \frac{1}{2\pi}\frac{1}{n}\sum\limits_{k=1}^{n-1}\frac{1}{\sqrt{\frac{k}{n}\left(1-\frac{k}{n}\right)}} =\frac{1}{2\pi} \int\limits_0^1\frac{dx}{\sqrt{x(1-x)}}=\frac{\Gamma(\frac{1}{2})^2}{2\pi~\Gamma(1)} = \frac{1}{2}$
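The first equality above rests on the identity $\sum_{k=0}^{n}\binom nk k^k(n-k)^{n-k}=\sum_{k=0}^{n}\frac{n!}{k!}\,n^k$ (with $0^0=1$), which can be confirmed exactly in integer arithmetic:

```python
from math import comb, factorial

def lhs(n):
    # sum_k C(n,k) * k^k * (n-k)^{n-k}; Python evaluates 0**0 as 1.
    return sum(comb(n, k) * k ** k * (n - k) ** (n - k) for k in range(n + 1))

def rhs(n):
    # n! * sum_k n^k / k!, kept exact: n!/k! is an integer for k <= n.
    return sum(factorial(n) // factorial(k) * n ** k for k in range(n + 1))

for n in range(1, 12):
    assert lhs(n) == rhs(n)
print("identity holds for n = 1..11")
```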

  • (+1) Interesting idea.
    Commented Nov 26, 2018 at 22:23

I thought it might be instructive to post a solution to a generalization of the OP's question. Namely, evaluate the limit

$$\lim_{n\to\infty}e^{-n}\sum_{k=0}^{N(n)}\frac{n^k}{k!}$$

where $N(n)=\lfloor Cn\rfloor$ and $C>0$ is an arbitrary constant. To that end, we now proceed.


Let $N(n)=\lfloor Cn\rfloor$, where $C>0$ is an arbitrary constant, and denote by $S(n)$ the sum of interest:

$$S(n)=e^{-n}\sum_{k=0}^{N}\frac{n^k}{k!}$$

Applying a methodology analogous to that presented by @SangchulLee, it is straightforward to show that

$$S(n)=1-\frac{(N/e)^{N}\sqrt{N}}{N!}\int_{(N-n)/\sqrt{N}}^{\sqrt{N}}e^{\sqrt{N}x}\left(1-\frac{x}{\sqrt N}\right)^N\,dx\tag7$$

We note that the integrand is positive and, for $x\ge0$, bounded above by $e^{-x^2/2}$; for $x<0$, the inequality $\log(1-v)+v\leq-\frac{v^2}{2(1-v)}$ (with $v=x/\sqrt{N}$) yields the integrable dominating bound $e^{-x^2/(2(1+|x|))}$. Therefore, we can apply the Dominated Convergence Theorem along with Stirling's Formula to evaluate the limit as $n\to\infty$.

There are three cases to examine.

Case $1$: $C>1$

If $C>1$, then both the lower and upper limits of integration on the integral in $(7)$ approach $\infty$ as $n\to \infty$. Therefore, we find

$$\lim_{n\to \infty}e^{-n}\sum_{k=0}^{\lfloor Cn\rfloor}\frac{n^k}{k!}=1$$

Case $2$: $C=1$

If $C=1$, then the lower limit is $0$ while the upper limit approaches $\infty$ and we find

$$\lim_{n\to \infty}e^{-n}\sum_{k=0}^{n}\frac{n^k}{k!}=\frac12$$

Case $3$: $C<1$

If $C<1$, then the lower limit approaches $-\infty$ while the upper limit approaches $\infty$, and we find

$$\lim_{n\to \infty}e^{-n}\sum_{k=0}^{\lfloor Cn\rfloor}\frac{n^k}{k!}=0$$


To summarize we have found that

$$\bbox[5px,border:2px solid #C0A000]{\lim_{n\to \infty}e^{-n}\sum_{k=0}^{\lfloor Cn\rfloor}\frac{n^k}{k!}=\begin{cases}1&,C>1\\\\\frac12&, C=1\\\\0&, C<1\end{cases}}$$
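A numerical sketch of the three regimes (again summing in log space; $n$ and the $C$ values are arbitrary choices of mine):

```python
import math

def S_C(n, C):
    # e^{-n} * sum_{k=0}^{floor(C*n)} n^k / k!, summed in log space.
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(math.floor(C * n) + 1))

n = 4000
for C in (0.5, 1.0, 2.0):
    print(C, S_C(n, C))
```

Already at $n=4000$ the three values are indistinguishable from $0$, from $\tfrac12$ (up to the $O(n^{-1/2})$ correction), and from $1$.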

  • This is clever, and I wonder if you could take it further: if $\lim\limits_{n\to\infty}e^{-n}\sum_{k=0}^{n+g_C(n)}\frac{n^k}{k!} = C$, can you characterize $g_C(n)$ at all, along the lines of the critical-value results for connectivity in random graph models?
    Commented Sep 23, 2020 at 23:06
  • @StevenStadnicki First, thank you! Much appreciated. And yes, I thought about further generalizations, similar to the one you suggest, which should be fairly straightforward.
    – Mark Viola
    Commented Sep 23, 2020 at 23:18
  • On the off chance you haven't seen it before, a further generalization is $$\lim_{n\to\infty} e^{-n} \sum_{k=0}^{n + \lfloor z\sqrt{n}\rfloor} \frac{n^k}{k!} = \Phi(z)$$ where $\Phi$ is the standard normal CDF. Your results then follow by direct comparison from $$\Phi(-\infty) = 0,\quad \Phi(0) = \frac{1}{2},\quad \Phi(+\infty) = 1$$
    Commented Dec 9, 2020 at 5:55
