130
$\begingroup$

Quite often, mathematics students are surprised by the fact that for a mathematician, the term “logarithm” and the expression $\log$ nearly always mean the natural logarithm rather than the common logarithm. Because of that, I have been gathering examples of problems whose statements have nothing to do with logarithms (or the exponential function), but whose solutions do involve natural logarithms. The goal is, of course, to make the students see how natural the natural logarithms really are. Here are some of these problems:

  1. The sum of the series $1-\frac12+\frac13-\frac14+\cdots$ is $\log2$.
  2. If $x\in(0,+\infty)$, then $\lim_{n\to\infty}n\bigl(\sqrt[n]x-1\bigr)=\log x$.
  3. What's the average distance from a point of a square with side length $1$ to the center of the square? The question is ambiguous: is the square just the boundary curve, or the two-dimensional region? In the first case, the answer is $\frac14\bigl(\sqrt2+\log\bigl(1+\sqrt2\bigr)\bigr)$; in the second case, the answer is smaller (of course): $\frac16\bigl(\sqrt2+\log\bigl(1+\sqrt2\bigr)\bigr)$.
  4. The length of an arc of a parabola can be expressed using logarithms.
  5. The area below an arc of the hyperbola $y=\frac1x$ (and above the $x$-axis) can be expressed using natural logarithms.
  6. Suppose that there is an urn with $n$ different coupons, from which coupons are drawn uniformly at random, with replacement. How many draws do you expect to need before having drawn each coupon at least once? The answer is about $n\log(n)+\gamma n+\frac12$, where $\gamma$ is the Euler–Mascheroni constant.
  7. For each $n\in\mathbb N$, let $P_p(n)$ be the number of primitive Pythagorean triples whose perimeter is smaller than $n$. Then $\displaystyle P_p(n)\sim\frac{n\log2}{\pi^2}$. (By the way, this is also an unexpected use of $\pi$.)
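Several of these can be checked numerically in a few lines; here is a quick sketch in Python for items 1 and 2 (the tolerances are ad hoc):

```python
import math

# Item 2: n * (x**(1/n) - 1) -> ln(x); try x = 5 with a large n
x, n = 5.0, 10**8
approx = n * (x**(1.0 / n) - 1.0)
print(approx, math.log(x))  # agree to about 7 decimal places

# Item 1: partial sums of 1 - 1/2 + 1/3 - ... approach ln(2)
s = sum((-1)**(k + 1) / k for k in range(1, 100001))
print(s, math.log(2))  # the error is about 1/(2N) = 5e-6
```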

Could you please suggest some more?

$\endgroup$
20
  • 20
    $\begingroup$ I always thought $\log_{10}$ was written $\operatorname{lg}$ ... $\endgroup$ Commented Jul 3, 2017 at 13:20
  • 13
    $\begingroup$ For me, $\log = \ln$ seems to be common for people working in analysis. If you are looking, for example, at computer science, you would almost always have $\log = \operatorname{ld}$ - and I assume there are also examples of fields where $\log = \lg$ is the most common, or where $\log$ is not specified at all (e.g. "logarithmic scale", "logarithmic running time"). Mathematics is a very wide area, stretching out into many other fields; and I think you can't simply state that $\log = \ln$ is most common for all these fields. $\endgroup$
    – Dirk
    Commented Jul 3, 2017 at 13:32
  • 14
    $\begingroup$ The usage in the English world is really surprising. Why don't you just write $\ln$ for the natural logarithm? It's even shorter. I would use $\log$ only if I wanted to specify another base or if the base didn't matter at all. $\endgroup$
    – Džuris
    Commented Jul 3, 2017 at 16:51
  • 34
    $\begingroup$ As an aside, in computer science, $\log_2$ is the most prevalent logarithm. $\endgroup$
    – user14972
    Commented Jul 3, 2017 at 17:07
  • 19
    $\begingroup$ @HagenvonEitzen You're introducing more confusion :-) $\log_{10}$ is never written $\lg$ AFAIK; the notation $\lg$ is used for $\log_2$. $\endgroup$ Commented Jul 4, 2017 at 18:40

27 Answers

112
$\begingroup$

What about the Prime Number Theorem? The number of primes smaller than $x$ is denoted by $\pi (x)$ and you have $$\pi (x) \sim \frac{x}{\log x}$$
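A brute-force count makes the statement concrete (a sketch; note the ratio approaches $1$ very slowly):

```python
import math

def prime_count(limit):
    """Count primes <= limit with a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit**0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(range(p * p, limit + 1, p)))
    return sum(sieve)

x = 10**6
ratio = prime_count(x) / (x / math.log(x))
print(prime_count(x), ratio)  # 78498, ratio ~ 1.084
```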

$\endgroup$
6
  • 2
    $\begingroup$ Is that meant to be a proportionality symbol? $\endgroup$
    – minseong
    Commented Jul 4, 2017 at 6:27
  • 17
    $\begingroup$ The symbol means "in limit to infinity" here, no proportionality involved. The logarithm should actually be a natural logarithm, making this a great answer to the question. $\endgroup$
    – tomsmeding
    Commented Jul 4, 2017 at 7:30
  • 5
    $\begingroup$ Agreed with answer and tomsmeding. This is asymptotic equivalence, not proportionality, and it is the natural logarithm that is used here. $\endgroup$ Commented Jul 4, 2017 at 10:29
  • 27
    $\begingroup$ I've always found it more enlightening to write the asymptotic relation in the Prime Number Theorem in the form $$\frac{\pi(x)}{x} \sim \frac{1}{\log x},$$ because now the fraction on the left is the proportion of primes among numbers up to $x$. $\endgroup$
    – murray
    Commented Jul 5, 2017 at 16:37
  • 6
    $\begingroup$ @murray: The one I always remember (I keep forgetting what π(x) even is, I'm not a mathematician) is that the nth prime tends to n log n. It's short and the one I'd probably care about the most. $\endgroup$
    – user541686
    Commented Jul 6, 2017 at 1:30
53
$\begingroup$

Your first point can be generalized. Write $[a_1,a_2,a_3,\dots]$ for $\sum a_n/n$. You wrote:$$[\overline{1,-1}]=\ln2.$$(The bar means repeat.) Then we also have:\begin{align}[\overline{1,1,-2}]&=\ln3,\\ [\overline{1,1,1,-3}]&=\ln4,\end{align}and in general:$$[\overline{\underbrace{1,1,\dots,1}_{n-1},1-n}]=\ln n.$$


As a side note, one can see that $\ln m+\ln n=\ln mn$ from this. For example, note that, from the definition, we have $[\overline{0,2,0,-2}]=[\overline{1,-1}]=\ln2$ (from doubling the numerators and denominators). We then have:\begin{align}\ln2+\ln2={}&[\overline{1,-1,1,-1}]+\\&[\overline{0,2,0,-2}]\\{}=&[\overline{1,1,1,-3}]=\ln4\end{align} Similarly: \begin{align}\ln2+\ln3={}&[\overline{0,0,3,0,0,-3}]+\\&[\overline{1,1,-2,1,1,-2}]\\{}=&[\overline{1,1,1,1,1,-5}]=\ln6\end{align}
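These identities are easy to test numerically; a sketch, truncating at a multiple of the pattern length (where convergence is cleanest):

```python
import math

def bar_sum(pattern, terms):
    """Partial sum of sum a_n / n, where a_n repeats `pattern` cyclically."""
    return sum(pattern[(n - 1) % len(pattern)] / n for n in range(1, terms + 1))

print(bar_sum([1, -1], 2 * 10**5), math.log(2))
print(bar_sum([1, 1, -2], 3 * 10**5), math.log(3))
print(bar_sum([1, 1, 1, -3], 4 * 10**5), math.log(4))
```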

$\endgroup$
5
  • 1
    $\begingroup$ How would one prove the general representation of $\ln n$? $\endgroup$
    – Fizikus
    Commented Sep 1, 2018 at 8:41
  • 2
    $\begingroup$ @Fizikus Hint: See the answer by Hurkyl ($H_n\approx\ln(n)+\gamma$). Consider $H_{kn}-H_n$. $\endgroup$ Commented Sep 1, 2018 at 17:21
  • 1
    $\begingroup$ If we extend it backwards we get the trivial identity that $[\overline{0}] = 0\times\frac11 + 0\times\frac12 + 0\times\frac13 + \cdots = 0$! $\endgroup$ Commented Mar 23, 2019 at 22:25
  • 1
    $\begingroup$ @SolomonUcko ${}=\ln(1)$, as it should! $\endgroup$ Commented Mar 24, 2019 at 8:11
  • 2
    $\begingroup$ @AkivaWeinberger Oops, forgot that part... $\endgroup$ Commented Mar 24, 2019 at 13:50
51
$\begingroup$

Here are some of my favorites:

  • By "reversing" Euler's identity, $$\ln(\cos x+i\sin x)=ix$$

  • The natural log appears in the integrals of several trigonometric functions: $$\int \tan (x)\, dx=\ln|\sec(x)|+C$$ $$\int \cot (x)\, dx=\ln|\sin(x)|+C$$ $$\int \sec (x)\, dx=\ln|\sec(x)+\tan(x)|+C$$

  • The appearance of the natural logarithm in the Tsiolkovsky rocket equation: $$\Delta v=v_e\ln\frac{m_0}{m_f}$$
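The rocket equation is easy to get a feel for numerically. A sketch with invented numbers (300 t wet mass, 100 t dry mass, 3 km/s exhaust velocity):

```python
import math

# Hypothetical values, for illustration only
m0 = 300.0e3   # initial (wet) mass, kg
mf = 100.0e3   # final (dry) mass, kg
v_e = 3000.0   # effective exhaust velocity, m/s

delta_v = v_e * math.log(m0 / mf)
print(round(delta_v, 1))  # 3295.8 m/s: a 3:1 mass ratio buys only ~1.1 * v_e
```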

$\endgroup$
3
  • 13
    $\begingroup$ Be careful with the logarithms, $\ln(\cos x+i\sin x)$ has a period of $2\pi$, whilst $ix$ is not periodic at all. $\endgroup$ Commented Jul 7, 2017 at 13:01
  • 13
    $\begingroup$ It happens to be that $e^z$ is not a bijective function on $\mathbb C$. Indeed, $e^z=e^{z+2\pi i}$. One usually fixes this with what is called the principal value. $\endgroup$ Commented Jul 7, 2017 at 13:23
  • 3
    $\begingroup$ @FranklinPezzutiDyer: It would be better if you could edit your answer incorporating the suggestion by Simply Beautiful Art. Also, it made me sad that you didn't mention the integral of $\csc x.$ $\endgroup$
    – Bumblebee
    Commented Dec 6, 2020 at 1:41
36
$\begingroup$

The continuous solution of the functional equation $f(x\cdot y)=f(x)+f(y)$ (for $x,y>0$), with the condition $f'(1)=1$, is $f(x)=\ln (x)$.

Changing the value of $f'(1)$ we find the other logarithm functions.

$\endgroup$
5
  • 11
    $\begingroup$ This seems very closely related to rules of logarithm (to multiply, add logs) and therefore not terribly surprising. $\endgroup$ Commented Jul 3, 2017 at 14:51
  • 7
    $\begingroup$ Certainly not surprising. It can be used as a definition. $\endgroup$ Commented Jul 3, 2017 at 14:57
  • $\begingroup$ Can you please elaborate on "other logarithm functions"? $\endgroup$
    – spraff
    Commented Jul 5, 2017 at 8:55
  • 1
    $\begingroup$ If $f'(1)=a$ the solution of the functional equation becomes $f(x)=a \ln x=\log_{e^a} x$ $\endgroup$ Commented Jul 5, 2017 at 9:16
  • 1
    $\begingroup$ @spraff Other bases. $\endgroup$ Commented Jul 5, 2017 at 19:12
34
$\begingroup$

Here's another one related to some of your examples: the $n$-th harmonic number

$$ H_n = 1 + \frac{1}{2} + \ldots + \frac{1}{n} $$

satisfies

$$ H_n \approx \ln(n) + \gamma $$

where $\gamma$ is the Euler-Mascheroni constant. The error in the above approximation is slightly less than $\frac{1}{2n}$.
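Numerically (a sketch, with $\gamma$ hard-coded to double precision):

```python
import math

gamma = 0.5772156649015329  # Euler-Mascheroni constant
n = 10**6
H_n = sum(1.0 / k for k in range(1, n + 1))
error = H_n - (math.log(n) + gamma)
print(error, 1 / (2 * n))  # error ~ 5.0e-07, essentially 1/(2n)
```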

$\endgroup$
4
  • 7
    $\begingroup$ Another good one is that $\sum_{p\le n}p^{-1} \sim \ln \ln n$, where $p$ is prime and $n$ is an integer. $\endgroup$ Commented Jul 3, 2017 at 22:17
  • 2
    $\begingroup$ Notably, the coupon collection problem mentioned by the OP, is closely related to these harmonic numbers. The approximate solution he gave is simply the first few terms in the expansion of $H_n$, multiplied by $n$. $\endgroup$ Commented Jul 4, 2017 at 10:30
  • 7
    $\begingroup$ This looks closely related to "the antiderivative of $x^{-1}$ is $\ln x$." $\endgroup$
    – Kevin
    Commented Jul 4, 2017 at 17:29
  • 1
    $\begingroup$ As a consequence, $\lim_{k\to\infty}H_{nk}-H_k=\ln n$. (EDIT: Also, $H_{2n}-H_n$ is the partial sum of $\sum(-1)^{n+1}/n$, which means that OP's first fact is implied by your fact.) $\endgroup$ Commented Jul 5, 2017 at 19:14
26
$\begingroup$

Using $\sigma(n)$ as the sum of the (positive) divisors of a natural number $n,$ we have $$ \sigma(n) \leq e^\gamma \, n \, \log \log n + \frac{0.64821364942... \; n}{\log \log n},$$ with the constant in the numerator giving equality for $n=12.$ Here $\gamma = \lim_{n\to\infty} (H_n - \log n).$

As suggested by Oscar, we may write this without approximations as $$ \sigma(n) \leq e^\gamma \, n \, \log \log n + \frac{ n \; ( \log \log 12) \left(\frac{7}{3} -e^\gamma \,\log \log 12 \right)}{\log \log n}.$$

There are some numbers up to $n \leq 5040 \;$ (such as $n=12$) for which $ \sigma(n) > e^\gamma \, n \, \log \log n .$ The conjecture that, for $n > 5040,$ we have $ \sigma(n) < e^\gamma \, n \, \log \log n ,$ is equivalent to the Riemann Hypothesis.

Note that the occurrence of $\log \log n$ means that we cannot replace the natural logarithm by some other without changing the sense of the statement. We would not just be multiplying by a constant if we used a different logarithm.
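A quick check of the inequality at a few values of $n$ (a sketch; trial-division $\sigma$ is fine at this scale):

```python
import math

def sigma(n):
    """Sum of the positive divisors of n, by trial division."""
    return sum(d for d in range(1, n + 1) if n % d == 0)

gamma = 0.5772156649015329
for n in [12, 5040, 5041, 10000]:
    robin = math.exp(gamma) * n * math.log(math.log(n))
    print(n, sigma(n), round(robin, 1), sigma(n) > robin)
# 12 and 5040 exceed e^gamma * n * log log n; 5041 and 10000 do not
```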

$\endgroup$
1
  • 3
    $\begingroup$ The constant on the right side is rendered exactly as $(\log \log 12)(7/3-e^{\gamma}\log \log 12)$. $\endgroup$ Commented Jul 4, 2017 at 10:40
24
$\begingroup$

The law of the iterated logarithm states that $$ \limsup_{n \to \infty} \frac{X_1+\cdots+X_n}{\sqrt{n \log\log n}} = \sqrt 2 $$ almost surely, where $X_1,\ldots,X_n$ are iid random variables with means zero and unit variances.

$\endgroup$
20
$\begingroup$

Something I found some months back. Would be surprised if this hasn't been looked at before. No citations.

We say a set $S$ can express $n$ if it is possible to write $n$ as a sum of elements of $S$, with repetitions allowed.

We say that a set $S$ is critical for $n$ if $S$ can express $n$ and no strict subset of $S$ can express $n$.

Let $u_n$ be the size of the largest subset of $\{1,2,\dotsc ,n\}$ that is critical for $n$. It's conjectured that $u_n$ grows like $\log_e n$.

Evidence: (supporting plot omitted)

$\endgroup$
3
  • 5
    $\begingroup$ Post this as its own question. I'm curious what a solution for this would look like. $\endgroup$ Commented Jul 6, 2017 at 20:20
  • 1
    $\begingroup$ I know this was a long time ago, but was it ever posted as a question/was an answer obtained? $\endgroup$
    – RSpeciel
    Commented Jul 28, 2020 at 16:30
  • 1
    $\begingroup$ Why is not $\{1\}$ always critical for $n$? $\endgroup$
    – hdur
    Commented Apr 30, 2023 at 14:39
18
$\begingroup$

Consider phase transition in the Erdős-Rényi model $G(n, p)$. We have

The property that $G(n, p)$ has diameter two has a sharp threshold at $p = \sqrt{\frac{2\ln n}{n}}$.

That is, if $p$ is smaller than $\sqrt{\frac{2\ln n}{n}}$, then the probability that the diameter of $G(n, p)$ is greater than $2$ goes to $1$ in the limit, as $n$ goes to $\infty$; if $p$ is greater than $\sqrt{\frac{2\ln n}{n}}$, then the probability that the diameter of $G(n, p)$ is smaller than or equal to $2$ goes to $1$ as $n$ goes to $\infty$.

Another similar conclusion is

The disappearance of isolated vertices in $G(n, p)$ has a sharp threshold at $p = \frac{\ln n}{n}$.

$\endgroup$
17
$\begingroup$

How do you count connected labeled graphs on $n$ vertices?

Let's take the not-necessarily-connected case first. There are $\binom{n}{2}$ possible edges between the $n$ vertices, and for each you may include it or not. So there are $$2^\binom{n}{2}$$ possible graphs.

Now to count connected graphs, we need to do some "generatingfunctionology", to steal Wilf's term. Let $$f(x) = \sum_{n=0}^\infty 2^\binom{n}{2} \frac{x^n}{n!}$$ be the (formal) exponential generating function for labeled graphs. Then if $c_n$ is the number of connected graphs on $n$ vertices, we have

$$\sum_{n=1}^\infty c_n \frac{x^n}{n!} = \log f(x) = \log\sum_{n=0}^\infty 2^\binom{n}{2} \frac{x^n}{n!}.$$

This is astonishing the first time you see it, but it is very natural once you understand how exponentiation works on exponential generating functions.
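Extracting coefficients of $\log f(x)$ is equivalent to a recurrence obtained by conditioning on the component containing vertex $1$; a sketch:

```python
from math import comb

def connected_counts(N):
    """c_n for n = 1..N: number of connected labeled graphs on n vertices."""
    c = [0] * (N + 1)
    for n in range(1, N + 1):
        total = 2 ** comb(n, 2)  # all labeled graphs on n vertices
        # remove graphs where the component of vertex 1 has only k < n vertices
        for k in range(1, n):
            total -= comb(n - 1, k - 1) * c[k] * 2 ** comb(n - k, 2)
        c[n] = total
    return c[1:]

print(connected_counts(5))  # [1, 1, 4, 38, 728]
```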

$\endgroup$
16
$\begingroup$

$$ \frac{d}{dx}\,(x^x) = x^x \ (\ln(x)+1) $$

$\endgroup$
1
  • 6
    $\begingroup$ I honestly don't see why this is surprising. $x^x=e^{x\log x}$. $\endgroup$
    – TheSimpliFire
    Commented Jan 16, 2021 at 14:14
16
$\begingroup$

$$\sum_{k=1}^{\infty} \frac{k \mod{j}}{k(k+1)} = \log{j}, \: \forall j \in \mathbb{N}$$
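A numerical sanity check on the partial sums (the tail after $N$ terms is at most $(j-1)/N$):

```python
import math

def lhs(j, terms):
    return sum((k % j) / (k * (k + 1)) for k in range(1, terms + 1))

for j in [2, 3, 7]:
    print(j, lhs(j, 10**5), math.log(j))
```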

$\endgroup$
3
  • $\begingroup$ very interesting: could you pls. show how to obtain it, or give a reference ? $\endgroup$
    – G Cab
    Commented Jun 24, 2018 at 21:10
  • $\begingroup$ @GCab d-scholarship.pitt.edu/7545 page 11 $\endgroup$
    – bloomers
    Commented Jun 24, 2018 at 22:10
  • $\begingroup$ Thanks indeed for the reference $\endgroup$
    – G Cab
    Commented Jun 25, 2018 at 0:50
15
$\begingroup$

Here is one containing a lot of $\log$s.

Consider the standard multiplication table, but with rows and columns indexed by $1$ to $N$ instead of $1$ to $10$. The question is, how many distinct integers are there among these? Perhaps surprisingly, the answer is $o(N^2)$. Ford has shown that the answer is, up to constant factors, $$\frac{N^2}{(\log N)^{c_1}(\log\log N)^{3/2}},$$ where $c_1=1-\frac{1+\log\log 2}{\log 2}$. Similarly, if we consider the $(k+1)$-dimensional multiplication table (defined in the obvious manner), the number of distinct integers in it is, up to constant factors, $$\frac{N^{k+1}}{(\log N)^{c_k}(\log\log N)^{3/2}},\qquad c_k=\frac{\log(k+1)+k\log k-k\log\log(k+1)-k}{\log(k+1)}.$$
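Even a brute-force count at $N=1000$ shows how pronounced the deficit already is (a sketch):

```python
N = 1000
# collect each product once (i <= j suffices, by symmetry of the table)
distinct = {i * j for i in range(1, N + 1) for j in range(i, N + 1)}
print(len(distinct), N * N, len(distinct) / N**2)  # roughly a quarter of N^2
```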

$\endgroup$
15
$\begingroup$

I found it quite remarkable that $$\int\frac{1}{x\log(x)\log(\log(x))}dx = \log(\log(\log|x|))+C$$ But more generally, if $\log^{\circ i}(x)$ means $\log\underbrace\cdots_{i\text{ times}}\log x$, then $$\int\frac{dx}{x\prod_{i=1}^n{\log^{\circ i}(x)}} = \log^{\circ n+1}|x|+C, \quad n\in\mathbb{N}$$ Indeed, $${\mathrm d\over\mathrm dx}\log\log\log\log|x|=\frac{1}{x\log(x)\log\log(x)\log\log\log(x)}$$

$\endgroup$
2
  • 1
    $\begingroup$ I would use either $\log^i(x)$ or $\log^{\circ i}(x)$ to avoid confusion with the base-$i$ logarithm. $\endgroup$ Commented Jul 5, 2017 at 19:27
  • $\begingroup$ @AkivaWeinberger: had the same thought. I'll try to improve it. $\endgroup$
    – edmz
    Commented Jul 5, 2017 at 19:37
14
$\begingroup$

Solve $x^n-x-1=0$ for various values of $n$ ($n\ge 2$). There will be one root greater than $1$ for each $n$. The asymptotic behavior of this root as $n$ increases without bound is given to two terms as:

$x=1+(\log 2)/n+o(1/n)$
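A bisection check of the two-term asymptotic (a sketch):

```python
import math

def root_above_one(n):
    """Root of x**n - x - 1 in (1, 2), by bisection (the function is increasing there)."""
    lo, hi = 1.0, 2.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if mid**n - mid - 1 > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

for n in [10, 100, 1000]:
    print(n, root_above_one(n), 1 + math.log(2) / n)
```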

$\endgroup$
13
$\begingroup$

In Calculus I the student learns how to find antiderivatives of $x^n$ for all integers $n \ne -1$. They scratch their heads and scream

"Give me the antiderivative of the inversion function $1/x$!"

OK you say, here it is:

$\ln(t)=\int _{1}^{t}{\frac {1}{x}}\,dx$

$\endgroup$
4
  • $\begingroup$ I see now you already have it listed - might as well keep my answer since it goes as it in a different way... $\endgroup$ Commented Jul 3, 2017 at 19:59
  • 5
    $\begingroup$ Note that $\displaystyle\int_1^xt^n\operatorname d\!t=\frac{x^{n+1}-1}{n+1}$, and $\displaystyle\lim_{n\to-1}\frac{x^{n+1}-1}{n+1}=\ln x$. Thus, this result is a limiting case of the integral power rule. $\endgroup$ Commented Jul 5, 2017 at 19:25
  • $\begingroup$ But this is one of the possible definitions of the $\ln$ function, and hence not surprising. $\endgroup$ Commented Jul 24, 2017 at 6:31
  • $\begingroup$ It might be surprising to a calculus student who knows about $f(x) = e^x$ and only that the log function is the inverted graph of $f$. In any event, it is fascinating that the 'puzzle pieces' fit together the way they do. $\endgroup$ Commented Jul 24, 2017 at 10:54
12
$\begingroup$

Perhaps the students would enjoy that the area of the unit circle may be expressed as $$ - \sqrt{-1} \log{(-1)} $$

$\endgroup$
4
  • 5
    $\begingroup$ In the immortal words of miss Eliza Doolittle: “Not bloody likely”. $\endgroup$ Commented Jul 19, 2017 at 15:22
  • $\begingroup$ @JoséCarlosSantos I'll stick to the weather and my health then... $\endgroup$
    – ekkilop
    Commented Jul 19, 2017 at 15:39
  • $\begingroup$ Lol that’s funny $\endgroup$ Commented Apr 18, 2023 at 18:49
  • $\begingroup$ @ekkilop, can you explain why the area of a unit circle is $-\sqrt{-1}\log{-1}$? $\endgroup$ Commented Feb 7 at 13:17
8
$\begingroup$

This is more about $e$ than the natural logarithm, but I was surprised that the maximum of $x^{1/x}$ was at $e$.

That comes up in studying the equation $a^b = b^a$ for $a, b \in \mathbb{R}$ with $a\ne b$.

$\endgroup$
6
$\begingroup$

Boltzmann's entropy equation:

$$S = k\ln{W}$$

$\endgroup$
2
  • 4
    $\begingroup$ This is not specifically about natural logarithms. I know that the standard way of expressing Boltzmann's entropy equation uses them but, with a different constant, we would be able to express the equation using, say, common logarithms or base $2$ logarithms. $\endgroup$ Commented Jul 6, 2017 at 10:46
  • $\begingroup$ True, but the constant is the Boltzmann constant which is $R/N_A$ and thus not very arbitrary. I find the relationships with those other constants to be cool. $\endgroup$ Commented Jul 6, 2017 at 11:21
5
$\begingroup$

I know it's a late reply, but I am a mathematician working on a fish farm (completely misplaced :D) and, since you're from Porto (and I studied in Aveiro), you deserve another great application of $\ln x$.

The Specific Growth Rate ($SGR$) of a farmed species over a time interval of $d$ days is:

$$SGR=100\times\frac{\ln w_f - \ln w_i}{d}$$

where $w_i$ is the initial average weight of the population (or of a single animal) and $w_f$ is the final average weight.

This has tremendous production optimization applications, since you can estimate the Feed Conversion Ratio ($FCR$) using $SGR$:

$$FCR=\frac{SFR}{SGR}$$

Where $SFR$ is the Specific Feeding Rate (quantity of food over biomass).

I have worked with this for several years now, and I still don't fully understand the meaning of $\ln$ in the formula for $SGR$. But it works!

Natural logarithms in biology and fish production... What a world...
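For what it's worth, the $\ln$ in $SGR$ encodes the assumption that growth is exponential over the interval, which makes the rate additive over consecutive periods. A tiny sketch with invented numbers:

```python
import math

# Hypothetical batch: average weight goes from 20 g to 35 g in 30 days
w_i, w_f, d = 20.0, 35.0, 30
sgr = 100 * (math.log(w_f) - math.log(w_i)) / d  # percent per day
print(round(sgr, 4))  # ~1.8654 %/day

# Exponential-growth reading: the same SGR held for 60 days squares the factor
print(w_i * math.exp(sgr / 100 * 2 * d))  # 20 g * 1.75**2 = 61.25 g
```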

$\endgroup$
4
$\begingroup$

My favorite facts involving logarithms are indeed the law of the iterated logarithm, the coupon collector's problem and the giant component, but they have already been mentioned in previous answers.

However, there are a lot of cute facts about logarithms that have not been mentioned yet:

1) The logarithm is used to express relationships between the continuous uniform, exponential and Pareto distributions:

If $X \sim U(0,1)$ then $- {\lambda}^{-1} \ln {X} \sim Exp(\lambda)$

If $X \sim P(\lambda, t)$, then $\ln{\frac{X}{t}} \sim Exp(\lambda)$

2) $\exists c \in \mathbb{R}$ such that $B_n \leq {(\frac{cn}{\ln {(n+1)}})}^n$, where $B_n$ is the $n$-th Bell number.

3) The smallest possible independence number of an $n$-vertex triangle-free graph is $\Theta({(n \ln{n})}^{\frac{1}{2}})$

4) $\frac{2(\ln n)}{\ln(\frac{1}{p})}$ is the size of the largest clique in almost every random graph with $n$ vertices and edge probability $p$

5) The number of groups of order $n$ is less than $n^{\frac{(\ln n)^2}{2}}$

6) Every finite group $G$ whose composition factors are isomorphic neither to Steinberg groups, nor to Suzuki groups, nor to Ree groups, has a presentation of length $O((\ln |G|)^3)$

7) If $G$ is a non-abelian simple group, there exists $S$ with $\langle S \rangle = G$, $|S| \leq 7$ and $\operatorname{diam}(Cay(G, S)) \leq {10}^{10}\ln|G|$

8) There exists a constant $C$ such that every finite group $G$ has more than $\frac{C \ln |G|}{(\ln \ln |G|)^8}$ conjugacy classes.

9) And there are also several inequalities for Ramsey numbers, that involve logarithms:

$$\exists c \in \mathbb{R} \forall n \in \mathbb{N} \text{ }R(n, n) \leq n^{\frac{-c \ln{n}}{\ln{\ln{n}}}} 4^n \text{(Conlon inequality)}$$

$$R(3, n) = O(\frac{n^2}{\ln {n}})$$

$$\exists \{c_n\}_{n=1}^{\infty} \subset \mathbb{R}\ \forall n \in \mathbb{N}\ \forall m \in \mathbb{N} \text{ } c_n \frac{m^{\frac{n+1}{2}}}{{(\ln{m})}^{\frac{n+1}{2} - \frac{1}{n-2}}} \leq R(n, m) \text{ (Bohman–Keevash inequality)}$$

$$\exists \{c_n\}_{n=1}^{\infty} \subset \mathbb{R} \forall n \in \mathbb{N} \forall m \in \mathbb{N} \text{ } R(n, m) \leq c_n \frac{m^{n-1}}{{(\ln{m})}^{n-2}} \text{(Ajtai-Komlós-Szemerédi inequality)}$$

$\endgroup$
4
$\begingroup$

Let $d(n)$ be the number of divisors of $n$. Then $\sum_{n=1}^xd(n)$ is asymptotic to $x\log x$. That is to say, $$\lim_{x\to\infty}{1\over x\log x}\sum_{n=1}^xd(n)=1$$ More is known: $$\sum_{n=1}^xd(n)=x\log x+(2\gamma-1)x+O(\sqrt x)$$ where $\gamma$ is Euler's constant.
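This is easy to check by brute force, using the identity $\sum_{n\le x}d(n)=\sum_{d\le x}\lfloor x/d\rfloor$ (a sketch):

```python
import math

gamma = 0.5772156649015329
x = 10**5
# sum of d(n) for n <= x: each divisor d contributes once per multiple
total = sum(x // d for d in range(1, x + 1))
approx = x * math.log(x) + (2 * gamma - 1) * x
print(total, round(approx, 1))  # difference is O(sqrt(x))
```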

$\endgroup$
4
$\begingroup$

As the asker pointed out, my last answer was not specific to the natural logarithm. So, here's a slightly more $\ln$-specific example:

$$\ln(2)=\frac1{1+\dfrac1{1+\dfrac{2^2}{1+\dfrac{3^2}{1+\dfrac{4^2}{\ddots}}}}}$$

which converges quite slowly. Isn't that pretty?

$\endgroup$
4
$\begingroup$

One other that I found interesting:

Take the unit square $[0,1]^2$. The average (expected) distance between two random points in the square is: $$\int\limits_0^1\int\limits_0^1\int\limits_0^1\int\limits_0^1\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}\,\text{d}x_1\text{d}x_2\text{d}y_1\text{d}y_2$$ which turns out to equal: $$\frac{2+\sqrt{2}+5\ln(\sqrt{2}+1)}{15}$$

Source
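A Monte Carlo check of the closed form (a sketch; the estimate lands within a few times $10^{-3}$ of the exact value):

```python
import math
import random

random.seed(1)
exact = (2 + math.sqrt(2) + 5 * math.log(math.sqrt(2) + 1)) / 15
n = 10**5
est = sum(
    math.dist((random.random(), random.random()),
              (random.random(), random.random()))
    for _ in range(n)
) / n
print(est, exact)  # both ~ 0.5214
```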

$\endgroup$
3
$\begingroup$

The natural logarithm occurs often when analysing sorting and searching algorithms used in computer science. A famous example is the asymptotic formula of the average number of comparisons $Q_n$ of the Quick Sort algorithm. \begin{align*} \color{blue}{Q_n=2n(\ln n + \gamma -2)+2\ln n+2\gamma+1+O\left(\frac{1}{n}\right)} \end{align*}
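The exact averages satisfy the recurrence $Q_n=n-1+\frac2n\sum_{k<n}Q_k$ (partition against a uniformly random pivot), which makes the asymptotic easy to check (a sketch):

```python
import math

gamma = 0.5772156649015329

def avg_comparisons(N):
    """Exact average comparison counts Q_0..Q_N from the Quicksort recurrence."""
    Q, running = [0.0], 0.0
    for n in range(1, N + 1):
        q = n - 1 + 2.0 * running / n
        Q.append(q)
        running += q
    return Q

n = 10**4
exact = avg_comparisons(n)[n]
asym = 2 * n * (math.log(n) + gamma - 2) + 2 * math.log(n) + 2 * gamma + 1
print(exact, asym)  # agree to within O(1/n)
```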

Volume 3 of Knuth's classic The Art of Computer Programming is titled Sorting and Searching. It presents a wealth of applications of these two fundamental combinatorial themes and one gem is C.A.R. Hoare's Quicksort algorithm.

Quicksort is the standard sorting procedure in UNIX systems and, as we can read in this paper by J.A. Fill, has been cited as one of the ten algorithms with the greatest influence on the development and practice of science and engineering in the $20$th century.

$\endgroup$
1
$\begingroup$

Related to what you have given yourself, but I always found it interesting: $$\int_1^x\frac{dt}t=\ln(x)-\ln(1)=\ln(x)$$ This is perhaps one of the main definitions of $\ln$, though. We also have: $$\Re\left[\ln(x+jy)\right]=\frac12\ln(x^2+y^2)$$


Logs are also often useful in approximating equations, e.g.: $$y=c x^n$$ $$\ln(y)=\ln(c)+n\ln(x)$$ so plotting $\ln(y)$ against $\ln(x)$ gives a straight line from which we can read off $c$ and $n$.

$\endgroup$
0
$\begingroup$

Given a monotonically decreasing sequence of positive numbers $(a_n)$ with $$\lim_{n\to\infty}na_n=0,$$ does it follow that the series $$\sum_{n=1}^\infty a_n$$ converges?

No: the sequence $$a_n=\frac{1}{n\ln(n)}\quad(n\ge2)$$ is a counterexample, and it is probably the simplest one.
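Numerically, $na_n\to0$ while the partial sums creep up like $\ln\ln N$ (a sketch):

```python
import math

def partial(N):
    return sum(1.0 / (n * math.log(n)) for n in range(2, N + 1))

for N in [10**2, 10**4, 10**6]:
    print(N, partial(N), math.log(math.log(N)))  # sums keep growing, ever slower
```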

$\endgroup$
