I am investigating the quantity
$$\epsilon(n)=\frac{(\pi -3) e^{2 \pi n}}{24 \pi }-\sum _{k=1}^n \sigma(k) e^{2 \pi (n-k)}$$
where $\sigma(n)$ is the sum of the divisors of $n$.
Using lengthy calculations (too long to share here), I derived that $\epsilon(n) = O(\ln \ln n)$, but experiments show this is not true: the bound only appears to hold for small $n$, up to about $1000$.
To show how the experiment breaks this estimate, consider the values of $\epsilon(n)$ below.
For $n=1..10$, $\epsilon(n)$ looks fine: $$\{0.00561632,0.00749422,0.0130931,0.0112466,0.0224373,0.0149919,0.0280571,0.0243396,0.033656,0.022507\}$$
For $n=100..110$, $\epsilon(n)$ still looks fine: $$\{0.191233,0.403732,0.194948,0.392834,0.359115,0.302904,0.202661,0.523269,0.206173,0.403899,0.284717\}$$
But for $n=1000..1010$, $\epsilon(n)$ starts to grow quickly: $$\{2.51688,3.76854,2.023,3.29987,3.05295,2.82736,2.02809,6.02417,1.89253,3.43335,2.53182\}$$
And for $n=2000..2010$, $\epsilon(n)$ is clearly growing faster than $\ln \ln n$: $$\{5.39231,7.53655,3.75878,8.79288,4.51559,6.06069,5.45119,7.06731,4.48775,9.15004,3.76962\}$$
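For reproducibility, the values above can be recomputed directly from the definition. The following is a sketch rather than my original code: it assumes the third-party `mpmath` library for extended precision (needed because $\epsilon(n)$ is a tiny difference of two quantities of size $e^{2\pi n}$, i.e. about $2.73\,n$ decimal digits) and uses a naive divisor sum, so it is only practical for moderate $n$:

```python
# Sketch for reproducing epsilon(n); assumes the mpmath library (not part
# of the original post).  epsilon(n) is the difference of two numbers of
# size e^{2*pi*n}, so ordinary floats cancel catastrophically and we need
# roughly 2*pi*n/ln(10) ~ 2.73*n decimal digits of working precision.
from mpmath import mp, exp, pi

def sigma(n):
    # Naive sum-of-divisors; fine for the small n used here.
    return sum(d for d in range(1, n + 1) if n % d == 0)

def epsilon(n):
    mp.dps = 3 * n + 50  # enough digits to survive the cancellation
    main = (pi - 3) * exp(2 * pi * n) / (24 * pi)
    partial = sum(sigma(k) * exp(2 * pi * (n - k)) for k in range(1, n + 1))
    return main - partial

for n in range(1, 11):
    print(n, epsilon(n))  # matches the n = 1..10 values listed above
```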
My estimate was inspired by Divisors Sum Related Interesting Approximate Relation, and especially by the fact (provided by @GerryMyerson) that $$\sum_{k=1}^{\infty}\sigma(k)e^{-2\pi k}=\frac{1}{24}-\frac{1}{8\pi},$$ via a long calculation that is really too long to reproduce here. I would also like a fresh view on this.
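As a quick sanity check (my addition, not part of the original derivation), the quoted identity can be verified numerically; plain floats suffice here because the series converges geometrically:

```python
import math

def sigma(n):
    # Naive sum-of-divisors.
    return sum(d for d in range(1, n + 1) if n % d == 0)

# Partial sum of sigma(k) * e^{-2*pi*k}: consecutive terms shrink by a
# factor of about e^{-2*pi} ~ 1.9e-3, so 40 terms are far more than
# enough for double precision.
q = math.exp(-2 * math.pi)
series = sum(sigma(k) * q**k for k in range(1, 41))
closed = 1 / 24 - 1 / (8 * math.pi)
print(series, closed)  # both ~ 0.00187793
```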
So I need help estimating $\epsilon(n)$.
Following @Gary's comment, I am adding the $\epsilon(n)$ values for the first $300$ integers, to motivate my interest in finding a better estimate than $\epsilon(n)=\mathcal{O}(n\log \log n)$; experimentally it looks like something better should hold:
$$\{0.0056,0.0075,0.013,0.011,0.022,0.015,0.028,0.024,0.034,0.023,0.052,0.026,0.045,0.045,0.058,0.034,0.073,0.037,0.079,0.060,0.067,0.045,0.11,0.058,0.079,0.075,0.10,0.056,0.13,0.060,0.12,0.090,0.10,0.090,0.17,0.071,0.11,0.10,0.17,0.079,0.18,0.082,0.16,0.15,0.13,0.090,0.23,0.11,0.17,0.13,0.18,0.10,0.22,0.13,0.22,0.15,0.17,0.11,0.31,0.12,0.18,0.19,0.24,0.16,0.27,0.13,0.24,0.18,0.27,0.14,0.36,0.14,0.21,0.23,0.26,0.18,0.31,0.15,0.35,0.23,0.24,0.16,0.42,0.20,0.25,0.22,0.34,0.17,0.44,0.21,0.31,0.24,0.27,0.22,0.47,0.18,0.32,0.29,0.41,0.19,0.40,0.19,0.39,0.36,0.30,0.20,0.52,0.21,0.40,0.28,0.46,0.21,0.45,0.27,0.39,0.34,0.34,0.27,0.67,0.25,0.35,0.31,0.42,0.29,0.58,0.24,0.48,0.33,0.47,0.25,0.63,0.30,0.38,0.45,0.50,0.26,0.54,0.26,0.63,0.36,0.40,0.32,0.75,0.34,0.42,0.43,0.50,0.28,0.70,0.28,0.56,0.44,0.54,0.36,0.73,0.30,0.45,0.40,0.71,0.36,0.68,0.31,0.55,0.54,0.47,0.32,0.90,0.34,0.61,0.49,0.58,0.33,0.67,0.46,0.70,0.45,0.50,0.34,1.0,0.34,0.63,0.46,0.67,0.43,0.72,0.40,0.63,0.60,0.67,0.36,0.95,0.36,0.55,0.63,0.75,0.37,0.87,0.38,0.87,0.51,0.57,0.45,0.94,0.47,0.58,0.58,0.81,0.45,1.1,0.40,0.71,0.54,0.61,0.50,1.1,0.48,0.62,0.55,0.94,0.47,0.85,0.42,0.94,0.75,0.64,0.43,1.0,0.43,0.81,0.72,0.84,0.44,1.0,0.54,0.79,0.60,0.81,0.45,1.4,0.45,0.75,0.68,0.81,0.64,0.94,0.52,0.90,0.63,0.87,0.47,1.4,0.54,0.72,0.81,0.96,0.48,0.99,0.57,1.1,0.73,0.74,0.50,1.3,0.61,0.90,0.67,0.89,0.51,1.3,0.51,1.0,0.84,0.77,0.70,1.3,0.52,0.79,0.78,1.3,0.53,1.1,0.53,0.94,0.90,0.94,0.63,1.5,0.58,1.0,0.73,0.97,0.55,1.3,0.67,1.1,0.90,0.84,0.63,1.6,0.66\}$$