I am a bit confused about the proof of Perron's formula. It states that for a Dirichlet series $f(s) = \sum_{n\geq 1} a_n n^{-s}$ and real numbers $c$ and $x$ with $c > 0$, $c > \sigma_c$ and $x > 0$, we have
$${\sum_{n\leq x}}' a_n = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} \frac{f(s) x^s}{s}\,\text{d}s,$$ where $\sigma_c$ is the abscissa of convergence of $f$ and the prime on the sum indicates that the last term is halved when $x$ is an integer. Now I understand the proof for the case $c > \sigma_a$ (the abscissa of absolute convergence) perfectly well, but I don't see how to extend it to the case $c > \sigma_c$.
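Just as a sanity check on the statement itself (not on the proof), here is a rough numerical experiment I tried. The concrete choices $a_n = 1$ (so $f = \zeta$ and $\sigma_c = \sigma_a = 1$), $x = 10.5$, $c = 2$ and the truncation height $T = 100$ are mine and somewhat arbitrary, and since the integral is cut off at height $T$ the agreement is only up to a truncation error:

```python
# Truncated Perron integral for f = zeta: (1/(2*pi*i)) * int_{c-iT}^{c+iT} zeta(s) x^s / s ds.
# For non-integer x the primed sum is just floor(x), so the output should be close to 10.
from mpmath import mp, mpc, zeta, quad, linspace, pi, floor

mp.dps = 15
x, c, T = 10.5, 2, 100   # ad hoc choices: x non-integer, c > sigma_a = 1

def integrand(t):
    # parametrize s = c + i*t; the factor i from ds = i dt cancels the i in 1/(2*pi*i)
    s = mpc(c, t)
    return zeta(s) * x**s / s

# split [-T, T] into short pieces so the oscillation of x^{it} stays manageable for quad
value = quad(integrand, linspace(-T, T, 201)) / (2 * pi)
print(value.real)        # approximately 10, up to a truncation error that shrinks as T grows
print(float(floor(x)))   # 10.0
```

This only reassures me that the formula is correct as stated (and it uses the easy regime $c > \sigma_a$); my question is purely about the proof when one merely assumes $c > \sigma_c$.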
Following some hints I found, I tried to consider the integral $$\int_C \frac{f(s)x^s}{s}\,\text{d}s,$$ where $C$ is the positively oriented boundary of the rectangle with opposite corners $c - iT$ and $c' + iT$, and $c'$ is sufficiently large (in particular $c' > \sigma_a$). On this rectangle the integrand is holomorphic (we have $\Re s \geq c > \sigma_c$ there, and the pole at $s = 0$ is excluded because $c > 0$), so the integral over $C$ vanishes by Cauchy's theorem. Hence, if the integrals along the two horizontal sides vanish as $T\to\infty$, we are done: the integral $\int_{c-i\infty}^{c+i\infty}$ then equals $\int_{c'-i\infty}^{c'+i\infty}$, which is covered by the absolutely convergent case. But I don't see why those horizontal integrals should vanish.
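Just to make the bookkeeping explicit (this is only a restatement of the argument above), the vanishing of the contour integral rearranges to
$$\int_{c-iT}^{c+iT} \frac{f(s)x^s}{s}\,\text{d}s
= \int_{c'-iT}^{c'+iT} \frac{f(s)x^s}{s}\,\text{d}s
+ \int_{c-iT}^{c'-iT} \frac{f(s)x^s}{s}\,\text{d}s
- \int_{c+iT}^{c'+iT} \frac{f(s)x^s}{s}\,\text{d}s,$$
so if the two horizontal integrals tend to $0$ as $T\to\infty$, the two vertical integrals agree in the limit.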
The best I can come up with is $$\left|\int_{c}^{c'} \frac{f(\sigma+iT)\,x^{\sigma+iT}}{\sigma+iT}\,\text{d}\sigma \right| \leq \frac{(c' - c)\,\max(x^c, x^{c'})}{T}\,\max_{\sigma\in[c,c']}|f(\sigma + iT)|,$$ but I don't see how that gets me anywhere unless I can also show that $f(\sigma + i T)/T \to 0$ as $T\to\pm\infty$, uniformly for $\sigma\in[c,c']$. Is that the case? If so, why?
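To be explicit about where that bound comes from, it is just the triangle inequality combined with $|x^{\sigma+iT}| = x^{\sigma} \leq \max(x^c, x^{c'})$ and $|\sigma+iT| \geq T$:
$$\left|\int_{c}^{c'} \frac{f(\sigma+iT)\,x^{\sigma+iT}}{\sigma+iT}\,\text{d}\sigma\right|
\leq \int_{c}^{c'} \frac{|f(\sigma+iT)|\,x^{\sigma}}{|\sigma+iT|}\,\text{d}\sigma
\leq \frac{(c'-c)\,\max(x^c, x^{c'})}{T}\,\max_{\sigma\in[c,c']}|f(\sigma+iT)|,$$
so everything hinges on whether $\max_{\sigma\in[c,c']}|f(\sigma+iT)|$ is $o(T)$.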