Let $f,g$ be convex functions on $[0,\infty)$ such that $\lim_{x\to\infty} g(x) = +\infty$ and $\lim_{x\to\infty}\frac{f(x)}{g(x)} = 1$.
Is it always true that $\lim_{x\to \infty} \frac{f(x+1)-f(x)}{g(x+1)-g(x)} = 1$?
I can prove it when $g(x)=x$ and when $g(x) = x^2$.
Edit. It is actually not so hard to show that it holds more generally when $g(x) = x^\alpha$ for any $\alpha \geq 1$.
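For reference, here is a sketch of a squeeze argument for the simplest case $g(x)=x$ (one possible route, using only the convexity of $f$). Since the secant slopes of a convex function are nondecreasing, for any fixed $\varepsilon > 0$ and all large $x$,
$$\frac{f(x)-f\big((1-\varepsilon)x\big)}{\varepsilon x} \;\leq\; f(x+1)-f(x) \;\leq\; \frac{f\big((1+\varepsilon)x+1\big)-f(x+1)}{\varepsilon x},$$
and since $f(y) = y + o(y)$ as $y \to \infty$, both bounds tend to $1$ for each fixed $\varepsilon$, which squeezes $f(x+1)-f(x) \to 1$.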
The question can also be asked for series, as a partial converse to the Stolz–Cesàro theorem (monotone increments being the discrete analogue of convexity):
Let $(a_n)$ and $(b_n)$ be increasing sequences such that $\displaystyle\lim_{n\to\infty} \sum_{k=1}^n b_k = +\infty$.
Does $\displaystyle\lim_{n\to\infty} \dfrac{\sum_{k=1}^n a_k}{\sum_{k=1}^n b_k} = 1$ imply $\displaystyle\lim_{n\to\infty} \dfrac{a_n}{b_n} = 1$?
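As a quick numerical illustration of the series formulation (a sanity check on one sample pair of my own choosing, not evidence either way), here is a minimal Python sketch; the sequences $a_k = k + \sqrt{k}$ and $b_k = k$ are hypothetical choices that satisfy the hypotheses:

```python
import math

# Sample pair (my choice, not from the question): a_k = k + sqrt(k), b_k = k.
# Both sequences are increasing and the partial sums of b_k diverge.
def ratios(n):
    A = sum(k + math.sqrt(k) for k in range(1, n + 1))  # partial sum of a_k
    B = n * (n + 1) // 2                                # partial sum of b_k
    return A / B, (n + math.sqrt(n)) / n                # (sum ratio, term ratio)

for n in (10, 100, 1000, 10000):
    sum_ratio, term_ratio = ratios(n)
    print(f"n={n:>6}  sum ratio={sum_ratio:.4f}  a_n/b_n={term_ratio:.4f}")
```

For this pair both ratios drift toward $1$, consistent with (though of course not proving) a positive answer.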