I am having a difficult time verifying the following theorem, and hope that someone can lend me a hand.
Holmes, in his book *Introduction to Perturbation Methods* (Second Edition), states:
"Theorem 1.4: Assume $f(x,\epsilon)$, $\phi(x,\epsilon)$ and $\phi_0(\epsilon)$ are continuous for $ a \leq x \leq b$ and $0 < \epsilon < \epsilon_1$.
(a) If $f \sim \phi$ for $a \leq x \leq b$, and if $|\phi(x,\epsilon)|$ is monotonically decreasing in $\epsilon$, then this asymptotic approximation is uniformly valid for $a \leq x \leq b$."
(Part (b) is not relevant to this question.)
When he writes $f\sim\phi$, he implicitly means as $\epsilon \downarrow 0$. Furthermore, this means that at any fixed $x_0 \in [a,b]$, given any $\delta >0$ one can find an $\epsilon_0 > 0$ (generically dependent on $x_0$) such that
$$ 0 < \epsilon < \epsilon_0 \Rightarrow \ |f(x_0,\epsilon) - \phi(x_0,\epsilon) | < \delta |\phi(x_0,\epsilon)|$$
Part (a) of the theorem says that if $|\phi(x,\epsilon)|$ is monotonically decreasing with $\epsilon$ (for all $x \in [a,b]$), then one can find an $\epsilon_*$, independent of $x$, such that the above inequality holds whenever $0<\epsilon<\epsilon_*$ (i.e. the asymptotic approximation is uniform on $[a,b]$).
I've tried showing that $\inf_{x_0 \in [a,b]} \epsilon_0(x_0) > 0$, but I keep having a difficult time relating what happens at different points $x$ in a useful way. Any ideas?
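To make the quantity I'm trying to bound precise (this notation is my own, not Holmes's), for each fixed $x_0$ and $\delta > 0$ one can take the largest pointwise threshold,

$$\epsilon_0(x_0;\delta) := \sup\Bigl\{\, \epsilon' \in (0,\epsilon_1] : |f(x_0,\epsilon) - \phi(x_0,\epsilon)| < \delta\,|\phi(x_0,\epsilon)| \ \text{for all}\ 0 < \epsilon < \epsilon' \,\Bigr\},$$

so that uniform validity on $[a,b]$ amounts to showing

$$\inf_{x_0 \in [a,b]} \epsilon_0(x_0;\delta) > 0 \quad \text{for every } \delta > 0.$$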