
I am having a difficult time verifying the following theorem, and hope that someone can lend me a hand.

Holmes, in his book Introduction to Perturbation Methods (Second Edition), states:

"Theorem 1.4: Assume $f(x,\epsilon)$, $\phi(x,\epsilon)$ and $\phi_0(\epsilon)$ are continuous for $ a \leq x \leq b$ and $0 < \epsilon < \epsilon_1$.

(a) If $f \sim \phi$ for $a \leq x \leq b$, and if $|\phi(x,\epsilon)|$ is monotonically decreasing in $\epsilon$, then this asymptotic approximation is uniformly valid for $a \leq x \leq b$."

Part (b) is not relevant to this question.

When he writes $f\sim\phi$, he implicitly means as $\epsilon \downarrow 0$. Furthermore, this means that (at any fixed $x_0 \in [a,b]$) given a $\delta >0$ one can find an $\epsilon_0 > 0$ (generically dependent on $x_0$) such that

$$ 0 < \epsilon < \epsilon_0 \ \Rightarrow\ |f(x_0,\epsilon) - \phi(x_0,\epsilon)| < \delta\,|\phi(x_0,\epsilon)|. $$

Part (a) of the theorem says that if $|\phi(x,\epsilon)|$ is monotonically decreasing with $\epsilon$ (for all $x \in [a,b]$), then one can find an $\epsilon_*$, independent of $x$, such that the above inequality holds whenever $0<\epsilon<\epsilon_*$ (i.e. the asymptotic approximation is uniform on $[a,b]$).
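To see that $\epsilon_0$ can genuinely depend on $x_0$, here is a toy pair of my own (a hypothetical illustration, not from Holmes): take $\phi(x,\epsilon) = \epsilon$ and $f(x,\epsilon) = \epsilon + \epsilon^2/x$. Then

```latex
% Hypothetical example (not from Holmes): phi(x,eps) = eps, f(x,eps) = eps + eps^2/x.
\[
  \frac{|f(x_0,\epsilon) - \phi(x_0,\epsilon)|}{|\phi(x_0,\epsilon)|}
  = \frac{\epsilon}{x_0} < \delta
  \iff \epsilon < \delta x_0,
  \qquad \text{so } \epsilon_0(x_0) = \delta x_0 .
\]
```

On $(0,1]$ the infimum of $\epsilon_0$ is $0$, so the approximation is pointwise but not uniform there; on a closed interval $[a,b]$ with $a>0$ one may take $\epsilon_* = \delta a > 0$, and uniformity holds. (Note also that here $|\phi| = \epsilon$ decreases as $\epsilon \downarrow 0$, matching the reading of "monotonically decreasing" discussed in the answer below.)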

I've tried showing that $\min_{x_0 \in [a,b]} \epsilon_0(x_0) > 0$, but I keep having a difficult time relating what happens at different points $x$ in a useful way. Any ideas?
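As a sanity check on that approach, one can compute $\epsilon_0(x)$ numerically on a grid and inspect the minimum. The pair below is a hypothetical illustration (not from Holmes): $\phi(x,\epsilon)=\epsilon$ and $f(x,\epsilon)=\epsilon+\epsilon^2/x$, for which the relative error is $\epsilon/x$ and hence $\epsilon_0(x)=\delta x$ exactly:

```python
import numpy as np

# Hypothetical pair (NOT from Holmes), chosen so everything is computable by hand:
#   phi(x, eps) = eps,   f(x, eps) = eps + eps**2 / x
# Relative error:  |f - phi| / |phi| = eps / x,  so eps_0(x) = delta * x exactly.
delta = 0.1

def rel_err(x, eps):
    return abs((eps + eps**2 / x) - eps) / abs(eps)

def eps0(x, delta, eps_grid):
    # Largest grid value of eps below which the relative error stays < delta
    # (valid because the relative error is monotone increasing in eps here).
    ok = eps_grid[np.array([rel_err(x, e) < delta for e in eps_grid])]
    return ok.max() if ok.size else 0.0

eps_grid = np.linspace(1.5e-4, 0.2, 2000)
xs = np.linspace(0.01, 1.0, 100)        # the closed interval [a, b] with a = 0.01
eps_star = min(eps0(x, delta, eps_grid) for x in xs)
print(eps_star)  # ~0.00095 on this grid, just under delta * a = 0.001: strictly positive
```

The minimum is attained at $x = a$ and stays strictly positive as long as $a > 0$; letting $a \to 0$ drives it to zero, which is exactly the failure of uniformity on $(0,1]$.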


1 Answer


Take $x_1, x_2 \in [a,b]$. Then, given $\delta>0$, you know that there exist $\epsilon_0(x_1), \epsilon_0(x_2) > 0$ such that \begin{eqnarray} |f(x_1,\epsilon) - \phi(x_1,\epsilon)| \leq \delta |\phi(x_1,\epsilon)| \quad \text{for} \quad 0 < \epsilon < \epsilon_0(x_1), \\ |f(x_2,\epsilon) - \phi(x_2,\epsilon)| \leq \delta |\phi(x_2,\epsilon)| \quad \text{for} \quad 0 < \epsilon < \epsilon_0(x_2). \end{eqnarray} Edited: I understand the monotonicity of $\phi$ to be as follows: \begin{equation} |\phi(x,\epsilon_1)| \leq |\phi(x,\epsilon_2)| \quad\text{if}\quad \epsilon_1 < \epsilon_2, \end{equation} so $|\phi|$ decreases as a function of $\epsilon$ as $\epsilon$ decreases -- which is rather the opposite of what 'monotonically decreasing in $\epsilon$' usually means.

The monotonicity of $\phi$ in $\epsilon$ can now be used in two ways. First, for every $x_i$, you can estimate $|\phi(x_i,\epsilon)|$ by its value at the right endpoint of the $\epsilon$-interval: \begin{equation} |\phi(x_i,\epsilon)| \leq |\phi(x_i,\epsilon_0(x_i))| \quad \text{for} \quad 0 < \epsilon < \epsilon_0(x_i). \end{equation} Second, without loss of generality, you can assume that $\epsilon_0(x_2) > \epsilon_0(x_1)$, so that $(0,\epsilon_0(x_1)) \subset (0,\epsilon_0(x_2))$. Since $|\phi|$ is monotone in $\epsilon$ in the sense above, you can therefore estimate \begin{equation} |\phi(x_1,\epsilon_0(x_1))| \leq |\phi(x_1,\epsilon_0(x_2))|. \end{equation}

Edited: Therefore, to find a uniform upper bound for the $\epsilon$-interval, you should look at \begin{equation} \min_{a\leq x \leq b} \epsilon_0(x). \end{equation}
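To make the intended conclusion explicit (this is my reading of the argument; showing that the minimum is strictly positive is precisely the gap raised in the comments): write $\epsilon_* := \min_{a\leq x \leq b} \epsilon_0(x)$, and suppose $\epsilon_* > 0$. Then

```latex
\[
  0 < \epsilon < \epsilon_* \;\Longrightarrow\; 0 < \epsilon < \epsilon_0(x)
  \quad \text{for every } x \in [a,b],
\]
\[
  \text{and hence}\quad
  |f(x,\epsilon) - \phi(x,\epsilon)| \leq \delta\,|\phi(x,\epsilon)|
  \quad \text{for all } x \in [a,b],\ 0 < \epsilon < \epsilon_* ,
\]
```

which is exactly uniform validity of the approximation on $[a,b]$.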

  • Doesn't monotonically decreasing imply that if $\epsilon < \epsilon_0(x_i)$ then $|\phi(x_i,\epsilon)| \geq |\phi(x_i,\epsilon_0(x_i))|$, instead of what you wrote in your second equation? And similarly for your third equation (given your ordering $\epsilon_0(x_2) > \epsilon_0(x_1)$)? I am starting to believe that what both you and Holmes mean by $|\phi(x,\epsilon)|$ being a monotonically decreasing function of $\epsilon$ is that it decreases as $\epsilon \to 0$. Is that right?
    – alephbeta
    Commented Oct 6, 2015 at 20:38
  • Ah! I now understand the confusion. Yes, I didn't even think of that, but that's what I understand is going on -- that $\phi$ decreases as $\epsilon \to 0$. Which is rather the opposite of the usual definition of `$f$ is decreasing in $x$', which means that $f$ decreases as $x$ increases. Thanks for clearing it up! Commented Oct 7, 2015 at 10:45
  • Ok, that makes more sense, but I still can't get all the way through the proof (btw, thank you for your answer and your patience). So, I agree with your equations two and three given decreasing monotonicity of $|\phi(x,\epsilon)|$ as $\epsilon \downarrow 0$. Now, say I am not interested in finding the biggest interval in $\epsilon$ for which the approximation is uniform, rather just any such interval. Following the spirit of your answer, I should be able to prove that if $\epsilon_0(x_2) > \epsilon_0(x_1)$ then I could use $\epsilon_0(x_2)$ in place of $\epsilon_0(x_1)$ for the $x_1$ estimate?
    – alephbeta
    Commented Oct 7, 2015 at 15:14
  • As far as I can see, you can obtain a uniform bound by defining $\epsilon_* = \min_{a\leq x \leq b} \epsilon_0(x)$, which is precisely the opposite of my suggestion in the last part of the answer. I guess I was thrown off by the 'reverted' monotonicity of $\phi$ as well. I'll edit the answer. Commented Oct 7, 2015 at 15:47
  • But then we are back to having to show that $\min_{a \leq x \leq b} \epsilon_0(x)$ is not equal to zero.
    – alephbeta
    Commented Oct 7, 2015 at 16:00
