I would like to compare the performance of the Gasser-Müller estimator with other estimators for estimating the derivative $m'(x)$ of the regression function $m(x)$.

Let's say we have the following regression model, $$ Y_i = m(X_i) + \varepsilon_i $$

where:

  • $\{X_i\} \sim U([0,1])$
  • $\{\varepsilon_i\} \sim \mathcal{N}(0,1/4)$
  • $m(x) = \sin^3(2\pi x^3)$
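
For concreteness, here is a minimal sketch of how this model could be simulated in Python (the sample size `n` and the random seed are arbitrary illustrative choices, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)      # arbitrary seed
n = 200                             # arbitrary sample size

x = np.sort(rng.uniform(0.0, 1.0, size=n))     # X_i ~ U([0,1]), sorted so the s_i below are ordered
eps = rng.normal(0.0, 0.5, size=n)             # sd = 1/2, i.e. variance 1/4
m = lambda t: np.sin(2.0 * np.pi * t**3) ** 3  # m(x) = sin^3(2*pi*x^3)
y = m(x) + eps                                 # observed responses Y_i
```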

I know the Gasser-Müller estimator (with bandwidth $h$ and kernel $K$) of the regression function $m(x)$ is, $$ \hat{m}(x) = \sum_{i=1}^{n} Y_i \cdot \int_{s_{i-1}}^{s_i} \frac{1}{h} K \left(\frac{x-u}{h}\right)du $$

where the $X_i$ are taken in increasing order and, $$ s_0 = 0, \qquad s_i = \frac{X_i + X_{i+1}}{2}, \quad i = 1,\dots,n-1, \qquad s_n = 1 $$

Now, for the simulation we can observe that, writing $\mathcal{K}(t) = \int_{-\infty}^{t} K(v)\,dv$ for the CDF of the kernel, $$ \hat{m}(x) = \sum_{i=1}^{n} Y_i \cdot \left[ \mathcal{K} \left(\frac{x-s_{i-1}}{h}\right) - \mathcal{K} \left(\frac{x-s_{i}}{h}\right) \right] $$
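
As a sketch (not a definitive implementation), this CDF form could be coded as follows, assuming a Gaussian kernel so that $\mathcal{K}$ is `scipy.stats.norm.cdf`, and using the boundary knots $s_0 = 0$, $s_n = 1$ from above:

```python
from scipy.stats import norm

def gasser_mueller(x_grid, x, y, h):
    """Gasser-Mueller estimate of m on x_grid via the CDF form (Gaussian kernel assumed)."""
    s = np.concatenate(([0.0], (x[:-1] + x[1:]) / 2.0, [1.0]))   # s_0, midpoints, s_n
    # weight of Y_i at each grid point: K_cdf((x - s_{i-1})/h) - K_cdf((x - s_i)/h)
    upper = norm.cdf((x_grid[None, :] - s[:-1, None]) / h)
    lower = norm.cdf((x_grid[None, :] - s[1:, None]) / h)
    return y @ (upper - lower)
```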

Now, I want the estimator $\hat{m}'(x)$. We can use the fact that the derivative of the CDF is the PDF, so that by the chain rule (which brings out a factor $1/h$), $$ \hat{m}'(x) = \frac{1}{h}\sum_{i=1}^{n} Y_i \cdot \left[K \left(\frac{x-s_{i-1}}{h}\right) - K \left(\frac{x-s_{i}}{h}\right) \right] $$
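
Under the same assumptions (Gaussian kernel, the hypothetical helpers above), the derivative estimator just swaps the normal CDF for the normal PDF and divides by $h$:

```python
def gasser_mueller_deriv(x_grid, x, y, h):
    """Estimate of m'(x): derivative of the CDF form, hence the extra 1/h factor."""
    s = np.concatenate(([0.0], (x[:-1] + x[1:]) / 2.0, [1.0]))
    upper = norm.pdf((x_grid[None, :] - s[:-1, None]) / h)
    lower = norm.pdf((x_grid[None, :] - s[1:, None]) / h)
    return (y @ (upper - lower)) / h

# example usage (the bandwidth h = 0.05 is an arbitrary illustration)
grid = np.linspace(0.0, 1.0, 500)
m_hat = gasser_mueller(grid, x, y, h=0.05)
m_prime_hat = gasser_mueller_deriv(grid, x, y, h=0.05)
```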

Does that make sense?
