Let $Y$, $X_1$ and $X_2$ be three continuous real random variables, let $f(x_1, x_2) > 0$ denote the joint density of $(X_1, X_2)$, assumed strictly positive everywhere on $\mathbb{R}^2$, and write $g(x_1, x_2) = E[Y \mid X_1 = x_1, X_2 = x_2]$. Then $g(0,0) = E[Y \mid X_1 = 0, X_2 = 0]$ is well defined.
Suppose we apply a transformation $h(\cdot, \cdot)$ to $(X_1, X_2)$ such that $h(x_1, x_2) = 0$ iff $x_1 = 0$ and $x_2 = 0$. Suppose further that the resulting density of $Z = h(X_1, X_2)$, denote it by $f_z(z)$, satisfies $f_z(0) = 0$, while there exists a neighborhood $N_0$ of $z = 0$ such that $f_z(z) > 0$ for all $z \in N_0 \setminus \{0\}$.
My question is: is it true that $E[Y \mid Z = 0] = E[Y \mid X_1 = 0, X_2 = 0] = g(0,0)$? The problem is that $E[Y \mid Z = 0]$ is not well defined, because $f_z(0) = 0$, so the latter equality is meaningless as stated. But is there perhaps a way to approximate $g(0,0)$ by exploiting the fact that $E[Y \mid Z = z]$ is well defined in a punctured neighborhood of $z = 0$, say via the limit $\lim_{z \to 0} E[Y \mid Z = z]$? Would imposing smoothness conditions on $g(x_1, x_2)$ help?
I encountered this problem while studying local polynomial estimators with transformed variables, but it could also apply to estimation of a regression function conditional on a univariate variable whose density vanishes at the evaluation point.
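To make the setting concrete, here is a minimal simulation sketch of what I have in mind. All the specific choices are my own illustration, not part of the question: $g(x_1, x_2) = x_1 + x_2^2$, standard normal $X_1, X_2$, and $h(x_1, x_2) = \sqrt{x_1^2 + x_2^2}$, so that $h = 0$ iff both arguments are zero and $f_z$ (a Rayleigh-type density) vanishes at $0$ but is positive nearby. A Nadaraya-Watson smoother then estimates $E[Y \mid Z = z_0]$ at small $z_0 > 0$, to be compared with $g(0,0) = 0$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the question):
# g(x1, x2) = x1 + x2**2, h(x1, x2) = sqrt(x1**2 + x2**2).
# With standard normal X's, f_z(0) = 0 but f_z(z) > 0 for z > 0.
def g(x1, x2):
    return x1 + x2**2

n = 200_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = g(x1, x2) + rng.standard_normal(n)   # Y = g(X1, X2) + noise
z = np.hypot(x1, x2)                     # Z = h(X1, X2)

def nw(z0, bandwidth):
    """Nadaraya-Watson estimate of E[Y | Z = z0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((z - z0) / bandwidth) ** 2)
    return np.sum(w * y) / np.sum(w)

print("g(0,0) =", g(0.0, 0.0))
for z0 in (0.5, 0.2, 0.1):
    print(f"E[Y | Z = {z0}] ~ {nw(z0, bandwidth=0.05):.3f}")
```

In this particular example one can check analytically that $E[Y \mid Z = z] = z^2/2$ (the angle is uniform given the radius), so the estimates shrink toward $g(0,0) = 0$ as $z_0 \to 0$; my question is what conditions on $h$ and $g$ make this limit argument valid in general.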