The first question I have is: for random variables $X$ and $Y$ that are conditionally independent given $Z$, does $\mathbb E [X | Y,Z] = \mathbb E [X | Z]$?
I also wanted to know: if $X$ is independent of $Y$ and $X$ is independent of $Z$, does $\mathbb E [X | Y,Z] = \mathbb E [X]$? What if $Y$ and $Z$ are also independent, so all three random variables are pairwise independent?
For the first question, I know that if $\sigma(Y)$ were independent of the smallest sigma-algebra containing $\sigma(X)$ and $\sigma(Z)$, then the claim would hold, but conditional independence doesn't imply this. I still think the statement is true, but I'm not sure how to prove it rigorously. I did show it using the density formula for conditional expectation together with $p(x|y,z) = \frac{p(x,y|z)}{p(y|z)}$, but I am not sure whether this argument is correct.
\begin{align} \mathbb E [X | Y=y,Z=z] &= \int x\, p(x|y,z)\,dx = \int x\,\frac{p(x,y|z)}{p(y|z)}\,dx \\ \text{(using conditional independence)}\quad &= \int x\,\frac{p(x|z)\,p(y|z)}{p(y|z)}\,dx = \int x\, p(x|z)\,dx = \mathbb E [X | Z=z] \end{align}
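As a sanity check (my own addition, not part of the question), here is a small discrete example in Python: given $Z$, the variables $X$ and $Y$ are independent Bernoullis, so they are conditionally independent given $Z$, and $\mathbb E[X \mid Y=y, Z=z]$ computed directly from the joint pmf agrees with $\mathbb E[X \mid Z=z]$ for every $(y,z)$. The parameters `px`, `py` are arbitrary choices.

```python
from itertools import product

# Z ~ Bernoulli(1/2); given Z = z, X ~ Bernoulli(px[z]) and
# Y ~ Bernoulli(py[z]) independently, so X and Y are
# conditionally independent given Z.
px = {0: 0.3, 1: 0.8}
py = {0: 0.6, 1: 0.2}

def p(x, y, z):
    # joint pmf: p(x, y, z) = p(z) p(x|z) p(y|z)
    return 0.5 * (px[z] if x == 1 else 1 - px[z]) \
               * (py[z] if y == 1 else 1 - py[z])

for y, z in product([0, 1], [0, 1]):
    # E[X | Y=y, Z=z] = sum_x x p(x,y,z) / sum_x p(x,y,z)
    e_x_given_yz = (sum(x * p(x, y, z) for x in [0, 1])
                    / sum(p(x, y, z) for x in [0, 1]))
    # E[X | Z=z] = P(X=1 | Z=z) = px[z]
    assert abs(e_x_given_yz - px[z]) < 1e-12
```

Of course a finite check is not a proof, but it shows the density manipulation above producing the right answer on a concrete joint distribution.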
For the second question, I don't believe it is true, since the smallest sigma-algebra containing $\sigma(Y)$ and $\sigma(Z)$ can carry more information than either of them alone; pairwise independence of $X$ with $Y$ and with $Z$ doesn't force $X$ to be independent of the pair $(Y,Z)$. And I'm not sure what the expression becomes when $Y$ and $Z$ are also independent.
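To illustrate the doubt above (my own example, not from the question), here is the standard XOR counterexample in Python: $Y, Z$ are independent fair coins and $X = Y \oplus Z$. All three variables are pairwise independent, yet $X$ is a deterministic function of $(Y,Z)$, so $\mathbb E[X | Y, Z] \in \{0,1\}$ while $\mathbb E[X] = 1/2$.

```python
from itertools import product

# Classic counterexample: Y, Z independent fair coins, X = Y XOR Z.
# Each outcome (x, y, z) below has probability 1/4.
outcomes = [(y ^ z, y, z) for y, z in product([0, 1], [0, 1])]

def prob(pred):
    # probability of an event under the uniform measure on outcomes
    return sum(0.25 for o in outcomes if pred(o))

# X is Bernoulli(1/2) and pairwise independent of Y and of Z:
assert prob(lambda o: o[0] == 1) == 0.5                 # P(X=1) = 1/2
assert prob(lambda o: o[0] == 1 and o[1] == 1) == 0.25  # = P(X=1)P(Y=1)
assert prob(lambda o: o[0] == 1 and o[2] == 1) == 0.25  # = P(X=1)P(Z=1)

# But E[X | Y=y, Z=z] = y XOR z, which is 0 or 1, never E[X] = 1/2:
for y, z in product([0, 1], [0, 1]):
    assert (y ^ z) != 0.5
```

So pairwise independence (even of all three variables) is not enough; what you need is independence of $X$ from the joint pair $(Y,Z)$, i.e. from $\sigma(Y,Z)$, for $\mathbb E[X|Y,Z] = \mathbb E[X]$ to hold.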
Thanks in advance for the help!