I have found a theorem (see below) in two papers and I am trying to figure out how it can be proved. The result seems intuitive, but I am not able to prove it rigorously.
Assumptions:
Consider a continuous stochastic process $(X_t)$ together with a Brownian motion $(B_t)$. The two processes are assumed to be independent. Their natural filtrations are denoted by $(\mathcal F_t^X)$ and $(\mathcal F_t^B)$.
Theorem:
Under the above assumption, the following relation holds: $$\mathbb E\left(\exp\left(\int_0^tX_sdB_s\right) \mid \mathcal F_t^X\right)=\exp\left(\frac12\int_0^tX_s^2ds\right)$$
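Before looking at the proofs, the identity can at least be checked numerically: conditioning on $\mathcal F_t^X$ amounts to freezing one path of $X$ and averaging over $B$ alone. A minimal Monte Carlo sketch (the choice $X_t = \tfrac12 W_t$ for a Brownian motion $W$ independent of $B$, and all step sizes, are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 200                    # horizon and number of time steps
dt = T / n

# One frozen path of X. Here X_t = 0.5 * W_t for a Brownian motion W
# independent of B -- this choice is only an illustration; any continuous
# path independent of B would do.
dW = rng.normal(0.0, np.sqrt(dt), n)
X = 0.5 * np.cumsum(dW)

# Conditioning on F_t^X amounts to freezing X and averaging over B alone.
m = 50_000                         # Monte Carlo samples of the B path
dB = rng.normal(0.0, np.sqrt(dt), (m, n))
ito_sum = dB @ X                   # discretized int_0^T X_s dB_s, one per sample

lhs = np.exp(ito_sum).mean()              # E[exp(int X dB) | F_T^X]
rhs = np.exp(0.5 * np.sum(X**2) * dt)     # exp(1/2 int_0^T X_s^2 ds)
print(lhs, rhs)                           # the two numbers should agree closely
```

On the discrete grid the identity is in fact exact, since conditionally on $X$ the sum $\sum_i X_{t_i}\,\Delta B_i$ is Gaussian with variance $\sum_i X_{t_i}^2\,\Delta t$; the only error is Monte Carlo noise.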
Proof 1
I have found a proof in Piterbarg, page 1 (see here for the main article). The proof is based on computing the expectation path by path: they assume that the path of $(X_t)$ is known and compute the integral.
However, I believe that this argument only works when the number of possible paths of $(X_t)$ is finite, using this result.
Proof 2
In Sin this result is proved (in particular) when $(X_t)$ is given by the dynamics $dX_t=X_td\tilde B_t$, where $(\tilde B_t)$ is a Brownian motion independent of $(B_t)$. They define the processes $$S_t=\exp\left(\int_0^tX_sdB_s-\frac12\int_0^tX_s^2ds\right)\quad\text{and}\quad S_t^{(n)}=S_{t\wedge\tau_n},$$ where the stopping time $\tau_n$ is defined by $$\tau_n=\inf\left\{t>0:\int_0^t X_s^2ds\geq n\right\}.$$ In the paper, they use the following step, based on the Lebesgue dominated convergence theorem: $$\mathbb E\left(\lim_{n\to\infty}S_T^{(n)}\right)=\lim_{n\to\infty}\mathbb E\left(S_T^{(n)}\right)$$
However, I don't directly see how the dominated convergence theorem can be applied in this case.
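The stopping time $\tau_n$ is easy to picture on a discretized path; a small sketch (Euler grid, GBM dynamics $dX_t=X_t\,d\tilde B_t$ as in Sin; the seed, horizon, and step sizes are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 1.0, 1000
dt = T / n
t = dt * np.arange(1, n + 1)       # grid times t_1 < ... < t_n = T

# X_t = exp(B~_t - t/2) solves dX = X dB~ with X_0 = 1 (Sin's dynamics);
# B~ is simulated on the grid and is independent of B.
dW = rng.normal(0.0, np.sqrt(dt), n)
X = np.exp(np.cumsum(dW) - 0.5 * t)

# Discretized "clock" A_t = int_0^t X_s^2 ds, and the stopping time
# tau_n = inf{t : A_t >= n} (np.inf when the level n is never reached by T).
clock = np.cumsum(X**2) * dt

def tau(level):
    hit = np.nonzero(clock >= level)[0]
    return t[hit[0]] if hit.size else np.inf

print(tau(0.1), tau(clock[-1] + 1.0))
```

The point of the localization is visible here: before $\tau_n$ the quadratic variation $\int_0^t X_s^2\,ds$ is bounded by $n$, so Novikov's condition applies to the stopped process $S^{(n)}$.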
Question:
Can someone explain to me why this result holds, or point me to a good reference for it?
Attempt of a proof:
Following this post, one can prove the result for simple processes, that is, when $$X_s =\sum_{j=1}^n 1_{[t_{j-1},t_j)}(s)\,\xi_j.$$ Now, we can take a sequence of simple processes $(X_s^n)$ converging to $(X_s)$ with respect to the sup norm: $$(X_s^n)\to(X_s)\quad(n\to\infty)$$ Using the Itô isometry, we obtain the following convergence in probability: $$\int_0^tX_s^ndB_s\to\int_0^tX_sdB_s\quad (n\to\infty)$$ For each subsequence, we can find a further subsequence (wlog $n_k$) along which the convergence holds almost surely: $$\int_0^tX_s^{n_k}dB_s\to\int_0^tX_sdB_s\quad (k\to\infty)$$ If the Lebesgue dominated convergence theorem were applicable, we could conclude that $$\mathbb E\left(\exp\left(\int_0^tX_sdB_s\right) \mid \mathcal F_t^X\right)=\exp\left(\frac12\int_0^tX_s^2ds\right).$$
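For completeness, the simple-process case can be sketched as follows (using the independence of $X$ and $B$ to freeze the $\xi_j$): conditionally on $\mathcal F_t^X$, the $\xi_j$ are constants while the increments of $B$ remain independent Gaussians, so $$\int_0^t X_s\,dB_s=\sum_{j=1}^n\xi_j\left(B_{t_j\wedge t}-B_{t_{j-1}\wedge t}\right)\sim\mathcal N\left(0,\ \sum_{j=1}^n\xi_j^2\,(t_j\wedge t-t_{j-1}\wedge t)\right)=\mathcal N\left(0,\ \int_0^t X_s^2\,ds\right),$$ and the Gaussian moment generating function $\mathbb E\,e^Z=e^{\sigma^2/2}$ for $Z\sim\mathcal N(0,\sigma^2)$ then yields the claimed identity for simple processes.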
However, I can't find an integrable dominating function.
Remark: What I basically did above is use the following property, which is true when $X$ is a discrete-valued random variable (see here).
The conditional expectation of $Y$ given $X$ is given by $\mathbb E(Y \mid \sigma(X))=f(X)$ where $f(x)=E(Y \mid X=x)$.
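This "freezing" property is easy to verify numerically for a discrete $X$; a small sketch (the choice $Y=g(X,Z)=e^{XZ}$ with $Z$ standard normal independent of $X$, so that $f(x)=\mathbb E\,e^{xZ}=e^{x^2/2}$, is only an illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 400_000

# Discrete X independent of a standard normal Z, with Y = g(X, Z) = exp(X*Z).
# The freezing property says E(Y | X = x) = E exp(x*Z) = exp(x^2 / 2).
# (The choice of g and of the values of X is only an illustration.)
X = rng.choice([0.5, 1.0], size=m)
Z = rng.normal(size=m)
Y = np.exp(X * Z)

for x in (0.5, 1.0):
    # empirical conditional mean on {X = x} vs. the closed form f(x)
    print(x, Y[X == x].mean(), np.exp(x**2 / 2))
```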
This computation seems to work above by conditioning on the process $(X_t)$, even though it is not mathematically rigorous. In my view, this way of reasoning would fail in the following case, if we claimed that $$\mathbb E\left(\exp\left(\int_0^tX_sdB_s\right) \mid \mathcal F_t^B\right)=f((B_s)_{0<s<t})$$ where $$f((b_s)_{0<s<t})=\mathbb E\left(\exp\left(\int_0^tX_sdB_s\right) \mid(B_s)_{0<s<t}=(b_s)_{0<s<t}\right)=\mathbb E\left(\exp\left(\int_0^tX_sdb_s\right)\right)$$
Namely, we would (almost surely) be integrating with respect to a realization $(b_s)_{0<s<t}$ of the Brownian path, which is not of locally bounded variation; thus the integral $\int_0^tX_sdb_s$, and hence the function $f$, is not well defined.