Recall that the moment generating function of an $\mathcal{N}(\mu, \sigma^2)$ random variable is
$$M(t) = \exp\left(\mu t + \frac{1}{2}\sigma^2t^2\right). \tag{1}$$
By hypothesis, $X$ and $Y$ are independent $\mathcal{N}(\mu, \sigma^2)$ random variables, so $X + Y \sim \mathcal{N}(2\mu, 2\sigma^2)$ and $X - Y \sim \mathcal{N}(0, 2\sigma^2)$. Therefore by $(1)$, we have:
$$M_{X + Y}(t) = \exp\left(2\mu t + \sigma^2 t^2\right), \; M_{X - Y}(t) = \exp\left(\sigma^2 t^2\right).$$
On the other hand, the joint MGF of the random vector $(X + Y, X - Y)$ can be computed directly from the definition:
\begin{align}
& M_{(X + Y, X - Y)}(t_1, t_2) \\
= & E[\exp(t_1(X + Y) + t_2(X - Y))] \\
= & E\left\{\exp[(t_1 + t_2)X] \times \exp[(t_1 - t_2)Y]\right\} \\
= & E\left\{\exp[(t_1 + t_2)X] \right\}\times E\left\{\exp[(t_1 - t_2)Y]\right\} \quad \text{by independence of $X$ and $Y$.}\\
= & M_X(t_1 + t_2) M_Y(t_1 - t_2) \\
= & \exp\left(\mu(t_1 + t_2) + \frac{1}{2}\sigma^2(t_1 + t_2)^2\right)\exp\left(\mu(t_1 - t_2) + \frac{1}{2}\sigma^2(t_1 - t_2)^2\right) \\
= & \exp\left(2\mu t_1 + \frac{1}{2}\sigma^2\left[(t_1 + t_2)^2 + (t_1 - t_2)^2\right]\right) \\
= & \exp\left(2\mu t_1 + \sigma^2 t_1^2\right)\exp\left(\sigma^2 t_2^2\right) \\
= & M_{X + Y}(t_1) M_{X - Y}(t_2).
\end{align}
Since the joint MGF of $(X + Y, X - Y)$ factors into the product of the marginal MGFs, the uniqueness theorem for MGFs implies that $X + Y$ and $X - Y$ are independent.
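As a sanity check (not part of the proof), the factorization above can be verified numerically: simulate i.i.d. normals and compare the empirical joint MGF of $(X+Y, X-Y)$ at a test point $(t_1, t_2)$ with the product of the empirical marginal MGFs and with the closed form derived above. The values of $\mu$, $\sigma$, and $(t_1, t_2)$ below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 2.0, 1_000_000

# i.i.d. draws X, Y ~ N(mu, sigma^2)
X = rng.normal(mu, sigma, n)
Y = rng.normal(mu, sigma, n)
S, D = X + Y, X - Y  # the sum and the difference

# Arbitrary test point (t1, t2) for evaluating the MGFs.
t1, t2 = 0.3, -0.2

# Empirical joint MGF, empirical product of marginals, and the
# closed form M_{X+Y}(t1) * M_{X-Y}(t2) from the derivation.
joint = np.mean(np.exp(t1 * S + t2 * D))
product = np.mean(np.exp(t1 * S)) * np.mean(np.exp(t2 * D))
theory = np.exp(2 * mu * t1 + sigma**2 * t1**2) * np.exp(sigma**2 * t2**2)

print(joint, product, theory)  # all three should agree closely
```

With $10^6$ samples the Monte Carlo error is small, so all three quantities should match to within a fraction of a percent; as a weaker consequence of independence, the sample correlation of $S$ and $D$ should also be near zero.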