Consider $X:=(X_{1},\dots,X_{n})$, a multivariate Gaussian random vector, i.e. $X \sim \mathcal{N}_{n}(\mu, \Sigma)$.
Now let $E_{1} \subseteq \mathbb{R}^{n}$ be a linear subspace, and let
$P_{E_{1}}$ and $P_{E_{1}^{\perp}}$ denote the orthogonal projections onto $E_{1}$ and its orthogonal complement $E_{1}^{\perp}$, respectively.
Is it true that $P_{E_{1}}X$ and $P_{E_{1}^{\perp}}X$ are independent?
My idea:
Consider the random vector $$\overline{P}:=\begin{pmatrix} P_{E_{1}}X\\ P_{E_{1}^{\perp}}X\end{pmatrix}=\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot X$$
As a linear transformation of a multivariate Gaussian vector, $\overline{P}$ is again multivariate Gaussian, and its distribution is given by
$$ \overline{P} \sim \mathcal{N}\left(\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \mu, \begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \Sigma\cdot \begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}^{T}\right),$$
and since $\overline{P}$ is jointly Gaussian, independence of the two blocks is equivalent to the vanishing of the off-diagonal blocks of the covariance
$$\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \Sigma\cdot \begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}^{T}=\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \Sigma\cdot \begin{pmatrix} P_{E_{1}}^{T}&& P_{E_{1}^{\perp}}^{T}\end{pmatrix}$$
But I am not sure whether this expression actually helps me conclude independence. Any ideas?
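As a quick numerical sanity check of my idea (the concrete subspace and covariance matrices below are hypothetical, chosen only for illustration: $E_{1} = \operatorname{span}\{e_{1}, e_{2}\}$ in $\mathbb{R}^{4}$), one can compute the off-diagonal block $P_{E_{1}}\,\Sigma\, P_{E_{1}^{\perp}}^{T}$ directly; it vanishes when $\Sigma$ is the identity but need not vanish for a general $\Sigma$:

```python
import numpy as np

# Hypothetical example: E1 = span{e1, e2} in R^4, so P projects onto the
# first two coordinates and P_perp onto the last two.
n = 4
U = np.eye(n)[:, :2]            # orthonormal basis of E1 (columns)
P = U @ U.T                     # orthogonal projection onto E1
P_perp = np.eye(n) - P          # orthogonal projection onto E1^perp

# Case 1: Sigma = I. The cross-covariance block P Sigma P_perp^T reduces
# to P P_perp = 0, so the two projected vectors are uncorrelated (and,
# being jointly Gaussian, independent).
Sigma = np.eye(n)
cross = P @ Sigma @ P_perp.T
print(np.allclose(cross, 0))    # True

# Case 2: a generic positive-definite Sigma. The block need not vanish.
A = np.array([[2., 1., 0., 0.],
              [1., 2., 1., 0.],
              [0., 1., 2., 1.],
              [0., 0., 1., 2.]])
Sigma2 = A @ A.T                # symmetric positive definite
cross2 = P @ Sigma2 @ P_perp.T
print(np.allclose(cross2, 0))   # False
```

This suggests the answer depends on $\Sigma$: the off-diagonal block always vanishes when $\Sigma$ is a multiple of the identity, but not in general.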