Assume linearly polarized light passes through a retardation crystal and a quarter-wave plate, where the angle between the polarization direction and the principal axis is 45 degrees. The question is to calculate the intensity difference between the $x$ and $y$ axes of the transmitted light. Here are two ways to calculate it.
Assume the light is originally along the $x$ axis, so its Jones vector is $\begin{pmatrix} 1\\ 0 \end{pmatrix}$. Then rotate the retardation matrix of the crystal, $W$, and that of the quarter-wave plate, $Q$, by 45 degrees so that they are expressed in the same basis, namely $W' = R(-\theta)WR(\theta)$ and $Q' = R(-\theta)QR(\theta)$, where $W = \begin{pmatrix} e^{-i\Gamma/2} & 0\\ 0 & e^{i\Gamma/2} \end{pmatrix}$ and $R(\theta) = \begin{pmatrix} \cos\theta & \sin\theta\\ -\sin\theta & \cos\theta \end{pmatrix}$. The final vector is then $Q'W'\begin{pmatrix} 1\\ 0 \end{pmatrix}$, and $|E_x|^2 - |E_y|^2$ comes out proportional to $\sin\Gamma$, which is the theoretical basis of electro-optic sampling.
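For concreteness, the first calculation can be checked numerically. This is a sketch with numpy, where I assume the quarter-wave plate takes the same diagonal form as $W$ with $\Gamma = \pi/2$, i.e. $Q = \mathrm{diag}(e^{-i\pi/4}, e^{i\pi/4})$, and pick a sample value $\Gamma = 0.3$; the result agrees with $\sin\Gamma$ up to a sign that depends on the phase convention:

```python
import numpy as np

def R(theta):
    """Rotation matrix as defined above: R(theta) = [[cos, sin], [-sin, cos]]."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

Gamma = 0.3  # sample retardation of the crystal (assumed value for the check)
W = np.diag([np.exp(-1j * Gamma / 2), np.exp(1j * Gamma / 2)])
Q = np.diag([np.exp(-1j * np.pi / 4), np.exp(1j * np.pi / 4)])  # QWP: Gamma = pi/2

theta = np.pi / 4                 # 45 degrees
Wp = R(-theta) @ W @ R(theta)     # W' = R(-theta) W R(theta)
Qp = R(-theta) @ Q @ R(theta)     # Q' = R(-theta) Q R(theta)

E = Qp @ Wp @ np.array([1, 0])    # input light along the x axis
diff = abs(E[0])**2 - abs(E[1])**2
print(diff, np.sin(Gamma))        # |Ex|^2 - |Ey|^2 equals -sin(Gamma) here
```

With this convention the difference is exactly $-\sin\Gamma$, i.e. proportional to $\sin\Gamma$ as stated.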
Keep $W$ and $Q$ as they are, consistent with the lab axes, but set the linearly polarized light to $\begin{pmatrix} 1\\ 1 \end{pmatrix}$, which means it is polarized at 45 degrees. Now we can multiply them directly because they share the same basis, namely the final vector is $QW\begin{pmatrix} 1\\ 1 \end{pmatrix}$, but now $|E_x|^2 - |E_y|^2$ is calculated to be $0$.
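The second calculation can be checked the same way (again assuming $Q = \mathrm{diag}(e^{-i\pi/4}, e^{i\pi/4})$ and a sample $\Gamma = 0.3$; the overall normalization of the input vector does not affect whether the difference vanishes):

```python
import numpy as np

Gamma = 0.3  # same sample retardation as before (assumed value for the check)
W = np.diag([np.exp(-1j * Gamma / 2), np.exp(1j * Gamma / 2)])
Q = np.diag([np.exp(-1j * np.pi / 4), np.exp(1j * np.pi / 4)])  # QWP: Gamma = pi/2

E = Q @ W @ np.array([1, 1])      # 45-degree polarized input in the lab basis
diff = abs(E[0])**2 - abs(E[1])**2
print(diff)                       # 0: diagonal matrices only add a phase to each component
```

This makes the puzzle explicit: since $Q$ and $W$ are both diagonal in this basis, each component of the input only picks up a phase, so $|E_x|^2 - |E_y|^2$ is identically zero.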
What's wrong?