$\begingroup$

I am using a lens-testing interferometer, where I record 4 to 5 interferograms with a 90$^{\circ}$ phase step between consecutive interferograms. In addition, I have written a Python script that simulates the process using computer-generated interferograms, trying to make everything as realistic as possible.

An important criterion in my analysis is the per-pixel data modulation (fringe visibility) $\gamma$. According to the literature (Malacara), the data modulation for the 5-step Hariharan phase-stepping algorithm that I am using can be calculated from the five recorded interferograms as:

$$\gamma = \frac{3[4(I_4-I_2)^2+(I_1+I_5-2I_3)^2]^\frac{1}{2}}{2(I_1+I_2+2I_3+I_4+I_5)}$$
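For reference, this is how I evaluate the formula per pixel in my script (a minimal sketch with NumPy; the function name is my own):

```python
import numpy as np

def hariharan_visibility(I1, I2, I3, I4, I5):
    """Per-pixel fringe visibility from five 90-degree phase-stepped frames."""
    num = 3.0 * np.sqrt(4.0 * (I4 - I2) ** 2 + (I1 + I5 - 2.0 * I3) ** 2)
    den = 2.0 * (I1 + I2 + 2.0 * I3 + I4 + I5)
    return num / den
```

The frames can be 2-D arrays (full camera images); the expression is evaluated element-wise.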

In my simulations the fringe contrast is a constant 1.0, as expected. However, when I calculate and plot the fringe contrast from real interferograms, the average value is about 0.35 and there is a visible fringe pattern across the plot, meaning that something is modulating the fringe contrast.
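The ideal case I simulate looks roughly like this (a minimal sketch; the tilt-fringe phase map and parameter names are my own, not the actual script): with exact 90° steps and unit contrast, the Hariharan formula returns 1.0 at every pixel, which is the baseline the real data deviates from.

```python
import numpy as np

# Ideal simulated interferograms: unit fringe contrast, exact 90-degree steps.
N = 256
x, y = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N))
phi = 2 * np.pi * (3 * x + 2 * y)        # arbitrary tilt-fringe phase map
A, gamma_true = 1.0, 1.0                 # bias and modulation
I1, I2, I3, I4, I5 = [A * (1 + gamma_true * np.cos(phi + k * np.pi / 2))
                      for k in range(5)]

num = 3 * np.sqrt(4 * (I4 - I2) ** 2 + (I1 + I5 - 2 * I3) ** 2)
den = 2 * (I1 + I2 + 2 * I3 + I4 + I5)
gamma = num / den                        # constant 1.0 everywhere in the ideal case
```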

My question is: what are the possible causes (error sources) of this fringe pattern, the ripples, and the overall drop in fringe visibility?

[Plot: Cross section along the center of the fringe visibility plot]

[Plot: Visibility plot from real interferograms]

$\endgroup$
