I am currently working through the following problem for my optics class. The setup shown below consists of a double slit sandwiched between two converging lenses L1 and L2. A point source is placed on the object focal plane of L1, and a screen is placed on the image focal plane of L2 in order to observe the interference pattern created by the two slits. The light propagates through air, so I'm approximating $n \approx 1$.
First, the case with the point source directly at the focal point F1.
My understanding is that due to the geometry of the lens, the spherical wavefronts created by the point source are deformed into plane waves. I drew the wavefronts in blue and the direction of propagation as the red rays.
Each of the two small slits then acts as a point source due to diffraction, and these two spherical waves interfere. Any two parallel rays emerging from the secondary sources S1 and S2 would intersect and interfere infinitely far away, and the lens L2 allows us to 'project' this interference at infinity onto its focal plane.
For each point P on the screen there is a pair of parallel rays, one from each slit, that would interfere at infinity; L2 focuses that pair at P. Finding the path difference between the two rays is therefore no different from considering a screen very far away: placing the lens behind the slits is equivalent to taking the limit where the viewing screen is infinitely far from the slits.
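To make that 'projection' concrete, my own bookkeeping (introducing $f_2$ for the focal length of L2, a symbol not given in the problem) is that the pair of parallel rays inclined at $\theta$ gets focused at the height

$$x_P = f_2\tan(\theta) \approx f_2\,\theta$$

on the screen, so each point of the screen corresponds to exactly one direction $\theta$.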
Since we have two coherent sources of equal intensity $I_0$ interfering, the intensity at the point P is:
$$I(P) = 2I_0(1 + \cos(\Delta \phi))$$
where $\Delta \phi(P)$ is the phase difference between the two waves at the point P, caused by the difference in path length $\delta_{2, 1}$ between ray 1 (R1) and ray 2 (R2). The optical path difference is $n\cdot\delta_{2, 1} \approx \delta_{2, 1}$ since $n \approx 1$.
$$\Delta \phi = \frac{2\pi}{\lambda}\delta_{2, 1} \qquad \delta_{2, 1} = a\sin(\theta)$$
where $a$ is the slit separation and $\theta$ the common inclination of the ray pair.
This much I understand: I can clearly see that R1 travels a shorter distance to reach P, especially when I think of the screen as being infinitely far from the slits; hence the phase difference. Where I get confused is when the point source is moved down vertically by a distance $b$.
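Before moving on, here is a small numerical sketch I wrote to convince myself of this first case (Python; the values of $\lambda$, $a$ and $f_2$ are placeholders I made up, not from the problem):

```python
import numpy as np

# Placeholder values, not from the problem statement (lengths in metres)
lam, a, f2, I0 = 633e-9, 0.5e-3, 0.20, 1.0

x = np.linspace(-1e-3, 1e-3, 20001)      # positions P on the screen
theta = np.arctan(x / f2)                # direction collected at each x
dphi = 2*np.pi/lam * a*np.sin(theta)     # phase difference for that direction
I = 2*I0*(1 + np.cos(dphi))              # two-source interference pattern

# Find the bright fringes and compare their spacing with lam*f2/a
peaks = np.where((I[1:-1] > I[:-2]) & (I[1:-1] > I[2:]))[0] + 1
print(f"measured fringe spacing: {np.diff(x[peaks]).mean()*1e3:.4f} mm")
print(f"expected lam*f2/a:       {lam*f2/a*1e3:.4f} mm")
```

The measured spacing comes out equal to $\lambda f_2/a$, so at least this first case I'm confident about.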
Point source on the object focal plane of L1, a distance $b$ below F1.
Once again the spherical wave is deformed into a plane wave; I traced some characteristic light rays to determine the direction of propagation of the resulting plane wave.
The wavefront is perpendicular to the direction of propagation, and it is also the set of equal-phase points of our plane wave. This time the wavefront is tilted by an angle $\theta'$ with respect to the plane of L1, which supposedly creates an additional path difference $\delta_{2,1}'$ between R1 (still the top ray) and R2 (the bottom ray).
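For reference, here is the geometry of that claim as I understand it (assuming an ideal thin lens, so that a point source a distance $b$ below F1 on the focal plane produces a plane wave tilted by $\theta'$, with $f_1$ the focal length of L1):

$$\tan(\theta') = \frac{b}{f_1} \qquad |\delta_{2,1}'| = a\sin(\theta') \approx \frac{ab}{f_1}$$

The magnitude I can read off from the tilted wavefront hitting the slit plane; the sign, i.e. which slit the wavefront reaches first, is exactly what I can't pin down.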
However, for the life of me, I just don't grasp how this path difference arises, or whether it is R1 or R2 that is 'behind', i.e. accumulating the additional phase.
I'm told that R1 is ahead, because R2 has to travel an additional distance $\delta_{2, 1}'$ from the surface of L1. These two path differences then add, so that:
$$\Delta \phi = \frac{2\pi}{\lambda}(\delta_{2, 1}+\delta_{2, 1}')$$
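To at least see what this formula predicts, I coded it up as well (again with placeholder numbers of my own choosing):

```python
import numpy as np

# Placeholder values, not from the problem statement (lengths in metres)
lam, a, I0 = 633e-9, 0.5e-3, 1.0
f1, f2, b  = 0.20, 0.20, 1e-3            # focal lengths and source shift

def intensity(x):
    theta   = np.arctan(x / f2)          # direction focused at screen position x
    delta   = a * np.sin(theta)          # ordinary two-slit path difference
    delta_p = a * b / f1                 # claimed extra term, small-angle theta'
    return 2*I0*(1 + np.cos(2*np.pi/lam * (delta + delta_p)))

# The central (m = 0) fringe should sit where delta + delta' = 0,
# i.e. sin(theta) = -b/f1, so the whole pattern shifts by about -f2*b/f1:
x0 = f2 * np.tan(np.arcsin(-b / f1))
print(f"predicted central fringe at x0 = {x0*1e3:.3f} mm")
print(f"I(x0) = {intensity(x0):.3f}  (maximum possible is {4*I0:.1f})")
```

So the claimed formula just rigidly shifts the fringe pattern by $-f_2 b/f_1$, which at least looks sensible; my problem is not with the prediction but with deriving $\delta_{2, 1}'$ in the first place.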
But then surely, by that logic, R2 would arrive at S2 before R1 arrives at S1, and the phase difference would cancel out: we can find $\delta_{2,1}'$ geometrically on the other side as well.
I fail to see why there is an additional component to the phase difference. At which point between the source, the lens L1, and the double slit do the two light rays go out of phase? It doesn't seem to me that either of the two rays travels any additional distance.
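Writing out the assumption I think I'm implicitly making (calling the point source S), which may be exactly where I go wrong:

$$[S \to S_1] \stackrel{?}{=} [S \to S_2]$$

i.e. that the optical path lengths from the source S through L1 to the two slits are equal, so the two rays arrive at S1 and S2 in phase. Is this the step that fails when the source is moved off axis?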
I've looked at this problem for hours, and at this point I'm not even confident I know what the light rays represent or how they relate to the phase of the underlying wave. How does the phase vary along these rays?