The wavefront will emerge from the slab tilted at an angle. The question asks about the deflection of the wave, and a wavefront is the locus of all points oscillating in phase. At the exit face of the slab, the ray leaving from the top is out of phase with the ray leaving from the bottom: the optical path length travelled near the top of the slab (AE) is greater than that near the bottom (BC), if we consider only the paths within the slab.
By finding the extra path (CD) that the bottom ray must travel outside the slab to make up for the phase difference, the question can be solved.
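To make this concrete (the symbols here are my own assumptions, not from the question: slab thickness $t$ along the ray, vertical separation $h = AB$, and refractive indices $n_A$, $n_B$ near the top and bottom), equating optical paths up to the tilted emergent wavefront gives

$$n_A t = n_B t + CD \quad\Rightarrow\quad CD = (n_A - n_B)\,t, \qquad \sin\theta = \frac{CD}{AB} = \frac{(n_A - n_B)\,t}{h},$$

where $\theta$ is the tilt of the emergent wavefront, i.e. the deflection of the ray.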
Note that because the refractive index varies vertically, the ray is deflected. However, we can treat an infinitesimally thin horizontal strip of the slab as having a uniform refractive index (and on this scale ray optics breaks down, so the wave picture is needed). The wave nature of light can be exploited by considering secondary point sources at A and B, as in Huygens' principle, and the oscillations within this strip. By analysing the optical path travelled by the wave in such a thin strip of the medium (along AE and BC) and beyond it, the situation becomes clearer. Ultimately, it is the accumulated phase difference that determines the direction in which the ray travels once we look at it in air.
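Here is a minimal numerical sketch of the phase-compensation argument above. All the numbers are hypothetical (not from the question): a linear index variation across the slab height is assumed, and the tilt is obtained from the condition that the extra geometric path CD in air cancels the optical-path lag between the top and bottom rays.

```python
import math

# Hypothetical numbers (not from the question): a 1 cm thick slab whose
# refractive index varies linearly from 1.6 at the top to 1.5 at the
# bottom, over a vertical height AB of 2 cm.
t = 0.01      # slab thickness along the ray (m)
h = 0.02      # vertical separation AB over which n varies (m)
n_top = 1.6   # refractive index near the top (along AE)
n_bot = 1.5   # refractive index near the bottom (along BC)

# Optical-path difference accumulated inside the slab (AE vs BC);
# this equals the extra path CD the bottom ray travels in air.
CD = (n_top - n_bot) * t

# The emergent wavefront tilts so that CD restores equal phase:
# sin(theta) = CD / AB.
theta = math.asin(CD / h)
print(f"CD = {CD * 1e3:.2f} mm, deflection = {math.degrees(theta):.2f} deg")
```

With these numbers the deflection comes out to a few degrees; the key point is that it depends only on the index difference, the slab thickness, and the height over which the index changes.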
Notes:
- As a previous user explained, the behaviour is like that of a prism.
- The extra path is denoted by the solid line CD in the diagram.
- Focus on treating light as a wave rather than a ray here.
I would love to hear from more experienced users about polarization in this case.