$\begingroup$

Assume a multilayer stack of alternating high and low refractive index layers. Now imagine that the high-index layers are made successively thinner. At what point does a single layer stop acting as a distinct interface that reflects light and instead become part of the structure as a whole? That is, when does the stack behave as a homogeneous medium with an effective (averaged) refractive index, so that light no longer sees the individual layers as separate interfaces?

$\endgroup$

2 Answers

$\begingroup$

The "classical" answer for that cutoff is when the film's thickness drops below a quarter of a wavelength, measured inside the film, at the wavelength of interest.
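As a quick sketch, the quarter-wave criterion can be evaluated numerically; the physical thickness is $d = \lambda / (4n)$, where $\lambda$ is the vacuum wavelength and $n$ the film's refractive index. The material values below (a TiO2-like high-index layer at 550 nm) are illustrative assumptions, not from the question:

```python
def quarter_wave_thickness(wavelength_nm: float, n_film: float) -> float:
    """Physical thickness d = lambda / (4 * n) of a quarter-wave layer.

    Below roughly this thickness, the layer stops acting as a distinct
    reflecting film (the classical cutoff described above).
    """
    return wavelength_nm / (4.0 * n_film)

# Example: a TiO2-like high-index layer (n ~ 2.4) at 550 nm (assumed values)
print(round(quarter_wave_thickness(550.0, 2.4), 1))  # -> 57.3 (nm)
```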

$\endgroup$
$\begingroup$

Very thin films are black; see, for example, a soap bubble just before it bursts.

There are two ways to understand this. One is by looking at reflection from surfaces. There is then a $180^\circ$ phase difference between the reflections from the front and back surfaces (if the film is embedded in the same material on both sides). The two reflected waves interfere destructively when the thickness is small compared to the wavelength in the material.

But surfaces are a bit of a mathematical abstraction. A better picture is to treat reflection as coherent scattering from the constituent atoms of the layer. The reflected intensity then clearly goes to zero as the number of scatterers decreases: for layers much thinner than the wavelength, the reflected field strength is proportional to the thickness, and the intensity to the thickness squared.
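The thickness-squared scaling can be checked with a standard two-interface (Airy) summation of the Fresnel reflections at normal incidence. This is a minimal sketch, assuming a water-like film ($n = 1.33$) in air at 550 nm; doubling a very small thickness should roughly quadruple the reflected intensity:

```python
import cmath
import math

def film_reflectance(n_out: float, n_film: float,
                     d_nm: float, wavelength_nm: float) -> float:
    """Intensity reflectance of a single film embedded in medium n_out,
    at normal incidence (Airy summation of the two interface reflections)."""
    r1 = (n_out - n_film) / (n_out + n_film)  # front interface amplitude
    r2 = -r1                                  # back interface: film -> same medium
    beta = 2 * math.pi * n_film * d_nm / wavelength_nm  # one-way phase in film
    r = (r1 + r2 * cmath.exp(2j * beta)) / (1 + r1 * r2 * cmath.exp(2j * beta))
    return abs(r) ** 2

lam = 550.0                # vacuum wavelength in nm (assumed)
n_out, n_film = 1.0, 1.33  # air and a water-like soap film (assumed)

R1 = film_reflectance(n_out, n_film, 1.0, lam)  # 1 nm film
R2 = film_reflectance(n_out, n_film, 2.0, lam)  # 2 nm film
print(round(R2 / R1, 2))   # ratio is about 4: intensity scales as thickness squared
```

Since the two interface reflections cancel to first order, the residual reflected field grows linearly with thickness, which is why the intensity ratio comes out near 4.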

$\endgroup$
