14
$\begingroup$

I am looking up through the glazed roof of my building, and I notice that the sky, which has only a few white clouds, sometimes looks completely gray, as if it were covered by a huge layer of gray clouds, though it is not.

By observing it and waiting for the change to happen, I notice it takes a matter of seconds: when the sun shines directly on the building (but not directly on me), the blue sky turns gray. I wonder if this is just an optical illusion, because my perception of colour could be relative to its surroundings, or if there is a real glass/sun/refraction effect or something at work. I haven't found anything on the matter on the internet or on Stack Exchange, so I'm asking what you think about it.

Additional information: the top window is made of a rather peculiar glass. It is basically a reinforced pane (being almost horizontal, it has to withstand impacts) that looks as if it were covered by a layer of opaque black with large holes (the area of the holes is greater than the area of the black; from a distance you don't see the holes). Also, I'm taking the pictures from a windowed office, so you see the sky through two panes, but the one closest to me looks ordinary and I didn't notice a difference when looking through that single layer. You will also notice a blue light, a neon-like tube directed at the roof, but during the day its effect is negligible.

Here are the pictures: first the sky through the top window without sun hitting the window, then with the sun. (The colour/exposure differences don't seem to matter much compared to my naked-eye observation. The pictures were taken only a few seconds apart; you can see the same cloud in both. The gray sky here isn't the grayest I've observed; it sometimes gets even farther from blue.)

Sky through top window without sun hitting the window

Sky through top window with sun hitting the window

$\endgroup$
3
  • 1
    $\begingroup$ Interesting indeed! My initial response is that the direct sunlight causes the glass's own color (or that of the overlayer you describe) to 'override' the relatively weak intensity coming from the blue sky itself, thus acting like a bright but gray overlay on the scene. $\endgroup$ Commented Aug 11, 2014 at 13:41
    $\begingroup$ @CarlWitthoft It's the most plausible answer to me, but the difference is so astonishing it seems weird to think that the gray is just a "darker blue". Could the difference be intensity alone? Now that the sky is shining at me through the window, I can see the windows are dirty, with something like a dried puddle of oil (unlikely to actually be oil, of course). If it is pollution chemicals, could they play some role in the process I'm trying to understand? I'll look at the wavelengths of sky-blue colour and do some research. $\endgroup$
    – PhilDenfer
    Commented Aug 11, 2014 at 14:02
  • $\begingroup$ @CarlWitthoft Turns out you are demonstrably correct. $\endgroup$
    – alemi
    Commented Aug 12, 2014 at 0:23

2 Answers

19
$\begingroup$

Let's try to validate and quantify the conjecture first raised by Carl Witthoft in a comment to the question, which is basically that the sky only appears less blue in the second picture because a lot more light is scattering off the windows towards your camera.

If this is true, we ought to be able to see it. The first thing to do is convert the pictures from the relatively uninformative RGB colorspace to the much more useful XYZ colorspace, which is built on a model of the actual receptors in the human eye. The $Y$ coordinate corresponds to the perceived luminance of the image (i.e. the average human response across the visible spectrum), and the $Z$ coordinate corresponds to our blue-receptor response. The $X$ coordinate is set to pick up the slack and doesn't necessarily have a clear physical interpretation. The responses across the visible spectrum are shown here (from Wikipedia):

XYZ color coordinates in terms of spectral response

So, that is the first thing I did. I obtained:

Decomposition of two images

Above you will see the two original pictures, as well as their $Y$ and $Z$ values. Here we can clearly see that the total illumination ($Y$) in the Gray picture has gone up, and the blue content of the image ($Z$) has gone up as well.
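The actual notebook code is linked at the end of this answer; as a minimal sketch of the colorspace step, here is the standard sRGB → XYZ conversion using only NumPy (`srgb_to_xyz` is a name I made up for illustration):

```python
import numpy as np

def srgb_to_xyz(rgb):
    """Convert sRGB values in [0, 1] to CIE XYZ (D65 white point)."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB gamma curve to recover linear light intensities
    linear = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Standard linear-sRGB -> XYZ matrix (D65 reference white)
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    return linear @ M.T

# A pure-blue pixel: low luminance (Y) but a large blue response (Z)
X, Y, Z = srgb_to_xyz([0.0, 0.0, 1.0])
```

Note how a pure-blue pixel contributes only about 7% of maximum luminance ($Y \approx 0.07$) while driving the blue response ($Z$) near its maximum, which is exactly the asymmetry this analysis exploits.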

Let's try to take a closer look. To do that I will next look at a histogram of the $Y$ and $Z$ values in the images:

Histogram of $Y$ and $Z$ response

Looking at this histogram of values, we can clearly see that at the middle levels (near ~0.5) both images have a blue hump. Let's assume that is the sky (we'll check in a second). Notice also that, if anything, that blue hump has shifted up a bit. Nearby is a hump in the luminance ($Y$), which appears to move a lot. But there is a lot going on in the image, and if the conjecture is right and there is more light coming in through the windows, we would expect everything in the picture to be brighter, including the columns and wall. So we need to isolate the sky; let's make a cut on the image given by those humps in the blue. I've shown my choices for the cuts as the vertical dashed lines in the figure. Applying that cut to the original images, we obtain:

Attempting to find just the sky
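The cut itself amounts to a simple boolean mask on the $Z$ channel. A sketch with invented thresholds and random stand-in data (the real cut values were chosen by eye from the histograms above):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.random((100, 100))           # stand-in for an image's Z (blue) channel
z_lo, z_hi = 0.45, 0.65              # illustrative cut positions (the dashed lines)
sky_mask = (Z > z_lo) & (Z < z_hi)   # True where a pixel is classified as "sky"
sky_fraction = sky_mask.mean()       # share of pixels that pass the cut
```

Once you have `sky_mask`, every later histogram is just the same plot restricted to `image[sky_mask]`.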

Absolutely wonderful! We've just developed a nearly perfect sky filter. Now that we know which pixels correspond to the sky, we can look again at our histograms, but this time only for "sky" pixels.

Histograms of $Y$ and $Z$ for sky pixels

And now it would appear there is no denying Carl Witthoft's explanation: the sky appears less blue in the "Gray Sky" picture not because any of the blue has gone away (if anything there is more blue content) but because there is so much more light beyond the blue coming from those points that it no longer looks blue. For completeness, let's look at the histograms of the RGB channels for just the sky pixels:

RGB Histograms of Sky

Here we can clearly see that it is not that the blue went away; we just have a heck of a lot more red and green coming from the windows now.

But why does it look so much less blue, when the values of the red and green channels are still smaller than the blue?

That is entirely an effect of human perception. We are much less sensitive to blue light than to green. If you look again at the plot at the top of this answer, remember that the $Y$ curve was chosen to match the perceptual sensitivity of human subjects across the visible spectrum. Notice how little it overlaps with blue.

In fact, a common formula used to convert images to grey scale (a rougher approximation than the XYZ transformation, but easy to compute) is:

$$ L = 0.21 R + 0.72 G + 0.07 B $$

This demonstrates the issue with just three numbers: roughly 72% of what we perceive as brightness comes from the green channel, 21% from the red, and only 7% from the blue. This is why, when the sun shines on those windows in your building, even though there is more blue light coming in and the blue channel still dominates the others, the sky suddenly looks very drab indeed.
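To see the effect numerically (the pixel values below are invented purely for illustration):

```python
def luma(r, g, b):
    """Approximate perceived brightness from RGB values in [0, 1]."""
    return 0.21 * r + 0.72 * g + 0.07 * b

shaded = luma(0.20, 0.30, 0.80)   # blue-dominated sky pixel: looks blue
sunlit = luma(0.55, 0.65, 0.90)   # extra red/green scattered off the glass
```

Even though blue is still the largest channel in the sunlit pixel, the added red and green roughly double the perceived brightness, washing the blue out.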

All of the code used to make these figures is available as an ipython notebook here.

$\endgroup$
2
  • 1
    $\begingroup$ Amazing answer; that's what I was looking for. As a computer science student, I knew I'd have to use colour spaces other than RGB, but I lacked the physics background for understanding perception. Your analysis really interests me, and I'll certainly use this kind of approach on other image-processing problems. Thanks a lot for providing the code! $\endgroup$
    – PhilDenfer
    Commented Aug 12, 2014 at 9:03
  • $\begingroup$ I wonder whether any of this relates to automatic white balance (AWB). I ask this because if you look at the wall below the window at the far end (bottom of the picture), it has a definite (quite strong) blue hue - while in the lower picture it's much more neutral tone. This suggests to me that the picture was "too blue" initially. I'm not discounting the presence of more scattered light, but AWB can really mess with the color "perception" of a digital camera. I would want to see the RAW images before drawing the conclusions you reach. $\endgroup$
    – Floris
    Commented Nov 13, 2014 at 15:45
5
$\begingroup$

There is a favorite question on graduate schools' qualifying exams in physics,

Why is the sky blue?

The answer is Rayleigh scattering. Shorter-wavelength photons are more likely to change direction in the atmosphere, which is why the bluer, shorter-wavelength light is overrepresented in the light coming from random directions of the sky.
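Rayleigh scattering intensity scales as $1/\lambda^4$, so a quick back-of-the-envelope comparison (the wavelengths are round representative numbers, not exact values):

```python
blue_nm, red_nm = 450, 650        # rough wavelengths of blue and red light
ratio = (red_nm / blue_nm) ** 4   # Rayleigh intensity scales as 1 / lambda**4
# Blue light scatters roughly four times as strongly as red
```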

I believe that the same reason "why the sky is blue" is also the reason why "the sky under certain circumstances is not so blue". When the Sun is shining directly on the glass, but not at you, the photons may scatter off the dirt and impurities on the glass. Unlike the air, these impurities change the direction of photons of all wavelengths about equally.

So if the dirt on the glass isn't directly illuminated by the Sun, the photons from generic places in the sky only arrive at your eye if they originated as solar photons scattered by the atmosphere, and those photons tend to be blue. However, if the glass is directly illuminated, you are receiving lots of photons from the Sun that change direction when hitting the dirt on the glass, and these are not shifted to the blue end of the spectrum, because the scattering off the dirt isn't Rayleigh scattering; it is more color-blind. Therefore the color seen in generic directions becomes greyer.

Note that I am pretty much saying that the right question would be the opposite of yours: why the sky is blue in the first situation. The "default state" for light coming from some direction is to be color-neutral, i.e. "grey". This occurs when the photons' directions are changed pretty much independently of their color or wavelength, a condition that is obeyed when scattering off the impurities on the glass is the dominant source of light from that direction. On the other hand, it's "extraordinary" for light not to be color-neutral, i.e. to be blue, and that's why it's still the blue sky, and not the grey sky, that deserves a "special" explanation: the usual one based on Rayleigh scattering. In this sense, the answer to your question is simply that the "special processes" that make the sky unusual, i.e. blue, are not dominant in the second situation.

$\endgroup$
2
    $\begingroup$ Interesting answer. I did read about Rayleigh scattering before your post, but to me it only explained the blue color of the sky as seen through the atmosphere. I do not fully understand the rest, even though I see what you mean by "the scattering off the dirt isn't Rayleigh scattering; it is more color blind". But I think your conclusion is what is verified in alemi's answer, right? Blue color, then a colorblind filter, so there is relatively less blue, which appears gray because our perception tricks us. If that is what you meant, then I still have difficulty understanding why the filter is stronger when sunlit. $\endgroup$
    – PhilDenfer
    Commented Aug 12, 2014 at 9:12
    $\begingroup$ Hi, I don't think it's right to talk about a "filter". The grey light you are seeing is not obtained by filtering the blue light from the sky - it wasn't reflected by droplets at random points of the sky first. Instead, the grey light is scattered white light that goes directly from the Sun and changes direction at the dirt on the glass. I tried to point out that your whole way of looking at the question is upside down from a physics viewpoint. The normal color is grey - the abnormal color that needs an extra explanation is blue. $\endgroup$ Commented Aug 12, 2014 at 9:31
