
Color filters are often said to be of little use in digital photography, whereas a light pollution filter, i.e. a broad-band filter, has proven useful for reducing light pollution in an image.

However, a color filter also reduces the intensity of light in a certain range of the spectrum, which is why it has a color in the first place. Post-processing works differently: Photoshop shifts the RGB channels according to their recorded intensities, not with respect to wavelength. So there may be a real difference between using a color filter and mimicking its effect in Photoshop.

Could you give some examples, or share personal experience, of how a color filter or a light pollution filter on a DSLR might work differently from post-processing the image in Photoshop?


4 Answers


No filter shifts any wavelength of light. Filters attenuate some wavelengths more than others. Filters which attenuate along a curve with a complex shape can often be useful because they save a lot of time in postprocessing, where doing complex color corrections by hand can be rather tedious.
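
As a concrete illustration of "attenuate, don't shift", here is a minimal Python sketch (not from this answer; the flat spectrum and the transmission curve are invented purely for illustration). A filter multiplies each wavelength by its own transmission factor and never moves energy from one wavelength to another:

```python
import numpy as np

# Wavelength grid across the visible range, in nanometres.
wavelengths = np.arange(400, 701, 10)

# Made-up incoming spectral power distribution (flat "white" light here).
incoming = np.ones_like(wavelengths, dtype=float)

# Made-up transmission curve for a warming filter: passes long (red)
# wavelengths almost fully, attenuates short (blue) wavelengths strongly.
transmission = np.clip((wavelengths - 400) / 300, 0.2, 0.95)

# A filter never moves energy between wavelengths; it only scales each one.
transmitted = incoming * transmission

for wl, before, after in zip(wavelengths[::6], incoming[::6], transmitted[::6]):
    print(f"{wl} nm: {before:.2f} -> {after:.2f}")
```

A channel adjustment in post, by contrast, can only rescale the three numbers the sensor has already integrated from whatever spectrum reached it.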

Some filters can't be simulated in postprocessing. Polarizers, for instance, determine which light is allowed through and which is not based on the polarization of each photon. Since digital cameras only record the energy of each photon striking the sensor, a digital image file contains no information about a photon's polarization before it struck the sensor, so one cannot recreate the effect of a polarizer after the image is captured. Likewise, monochrome sensors don't record color information, so color filters must be used at the time of capture if one wishes to darken one color in the scene relative to another.

Light pollution filters are useful when shooting the night sky because they significantly reduce the light at the wavelengths that most commonly make up light pollution. This allows one to increase exposure, via a wider aperture or, more often, a longer exposure time, and capture more of the light the filter does not heavily attenuate. Because the filter acts before the light hits the sensor, the light you do want can be exposed more fully without the unwanted wavelengths blowing out or dominating the image.
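
To see why attenuating the polluted wavelengths before the sensor buys exposure, here is a toy back-of-the-envelope sketch; the full-well value, the signal rates, and the 5% filter transmission are all invented numbers, not measurements:

```python
# Toy numbers for a single pixel: how long can we expose before it clips?
FULL_WELL = 50_000      # electrons before the pixel saturates (assumed)
target_rate = 5         # electrons/s from the faint object (assumed)
sodium_rate = 500       # electrons/s from sodium streetlight glow (assumed)
filter_cut = 0.05       # filter passes only 5% of the sodium line (assumed)
                        # and, for simplicity, 100% of the target's light

def max_exposure(pollution_rate):
    """Longest exposure before the combined signal saturates the pixel."""
    return FULL_WELL / (target_rate + pollution_rate)

for label, rate in [("unfiltered", sodium_rate),
                    ("filtered  ", sodium_rate * filter_cut)]:
    t = max_exposure(rate)
    print(f"{label}: {t:7.0f} s exposure, {target_rate * t:7.0f} e- of target signal")
```

With these made-up numbers the filtered exposure can run more than ten times longer before the sodium glow saturates the pixel, collecting correspondingly more of the light you actually want.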

For related questions here at Photography SE, please see:

Does black and white film have any advantage over black and white effects in digital?
Why is it that when the green channel clips, it turns into blue?
Red LED not red but white after I take the photo
Are there reasons to use colour filters with digital cameras?

  • Yes, I read that light pollution filters are constructed from multiple layers of coatings with particular refractive indices, crystals (not sure what the crystals do), and linear polarizers, so their absorption is wavelength-specific. I'm wondering whether color filters are constructed in a similar way, so that their behavior is nonlinear and not replicable through digital processing, or whether they are just colored glass with ions or plastic with pigments, in which case their effect would not be necessary. This can make a difference with color-correction filters in a stadium. Commented Jul 6, 2023 at 18:55
  • @ShoutOutAndCalculate: you are missing the point. Sodium lights used to be common for street lighting in many areas. The strong yellow they produce shows up in the red and green (and probably blue) channels. If you use a narrow filter that eliminates the sodium lines, you can see the red and green of the object you are interested in. This filter is linear in the sense that it reduces each wavelength by an amount that depends only on the wavelength. The important thing is that it cuts the light you don't want more than the light you want. Commented Jul 10, 2023 at 3:43
  • @RossMillikan There are still many sodium streetlights in various locales, just not as many as there used to be. But there are still plenty out there.
    – Michael C
    Commented Jul 10, 2023 at 5:14
  • @MichaelC Also metal halide, mercury, sodium, and fluorescent lamps, which cover most indoor and outdoor sports stadiums. What's more (and this is largely ignored, and why it has become more of a concern), LEDs also emit mostly narrow-band spectra, and in commercial use they are commonly tuned to specific color "temperatures", often combining Daylight (5600 K) and Soft White (2700 K), which is very different from an incandescent bulb that emits a broad thermal spectrum. I suspect a combination of optical filters might be useful. Commented Jul 10, 2023 at 10:56

Yes, DSLRs' physical color and light pollution filters can achieve things that can't be done in post.

When it comes to color filters, they play a fascinating role in photography. These filters can selectively absorb specific wavelengths of light while allowing others to pass through. As a result, they have a direct impact on the overall color balance of an image. For instance, a red filter will absorb blue and green light, letting only red light reach the sensor. Consequently, the image will exhibit a prominent reddish tone, which wouldn't be possible without the filter.

Post-processing techniques offer a wide range of adjustments, including color balance, but they cannot replicate the exact effects of a physical color filter. The reason lies in the fundamental difference between the two. A physical color filter interacts with the light before it reaches the camera's sensor, modifying its composition. On the other hand, post-processing manipulates the captured light data after the sensor has recorded it.
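
One way to picture that fundamental difference is a toy "metamerism" sketch in Python. The boxcar channel sensitivities and the two line spectra below are idealized inventions, not real sensor data, but they show how two different light sources can record identical RGB triplets: a physical filter in front of the sensor separates them, while no post-processing of identical triplets ever can.

```python
import numpy as np

wl = np.arange(400, 681, 1.0)            # wavelength grid, nm

# Idealised boxcar channel sensitivities (assumed purely for illustration).
def band(lo, hi):
    return ((wl >= lo) & (wl < hi)).astype(float)

R, G, B = band(580, 680), band(480, 580), band(400, 480)

def camera_rgb(spectrum):
    """Integrate a spectrum against the three channel sensitivities."""
    return np.array([(spectrum * R).sum(), (spectrum * G).sum(), (spectrum * B).sum()])

# Two different sources: narrow emission lines at 590 nm and 670 nm with equal
# energy. Both fall entirely inside the red channel's band.
line_590 = np.where(np.abs(wl - 590) < 3, 1.0, 0.0)
line_670 = np.where(np.abs(wl - 670) < 3, 1.0, 0.0)

print("no filter:  ", camera_rgb(line_590), camera_rgb(line_670))   # identical triplets

# A physical long-pass filter that blocks everything below 620 nm separates them:
long_pass = (wl >= 620).astype(float)
print("with filter:", camera_rgb(line_590 * long_pass), camera_rgb(line_670 * long_pass))

# Software only ever sees the recorded triplets. Any post adjustment is a
# function of RGB alone, so equal inputs must give equal outputs: the two
# sources can never be told apart after capture.
```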

Turning our attention to light pollution filters, they aim to counteract the adverse impact of artificial light sources, such as streetlights. By absorbing specific light wavelengths emitted by these sources, light pollution filters facilitate the visibility of faint celestial objects in the night sky, including stars and galaxies.

While post-processing techniques can help reduce the impact of light pollution through methods like gradient removal, this process tends to be time-consuming and meticulous. In contrast, a dedicated light pollution filter offers the advantage of automatic filtration, effectively eliminating unwanted light pollution. Additionally, it can enhance the overall contrast and clarity of the resulting image.

Here are some examples of how color filters and light pollution filters can work differently than post-processing:

Color filters can be used to create artistic effects. For example, a red filter can be used to create a more dramatic sunset, or a blue filter can be used to create a more surreal landscape. You can create similar effects in post, but controlling the results will be more difficult.

Light pollution filters can improve the visibility of faint objects in the night sky. For example, a light pollution filter can make it easier to see the Milky Way or to capture the details of a nebula. In post, you can reduce the amount of light pollution in an image, but you may not be able to recover as many faint objects.

In my personal experience, color and light pollution filters can be very effective in achieving certain effects that are difficult or impossible to recreate in post. However, it is important to experiment with different filters and settings to find what works best for you.

  • This reads more like an article than an answer. I'm sure it's just personal style, but there's a risk people might think a Large Language Model played a significant part in its composition. Commented Jul 12, 2023 at 7:47
  • "For instance, a red filter will absorb blue and green light, letting only red light reach the sensor." No, a red filter will let less green and blue light through than red light. When we use red filters with B&W film, all objects that are not red do not appear pure black; they're just darker shades of gray than they otherwise would be.
    – Michael C
    Commented Jul 15, 2023 at 2:49

Adjusting colors in Photoshop is not a lossless operation. Compared to analog (glass, during capture), digital alteration permits (though doesn't always guarantee) nastiness like banding, stretching, compression, posterization, noise, blowout, and softness. The more dramatic the in-post effect, the more information is lost. This is especially true when shooting JPEG or non-raw video.

All of those image quality issues are avoided if you nail the capture to reflect the desired output as closely as is reasonable. Glass simply leaves you more image information to tweak and manipulate later, by avoiding lossy operations just to get to your creative starting point in post. It's not only when overall exposure hits a wall that degradation occurs; it happens whenever any sub-channel (typically R, G, or B) runs out of room at the left or right of the histogram, and adjusting colors in post needlessly moves those channels around.
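
As a small illustration of that posterization and clipping, here is a sketch applying a hypothetical 2.5x channel gain to an 8-bit ramp; the gain is invented, but the mechanism is the one described above:

```python
import numpy as np

# An 8-bit channel, e.g. the red channel of a JPEG: 256 possible levels.
channel = np.arange(256, dtype=np.uint8)

# A strong colour correction applied in post: a 2.5x gain on this channel.
gain = 2.5
adjusted = np.clip(channel.astype(float) * gain, 0, 255).astype(np.uint8)

print("distinct levels before:", np.unique(channel).size)      # 256
print("distinct levels after: ", np.unique(adjusted).size)     # far fewer, with gaps (banding)
print("values pinned at 255:  ", int((adjusted == 255).sum())) # highlights blown out
```

Had the color balance been closer to the desired result at capture, all 256 levels would still be intact and nothing would be pinned at the top.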


Within the dynamic range of the sensor there is usually minimal difference.

But badly aligned channels can leave you clipping the highlights on one channel while clipping the shadows on another, so no "right" exposure is possible. Because color filters alter the values before clipping occurs, they can fix this.

There are also edge cases where a color filter has a response you can't replicate with bit-twiddling; I'm thinking of some deep blue filters that mostly block red but start to pass again right before crossing into IR. There is no reliable way to distinguish single-wavelength (e.g. LED) 625 nm light from 725 nm light in RGB. Yes, the green channel response may be a little different, but it is hard to manufacture a clean, sharp cutoff out of that "little difference", especially in the presence of significant noise.
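
A toy model of that last point, with invented Gaussian channel sensitivities (real sensor curves and the IR-cut filter behave differently, but the shape of the argument is the same):

```python
import numpy as np

# Invented Gaussian channel sensitivities for a toy sensor.
def channel(wavelength, centre, width):
    return np.exp(-((wavelength - centre) / width) ** 2)

def toy_rgb(wavelength):
    """Raw toy-sensor response to a unit-intensity single-wavelength source."""
    return np.array([channel(wavelength, 600, 70),   # red
                     channel(wavelength, 530, 50),   # green
                     channel(wavelength, 460, 50)])  # blue

for wl in (625, 725):
    rgb = toy_rgb(wl)
    hue = rgb / rgb.max()          # brightness-independent colour
    print(f"{wl} nm: normalised RGB = {np.round(hue, 3)}")

# Both come out as essentially (1, ~0, 0): a pure red. The 725 nm source is
# dimmer in absolute terms, but a dimmer 625 nm LED would look much the same,
# so the recorded triplet cannot pin down the wavelength. Only a filter with
# different transmission at 625 nm and 725 nm, placed in front of the sensor,
# can separate the two.
```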
