
I read and watched several articles and videos about how to photograph the moon, and a lot of them use ND filters without really explaining why. What is the point of using an ND filter and then choosing a longer shutter speed? Could I not simply skip the filter and use a faster shutter speed, as long as the sensor is exposed to the same amount of light?

Does the filter do something more that I'm missing? AFAIK it just darkens the scene so that longer shutter times can be used, e.g. to get a nice flowing-water effect in daylight.

  • Could you link to any of these tutorials? It's also the first I've heard of using an ND filter to photograph the moon - seems counterintuitive to me.
    – osullic, May 21 at 11:39
  • On the topic of moon photography, there's actually some pretty good information in this question: Why do moon shots bring out the worst in telephoto lenses?
    – osullic, May 21 at 11:43
  • @osullic agenaastro.com/articles/product-types/… for example. I fail to see how this would physically work: "ND filters preserve the subtle lunar colors while making it easier and more comfortable to detect low-contrast details." Thanks for the article link, will read it later. That one didn't show up on the site search!
    – Tom, May 21 at 12:42
  • On first glance, that link seems to be more focused on viewing the moon through a telescope. In that case an ND filter makes a bit more sense because the moon is lit by the sun and reflects a lot of light itself. The ND filter reduces this brightness and glare, similar to using sunglasses to view a daytime scene. However, when photographing the moon, the filter isn't necessary - as you say, the camera's exposure duration can handle the moon's brightness easily.
    – osullic, May 21 at 13:02
  • Using an ND filter on a telescope makes sense. A friend reduces the aperture on his 280 mm (11 inch) diameter telescope to view the moon, as the brightness is overwhelming without stopping down.
    – qrk, May 21 at 14:13

4 Answers


Summary

In my opinion there is absolutely no point in using an ND filter for moon photography. The tutorial you mention uses the ND filter to reduce glare for the human eye: by darkening the view, just as sunglasses do, you perceive a bit more of the dynamic range.

Since, as you said, you can simply use a faster shutter speed on a camera, there is no use for this in photography.

Some scenarios for an ND filter

The idea with an ND filter is to be able to use a longer shutter speed. This can be useful in a number of scenarios:

  • You want to use strobes in bright daylight with an open aperture. This would force your camera to use a shutter speed shorter than your flash sync speed. To pull this off without resorting to other options like High Speed Sync (HSS), you can bring down the ambient exposure with an ND filter.
  • You want to create a dramatic or dreamy atmosphere with a clouded sky or moving water. With an ND filter you can prolong the exposure so that the water or clouds move and become softer.
  • You have things moving through your frame at relatively high speed and don't want them to show up in your image. If you use an ND filter to prolong the exposure long enough, moving things might appear only as a ghostly haze, or not show up at all.
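For the moving-water scenario, the required filter strength is simple arithmetic: the ratio between the exposure you want and the exposure the light meter gives you. A minimal sketch (the 1/250 s metered exposure and 2 s target are assumed example values, not from the answer):

```python
import math

# Assumed example values: metered daylight exposure vs. the long exposure you want.
metered_s = 1 / 250   # what the camera meters for flowing water in daylight
target_s = 2.0        # long enough to smooth the water

# Each stop of ND halves the light; the density factor is the exposure ratio.
nd_factor = target_s / metered_s
stops_needed = math.log2(nd_factor)
print(f"Need ~{stops_needed:.1f} stops, i.e. roughly an ND{round(nd_factor)} filter")
```

With these numbers that works out to about 9 stops, in the range where strong ND filters (ND400-ND1000) are sold.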

Why an ND filter is a bad idea here

What you usually never want is to prolong your exposure longer than needed when you shoot something that is moving. And if you just want to shoot the moon, possibly with a telephoto lens, prolonging the exposure will result in unsharp images due to the movement of the moon within the frame - which is surprisingly fast.

So just to be able to get a sharp image with an ND filter, you would now need some gear, such as a star tracker, to automatically follow the moon's movement.
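To get a feel for how fast that movement is at the sensor, here is a rough sketch; the 600 mm focal length and 4 µm pixel pitch are assumed numbers for illustration:

```python
import math

# Assumed gear (illustrative numbers, not from the answer):
focal_length_mm = 600.0   # telephoto lens
pixel_pitch_um = 4.0      # typical modern sensor pixel pitch

# The sky drifts past a fixed camera at ~15 degrees per hour,
# i.e. about 15 arcseconds per second of time.
drift_arcsec_per_s = 15.0

# Angular size of one pixel (small-angle approximation), in arcseconds.
pixel_angle_rad = (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)
pixel_angle_arcsec = math.degrees(pixel_angle_rad) * 3600

blur_px_per_s = drift_arcsec_per_s / pixel_angle_arcsec
print(f"One pixel spans {pixel_angle_arcsec:.2f} arcsec")
print(f"The moon smears across ~{blur_px_per_s:.1f} px per second of exposure")
```

With these assumed numbers, even a one-second exposure smears the moon across roughly eleven pixels - hence the need for tracking once an ND filter stretches the exposure.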

What an ND filter does not do

Note that just using an ND filter does not magically enhance anything. The filter attenuates all levels of brightness the same way, so there is no added quality whatsoever. It also does not change the dynamic range of your camera (how much detail fits in the range from the brightest to the darkest parts). Quite the opposite: if the filter is not top grade, it might even add a color cast to the image, or introduce reflections, flares, etc.

Other types of ND filters

There are graduated ND filters that have a gradient in filter density, which can be used if you have a brighter part of the image and want to balance it with a darker part. I cannot imagine any use for them when photographing the moon alone; however, one could be used to dim the moon to give more detail to the landscape below it.

  • @joojaa Yes the moon does move quickly when you look at it through a telephoto lens. The first time I took some moon shots (with just a 300 mm lens), I was surprised just how quickly it moves.
    – John, May 22 at 8:19
  • @joojaa Atmospheric seeing also reduces the sharpness, so the shorter the exposure time the more detail you can recover.
    – Davidmh, May 22 at 11:01
  • @joojaa Trying to do a multiple-exposure HDR of the moon through bare tree branches in winter fails because the moon moves enough while adjusting the exposure. That's seconds. Simplified calculation (order of magnitude, because the moon is also going round the earth): the moon is about 0.25° across, and the earth rotates 360° in 24 hours, or 15° (60 moon widths) per hour. So the moon moves its own width in the sky in a minute, or a good few pixels per second with a longish lens.
    – Chris H, May 22 at 13:09
  • @ChrisH The moon is 29′20″–34′6″, or about 0.5°, across, not 0.25°. At 15°/hr = 15 arcmin/min, it moves its own width in 30 seconds.
    – scottbb, May 22 at 16:48
  • @scottbb I trusted that figure because it came from a NASA page - but Google's excerpt on a search for the angular size of the moon gives the radius without saying so. But then if it's 30 arcmin wide and moving at 15 arcmin/min, that makes 2 minutes to travel its own width. Still, while my numerical conclusion was wrong, my practical one still holds.
    – Chris H, May 22 at 18:59
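The corrected arithmetic in this comment thread can be checked in a couple of lines:

```python
# Figures from the comments above: the moon is ~30 arcmin (0.5 deg) across,
# and the sky drifts 360 deg / 24 h = 15 deg/h = 15 arcmin per minute.
moon_width_arcmin = 30.0
sky_drift_arcmin_per_min = 15.0

minutes_to_cross_own_width = moon_width_arcmin / sky_drift_arcmin_per_min
print(minutes_to_cross_own_width)  # 2.0 -- the moon travels its own width in 2 minutes
```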

With DSLR cameras, there is no need to use any ND filter when photographing the moon.

The article you linked to was talking about viewing the moon with a telescope. Viewing telescopes don't have variable apertures or shutters; they are meant to collect as much light as possible. The only filters that make sense on telescopes are spectral filters, which block out most visible light in order to isolate certain emission lines (such as H-alpha filters when observing the sun).

When viewing the moon optically, its exposure value is about EV 12–15 (depending on phase). This is similar to photographing a scene on a bright sunny day, maybe with light cloud cover, or a subject in light shadow. That is to say, photographically, there's a lot of light.
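Those exposure values translate directly into camera settings via the standard relation EV = log2(N²/t) at ISO 100. A quick sketch (f/11 and EV 14 are assumed example values within the range above):

```python
def shutter_time(ev: float, f_number: float) -> float:
    """Solve EV = log2(N^2 / t) for the exposure time t, at ISO 100."""
    return f_number ** 2 / 2 ** ev

# A bright moon at roughly EV 14 (ISO 100), shot at f/11 -- assumed settings:
t = shutter_time(14, 11)
print(f"1/{round(1 / t)} s")  # prints "1/135 s"
```

That lands close to the "Looney 11" rule of thumb (f/11, shutter = 1/ISO), and comfortably within hand-holdable territory - no ND filter required.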

But the big difference when viewing the moon is that it's contrasted against a very dark sky, so the moon appears almost blown out or overexposed compared to its immediate surroundings. That's why an ND filter is useful when optically viewing the moon: it keeps the brightest parts from overwhelming the eye and increases the apparent contrast.

The ND filter is useless photographically, because there's no depth of field composition penalty to stopping down the lens's aperture (up until the point of diffraction-limited aperture), and speeding up the shutter speed to reduce exposure always helps maintain sharpness when photographing a (somewhat) slowly moving object.

Additionally, every additional piece of glass in the optical stack has an impact on image quality or sharpness. Flat glass (such as the ND filter) is especially problematic because it helps propagate reflections. This is mitigated by using the highest-quality multi-coated filter glass, but the problem is impossible to entirely eliminate. Why add glass when it's completely unnecessary, and the only benefit it adds (darkening the scene) can be achieved even better with in-camera controls?

  • Of course, on a DSLR you can also turn down the sensor's gain ("ISO"); on a film camera you'd have to load slower film to access the equivalent parameter. It's worth setting the ISO manually and bracketing the exposure, as metering algorithms aren't great with such high contrast. With a wide lens (think night landscape, though those are hard to get right anyway), depending on your camera model, the moon may even be smaller than the spot in spot metering. Moon photography is a good exercise when the days are short, and a clear winter night can have better seeing due to the low humidity.
    – Chris H, May 23 at 7:56
  • The best way to reduce the brightness of a telescope is not a glass ND filter; it's a cover that looks like a paint-can lid with a smaller hole in it. For reflector telescopes the hole is offset to avoid the secondary mirror in the center of the tube, and the light passing through the hole has no additional optics to cross. For refractor telescopes, the hole should be in the center, where the optics are usually sharpest.
    – Michael C, May 23 at 8:42
  • @MichaelC Absolutely, using a telescope aperture cover is generally the simplest and best. But even then, say with a 12" Dobsonian, and depending on the mount, user's height, etc., fitting an ND on the eyepiece may be a lot simpler an affair.
    – scottbb, May 23 at 15:09
  • You can make an aperture cover out of cardboard. It's not rocket science.
    – Michael C, May 30 at 14:08
  • @MichaelC It's nothing to do with rocket science. It's the difficulty of covering the aperture of a big lens without bumping it out of alignment that I'm talking about. That's all. Different cases for different situations. I never said an ND was the best option; I answered the question about how and when ND filters are used in astronomy. Have a good day.
    – scottbb, May 30 at 22:15

A graduated ND filter would be useful to darken the sky and the moon to improve the visibility of ground objects. It would be especially useful with a long lens, where the moon is a big object in the frame and preserving its detail matters.

A standard ND is less obviously useful. If there are moving clouds in the sky (or anything else moving not too quickly), you can use an ND filter to prolong the exposure and smudge them.


With a VERY strong graduated ND with a curved profile, you could dim the sunlit bright side of the moon enough to take a picture in which the dark side appears as something other than simply not there.

  • Very strong indeed: you'd need OD5 to get about the same brightness, but with a really sharp edge, and that will give diffraction effects. Instead of an ND filter, with a 14-bit sensor or better you could image repeatedly such that the bright parts don't quite saturate in any image, then sum the images (aligning for apparent movement). But a 12-bit sensor doesn't quite have the dynamic range (a pixel would read zero in the area lit by earthshine while a pixel looking at the bright side saturated).
    – Chris H, May 23 at 7:54
  • @ChrisH While it would seem that a lower bit depth makes it impossible to pull out things darker than one count, sensors have noise (and a bias to prevent pixels from reading zero), so stacking images lets you pull out deeper dynamic range, given a sufficient number of exposures.
    – Topcode, May 24 at 3:53
  • @Topcode I'm familiar with summing images off-chip for more dynamic range; I do it for near-infrared imaging at work. Doing so (by relying on probabilistic noise) when the weak signal is unlikely to register a single count while the strong one is close to saturation is pushing it too far for quality images, and marginal even for showing that something is there. That's why I say it would work with a 14-bit sensor but not a 12-bit one; the contrast is about 12.3 bits. You'll see variation in the dark region, but not well enough to trust.
    – Chris H, May 24 at 5:43
  • Admittedly I've only used up to about 100 images, but that's about another 6.5 bits. Great for stretching the dynamic range when your weak spots have a value of about 1–2 over the DC background. Sensor linearity is also an issue at the very bottom of the range (far worse on InGaAs sensors than Si).
    – Chris H, May 24 at 5:48
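The bit-depth arithmetic in this thread can be sketched numerically. This is the idealised best case: summing n frames, with enough noise to dither the quantisation, extends the usable range by roughly log2(n) bits:

```python
import math

# Figures quoted in the comments above:
scene_contrast_bits = 12.3   # bright limb vs. earthshine-lit side
sensor_bits = 12             # assumed 12-bit ADC
n_frames = 100               # the stack size mentioned above

# Idealised gain from summing n frames off-chip: ~log2(n) extra bits.
gained_bits = math.log2(n_frames)
print(f"~{gained_bits:.1f} extra bits from {n_frames} frames")
print(f"Stacked: {sensor_bits + gained_bits:.1f} bits vs. scene contrast {scene_contrast_bits} bits")
```

This reproduces the "about another 6.5 bits" figure, and also shows why it remains marginal: the gain only materialises if the weak signal registers at least occasional counts above the noise floor.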
