
I am looking to purchase my first hydrogen-alpha filter to use for daytime solar (Sun) photography, as I recently purchased a SkyGuider Pro. However, after several rounds of research I am at a loss as to what exactly I am looking for and, consequently, how to use it. Here is my confusion:

  1. In searching for hydrogen-alpha filters online, I come across several variants whose prices vary dramatically (from under $100 to over $10,000). Some products are just simple lenses and some look like computerized components. This leaves me puzzled as to what exactly I am looking for, and needless to say anything costing over $1,000 is out of the question for my budget.

  2. Additionally, I am unsure whether a hydrogen-alpha filter is supposed to be used in conjunction with a regular solar filter, or whether the hydrogen-alpha filter alone is enough (I suspect it is not, but I am not sure).

If it helps, the equipment I plan on using is:

  1. A Canon EOS Rebel T7i
  2. A Vivitar 500mm lens (possibly with a teleconverter, though I haven't fully decided on that yet)

I already have a range of solar filters for my various lenses. The left half of the image below is what I hope to achieve, if possible.

Can someone please point me to a guide, or in the right direction, so that I purchase the right thing without it being cost-prohibitive?

Thank you.

Hydrogen Alpha Filter with Sun


1 Answer

Ha Solar Telescopes

While hydrogen-alpha (Ha) solar telescopes, such as those from Lunt Solar Systems or the Meade Coronado line, do filter out all wavelengths except Ha, they also include a substantial energy rejection filter because the Sun's energy is overwhelming.

Normal Ha Filters

The Ha filters sold for astronomy (just a filter ... no energy rejection) are not meant for looking at the Sun. Those are narrowband filters designed to enhance imaging of objects that glow in Ha, such as emission nebulae (the Horsehead, Lagoon, and Rosette nebulae), and they can also be used when imaging galaxies to bring out the nebulae within those distant galaxies. These filters are meant for imaging deep-sky objects at night. DO NOT use these filters for viewing the Sun.

Adapting a Camera for Ha Solar Imaging

Daystar Filters not only makes Ha telescopes, they also make gear that fits onto telescopes to make them suitable for Ha work ... including a product line they call the Quark.

In particular, they make something called the Daystar Camera Quark. This fits between the lens and the camera body, allowing only Ha wavelengths to pass (and correcting for the change in back-focus distance).

The Bandpass Problem

The Camera Quark comes in two variants ... a "chromosphere" model and a "prominence" model. The difference is the width of the bandpass. A wider bandpass shows stronger prominences ... but the "surface" (chromosphere) of the Sun will be very bright ... mostly washed out. A narrower bandpass shows more surface detail in the chromosphere ... but the prominences are weaker.

Most high quality Ha solar images are actually a composite image ... one of the chromosphere, the other of the prominences, and they are overlaid. My Ha Solar telescope (a Lunt) has two tuning "etalons". When both are installed I can tune them for great surface detail. But to shoot the proms, I remove one etalon (and replace it with a spacer) and shoot the prominence data ... then merge the two sets of images.

Issues using traditional DSLRs for Ha Solar Imaging

If you want to image using your camera and lens, then use the Daystar Camera Quark. However ... I first attempted to image using my Canon 60Da through the Lunt and found the images taken with the DSLR were not great. Much of this has to do with "seeing" issues (atmospheric instability). To image the Sun, you would normally collect video frames (say ... 30-60 seconds' worth) and then run them through stacking software. It turns out that having non-lossy video frames and a high frame rate really improves the results ... but DSLRs aren't great at either of those.

I was personally frustrated with the results and approached other astrophotographers who do Ha solar imaging ... and they put me on the path to using a dedicated solar imaging camera. The favorite camera for Ha solar work changes every few years, but it tends to be a camera with a small imaging chip, monochrome (not color ... you're only capturing one wavelength so a color camera with a Bayer matrix means you are wasting 75% of your pixels), and the ability to capture uncompressed (non-lossy) frames at high speed (typically the camera would have a global shutter instead of the more traditional rolling shutter).

Other Options

Ultimately I switched to a ZWO ASI174MM camera. This camera is optimized for solar and lunar imaging because it has a "global" shutter. Most digital cameras use a "rolling" shutter: the camera doesn't actually image the entire frame at the same time ... it scans row by row. A "global" shutter exposes and reads out the entire frame at the same time. It is significantly faster (and also a more expensive design). With the global shutter, that camera can shoot at 164 frames per second at full resolution (but it is a small chip). I should note that the camera has no on-board storage ... so it's critical that it be connected to a fast computer via USB 3, and the computer typically needs extremely fast storage. It will collect several gigabytes of data in just a few seconds ... if the computer isn't fast enough, it will have nowhere to put the data and the frame rate will drop.
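
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (it assumes the ASI174MM's 1936 x 1216 sensor resolution and frames stored as 16 bits per pixel; adjust for your actual capture settings):

    # Approximate data rate for full-resolution, full-speed capture
    # (assumes 1936 x 1216 frames stored as 16-bit values)
    width, height = 1936, 1216
    bytes_per_pixel = 2      # 12-bit ADC data is typically padded to 16 bits
    fps = 164                # full-resolution frame rate

    frame_bytes = width * height * bytes_per_pixel
    rate_mb = frame_bytes * fps / 1e6
    print(f"{frame_bytes / 1e6:.1f} MB/frame, {rate_mb:.0f} MB/s sustained")
    # -> ~4.7 MB/frame, ~770 MB/s ... roughly 23 GB for a 30-second capture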

The camera is used in conjunction with a dedicated Ha Solar Telescope. In my case, it's a Lunt 80mm telescope with two tuning etalons (often referred to as a "dual stack" because the etalons are stacked one after the other). The "etalon" is responsible for tuning the specific wavelength of light that can pass through the optical system (and they are very expensive).

Since I can remove one of the etalons (and insert a spacer) it means the scope can act as both a "prominence" and "chromosphere" scope.

The versatility of the scope combined with a high-speed monochrome camera really improved the results.

Eclipse Photography

For eclipse photography, please visit Fred Espenak's website: http://mreclipse.com/ Fred is a retired NASA astrophysicist who did (actually still does ... even though retired) the eclipse predictions. The path data for both the 2017 and 2024 eclipses are his work. He is also possibly the world's foremost expert in eclipse photography, and his website is a wealth of information.

During a total solar eclipse, there are a few moments both before and after totality when the Sun is almost completely eclipsed and you can just see the chromosphere. Here's an example I shot during the 2017 Total Solar Eclipse:

Chromosphere during Eclipse

Note that no filters (Ha or otherwise) were needed to capture this. This is really the only time you can capture a glimpse of the chromosphere without any filter on the camera at all ... and only briefly, in the moments just before and after totality.

To get the timings right, I used special software.

For macOS, there's an app called Solar Eclipse Maestro. http://xjubier.free.fr/en/site_pages/solar_eclipses/Solar_Eclipse_Maestro_Photography_Software.html

On Windows there's an app called Eclipse Orchestrator.
http://www.moonglowtechnologies.com/products/EclipseOrchestrator/index.shtml

Color vs. Mono Camera

It is possible to produce a color image from a monochrome sensor by taking three exposures (often four), each through a different filter: red, green, and blue ... and often a fourth through a luminance filter. A luminance filter allows the full visible spectrum to pass ... but blocks both UV and IR. The luminance channel has the highest detail, and the color channels combine with it to produce a full-color image. Using tools such as Photoshop, you can use the "Channels" view and import an image into each of the corresponding channels. When you view the combined result, you will see a full-color image.

While that works well ... it also means taking more exposures and it only works well for types of photography where the subject isn't moving.
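
For illustration, here is a minimal sketch of that channel-combination step in Python with NumPy. The combine_lrgb helper is hypothetical (not from any particular tool), and it assumes each exposure has already been loaded as an aligned 2-D array scaled to 0..1:

    import numpy as np

    def combine_lrgb(L, R, G, B):
        # Hypothetical helper: L, R, G, B are aligned mono exposures taken
        # through luminance, red, green, and blue filters, scaled to [0, 1].
        rgb = np.stack([R, G, B], axis=-1)
        # Let the color frames set the hue, but rescale each pixel so its
        # brightness comes from the high-detail luminance frame
        brightness = np.maximum(rgb.mean(axis=-1, keepdims=True), 1e-6)
        return np.clip(rgb / brightness * L[..., None], 0.0, 1.0)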

A one-shot color camera instead uses a Color Filter Array (CFA) in front of the sensor. A Bayer mask is the most common type of CFA. This "tiles" the front of the camera sensor with a mosaic of filters ... each covering just one photosite. 50% of the filters are green (human eyes are most sensitive to green) and 25% each are red and blue. A 2-pixel by 2-pixel square of the sensor has one red corner, two green corners, and one blue corner. The most common arrangement for the extreme upper-left corner of the image (the 2x2 box) is RGGB:

RG
GB

Not all cameras use this pattern, but most do. And if you keep repeating the pattern, you'd get:

RGRGRGRG...
GBGBGBGB...
RGRGRGRG...
GBGBGBGB...

etc. (this represents the colors in the upper-left corner of the sensor). To derive color, suppose you want the color for the green pixel at row 3, column 2: it has two blue "neighbors" above and below, and two red neighbors to its left and right. The combined RGB color uses the green value of the pixel itself, the average of its two red neighbors for red, and the average of its two blue neighbors for blue. This only creates a problem for pixels at the edge of the frame (which are missing some neighbors). BUT ... this is why the technical description of your camera lists a certain number of ACTUAL pixels but a slightly smaller number of USABLE pixels: the pixels around the edge exist only to supply their respective color components to the neighboring rows and columns. This allows for a full-color image in just one shot.
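
A simple way to see that neighbor-averaging in action is a bilinear demosaic, sketched below in Python. This is a toy version of what a camera's processor does ... real demosaic algorithms are considerably smarter:

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_rggb(raw):
        # Toy bilinear demosaic of an RGGB mosaic. raw is the 2-D sensor
        # readout as floats; returns an H x W x 3 RGB image.
        h, w = raw.shape
        r = np.zeros((h, w)); r[0::2, 0::2] = 1   # red photosites
        b = np.zeros((h, w)); b[1::2, 1::2] = 1   # blue photosites
        g = 1 - r - b                             # green photosites
        # Each missing color value becomes the weighted average of the
        # nearest photosites that do have that color
        kernel = np.array([[0.25, 0.5, 0.25],
                           [0.5,  1.0, 0.5 ],
                           [0.25, 0.5, 0.25]])
        channels = []
        for mask in (r, g, b):
            vals = convolve(raw * mask, kernel, mode='mirror')
            wts = convolve(mask, kernel, mode='mirror')
            channels.append(vals / wts)
        return np.stack(channels, axis=-1)

(The mode='mirror' edge handling here sidesteps the missing-neighbor problem that real sensors solve with those extra border pixels.)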

If you use such a sensor to perform narrow-band imaging -- such as Hydrogen-alpha solar imaging where the only photons that can pass through the filter are "red" photons with a wavelength at or extremely close to 656.28nm ... then the "blue" and "green" pixels on the sensor don't collect any light. You are effectively imaging with just 1/4 of the sensor. This is why monochrome sensors are preferred for Ha solar imaging.

Monochrome sensors cost more ... because the vendors buy the chips from their suppliers (such as Sony, Kodak, Panasonic, etc.) as one-shot color sensors (they have the Bayer mask already applied). They have to carefully remove the Bayer mask to revert the camera back to monochrome ... and that extra labor is the reason they charge more.

For Ha solar imaging, I strongly prefer monochrome sensors. For other types of deep-sky astrophotography, I usually prefer one-shot color sensors. You can get a filter wheel and use a monochrome camera (with the filters) to obtain enough data to produce color images ... and can even use narrowband filters to enhance specific parts of the spectrum without over-saturating the others. While I do have the equipment to do this, my weather here isn't reliable. Shooting an hour's worth of data on each color channel may mean I run out of time and don't finish collecting data before the clouds move in ... so I opt for one-shot color cameras for my deep-sky astrophotography.

Processing

When acquiring data for Ha solar imaging, getting enough light isn't the problem (it's the Sun ... there's plenty of light). The problem is atmospheric disturbance distorting the image ... it's difficult to get a really "clean" frame because of the atmosphere.

To acquire data, the camera captures the Sun as video frames ... at a very high frame rate. Most of those frames will show distortions, but a small percentage will be clearer. I typically capture about 30 seconds' worth of frames ... at the fastest frame rate the camera can manage. This is sometimes referred to as "lucky imaging" ... and it's best done on days when the "seeing" quality is good: the atmosphere is calm (not near a cold front, a warm front, or the jet stream) and you're in an area with smooth laminar airflow (flat land ... or a large body of water).

I use the ASI174MM, and even though it has been out for a number of years, I'm still having difficulty finding a better sensor for Ha solar imaging. Most CMOS color sensors use a "rolling shutter," where the camera exposes and reads out row by row. This sensor uses a "global shutter," where all rows expose and read out in parallel.

This high-speed "global shutter" increases my "luck" in "lucky imaging".

I capture the data as .SER format files (these are the camera's version of full-resolution RAW video frames ... rather than a compressed video format that doesn't supply full resolution on each frame.) FireCapture is free, popular, and can be used to do this.

The data is stacked using AutoStakkert -- free planetary stacking software that is only available on Windows. As a Mac user, I use virtual-machine software to run Windows on my Intel-based Mac (this won't work on Apple's new "M1" Apple Silicon Macs). The process involves opening the .SER file, 'analyzing' the frames, and telling it to stack only the best frames (frames with better-than-average image quality ... this might be the best 5%, 10% ... or maybe the best 50% if the quality is high enough). I typically tell AutoStakkert to apply sharpening and to 'Resample 2x' (in the advanced options). AutoStakkert also needs to identify alignment points (it can place these on your frames automatically ... basically it's looking for areas of identifiable contrast). The output is the fully stacked and sharpened image ... but still monochrome.
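
The core idea behind that "keep only the best frames" step can be sketched in a few lines of Python. This is a simplification for illustration only: it scores sharpness with the variance of the Laplacian, assumes the .SER frames have already been decoded into arrays, and skips the per-alignment-point registration that AutoStakkert actually performs:

    import numpy as np
    import cv2  # OpenCV; assumes the .SER frames are already decoded

    def stack_best_frames(frames, keep_fraction=0.10):
        # frames: list of 2-D arrays (decoded video frames of the Sun).
        # Score each frame for sharpness; blurrier frames score lower.
        scores = [cv2.Laplacian(f.astype(np.float32), cv2.CV_32F).var()
                  for f in frames]
        n_keep = max(1, int(len(frames) * keep_fraction))
        best = np.argsort(scores)[-n_keep:]   # indices of the sharpest frames
        # Average the keepers to beat down the noise
        return np.mean([frames[i] for i in best], axis=0)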

It is usually a good idea to capture two sets of data ... one set at an exposure that captures good contrast on the disk (the prominences will be very weak), and another at a higher exposure where the prominences are well-exposed and easily recognized ... but the disk is over-exposed. Some careful work in Photoshop can re-combine these images into one. (Often a tiny amount of rescaling is needed ... about 1%.)
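
A very crude stand-in for that Photoshop step, assuming the two stacked images are already aligned and scaled to 0..1 (the threshold value is an arbitrary choice of mine, not a standard setting):

    import numpy as np

    def blend_disk_and_proms(disk_exposure, prom_exposure, threshold=0.9):
        # Where the long exposure has blown out (the solar disk), take the
        # pixel from the shorter, well-exposed disk image; elsewhere keep
        # the long exposure so the faint prominences survive.
        blown_out = prom_exposure >= threshold
        return np.where(blown_out, disk_exposure, prom_exposure)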

After converting the monochrome (single-channel) image to RGB color (three channels), you can go into Photoshop and pull down the strength of the Blue channel and slightly pull down the strength of the Green channel. Green and red make either yellow or orange, depending on how much green vs. red you've blended ... so this produces the yellow/orange Sun that you're accustomed to seeing.
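
That channel adjustment amounts to scaling green and blue down relative to red. A minimal sketch (the scale factors here are arbitrary starting points, not a recipe):

    import numpy as np

    def colorize_ha(mono, green_scale=0.55, blue_scale=0.10):
        # mono: stacked single-channel Ha image scaled to [0, 1].
        # Copy it into the red channel and weaken green and blue,
        # which yields the familiar yellow/orange Sun.
        return np.stack([mono,
                         mono * green_scale,
                         mono * blue_scale], axis=-1)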

Here is a completed image that I shot last fall.

Sun in Ha captured 8-Sep-2021

  • Thank you for this information. This helps clear up my confusion, and also shows there is more to solar photography for me to learn than simply "point and shoot with a filter". My ultimate goal is to master this skill before the US experiences its next solar eclipse.
    – osswmi
    Commented Nov 16, 2020 at 3:51
  • I've added some info on eclipse photography. I imaged the eclipse with both Ha and white-light solar filters (dual scopes) ... but the images of most eclipse phenomena (Baily's beads, the diamond-ring effect, the chromosphere, the bracketed sequences of the corona during totality, etc.) were shot using the "white light" filtered scope (not the Ha scope). I pulled the filter off a couple of seconds before totality and re-attached it just a few seconds after totality. Commented Nov 16, 2020 at 4:07
  • As a follow-up, I have seen a few posts over the past few weeks about UV/IR filters being needed once you exceed 80mm of aperture, and now that I have the chromosphere model, I want to make sure I don't damage it. Do I need to purchase a UV/IR filter if I plan on using that "Vivitar 500mm lens" (or really anything beyond 80mm) that I mentioned in my original post, with either my existing DSLR or the ASI174MM camera you mentioned (which I also plan on purchasing in the future)?
    – osswmi
    Commented Apr 12, 2021 at 13:15
  • I got a response back from someone at DayStar about the additional filters, and they informed me that I will in fact NOT need an additional filter for the Vivitar 500mm lens mentioned in the original post, as the aperture of the lens is less than 80mm (it is just under ~70mm).
    – osswmi
    Commented Apr 12, 2021 at 23:24
  • @osswmi that makes sense. I have a "white light" solar wedge ... a type of diagonal, semi-silvered mirror that goes at the back of a telescope. Most light passes through and gets dumped onto a heat-sink, but a tiny amount is reflected up into the eyepiece. It has a similar rule w.r.t. aperture limits: if the aperture is too large, the scope collects more heat than the wedge can handle, resulting in damage. Commented Apr 14, 2021 at 1:12
