31
$\begingroup$

Someone just retweeted a NASA tweet onto my timeline, and it includes two images, allegedly of the same dying star, taken by the new space telescope and shown side by side:

https://www.nasa.gov/image-feature/goddard/2022/nasa-s-webb-captures-dying-star-s-final-performance-in-fine-detail

I don't quite understand what I'm looking at, though. If I understood correctly, these are two images of the same star, but the two have different colour schemes. I know stars can change 'colour' depending on their type and stage of life (blue dwarf, red dwarf, and so on), but I doubt that's what I'm looking at, as the telescope is relatively new and, from what I understand, those life cycles take ages.

The other option that comes to mind is some kind of artistic freedom, like when artists make images of, say, dinosaurs and guess their colours. But that seems a bit too unscientific for NASA, so I'm expecting there to be a good reason for the difference in colour here.

If this is the same star, why is the picture on the left blue with red, and the one on the right red with blue?

$\endgroup$
2
  • 12
    $\begingroup$ The one on the left is from Webb’s Near-Infrared Camera (NIRCam), while the one on the right is from Webb’s Mid-Infrared Instrument (MIRI). You can find more information about the pictures in this helpful article. $\endgroup$ Commented Jul 12, 2022 at 15:46
  • $\begingroup$ Where is your own research? Was it not obvious that both pictures resulted from translating data outside the spectrum visible to humans? In the unlikely event that NASA didn't, most of Earth's observatories did post at least scientific, if not lay, explanations. $\endgroup$ Commented Jul 15, 2022 at 1:00

2 Answers

58
$\begingroup$

They are two pictures of the same object: the Southern Ring Nebula. They look different because we are looking at different wavelengths. The picture on the left is near infrared (roughly the 0.7–5 $\mu$m range), while the one on the right is mid infrared (JWST is sensitive up to 30 $\mu$m).

Both kinds of light are impossible to see with our eyes, so by definition they don't have a color: there is no such color as infrared. So how do we display these images on an RGB monitor? Essentially, we are free to choose whatever colors we want. Scientists usually choose them so that the image is (i) clear to read, with the important features highlighted, and (ii) pleasant to the eye.

In this case (but this is not a rule), the longest wavelengths have been displayed in red and the shortest in blue, mimicking the fact that in the visible spectrum red has the longest wavelength and blue/violet the shortest.
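As a minimal sketch of that kind of channel assignment (assuming three registered single-band images as NumPy arrays; this is an illustration, not NASA's actual pipeline):

```python
import numpy as np

def false_color(short_band, mid_band, long_band):
    """Build an RGB composite from three registered single-band images:
    longest wavelengths -> red channel, shortest -> blue channel."""
    def normalize(img):
        # Scale each band independently to the 0..1 display range.
        img = img.astype(float)
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)
    # Stack as (height, width, 3): R = long, G = mid, B = short.
    return np.dstack([normalize(long_band),
                      normalize(mid_band),
                      normalize(short_band)])
```

Which band goes to which channel is exactly the free choice described above; nothing in the data forces red to mean "longest".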

This color coding is useful because a scientist can tell at a glance which regions of the nebula are emitting the longest and the shortest wavelengths. If one also knows which processes emit which kind of light, this gives the picture a clear and immediate meaning. For instance, you can tell by looking at the left picture that the central part is mainly ionized gas (blue light, shortest wavelengths), while the outer region is dust and molecular hydrogen (longer wavelengths).

In the mid-infrared image the colors are reversed, because ionized gas emits more strongly in the red (longest-wavelength) part of the mid infrared, so the central region is red, while in the outer region we see hydrocarbon grains that emit at the shortest mid-infrared wavelengths.

Source: NASA's live coverage of the publication of the first images of JWST

$\endgroup$
2
  • 2
    $\begingroup$ Is the longer wavelength the reason why the right picture appears much less sharp than the left one? $\endgroup$ Commented Jul 14, 2022 at 14:08
  • 4
    $\begingroup$ @EricDuminil: An interesting question. They're from different instruments, so at a glance the obvious answer would be "probably not". But the instruments are on the same spacecraft and (AIUI) share common optics and size constraints, which might force them to have approximately the same numerical aperture. Assuming that they're both also diffraction-limited (because space telescopes typically are), that could make the answer essentially "yes"; see the worked numbers after this comment. $\endgroup$ Commented Jul 15, 2022 at 8:41
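A quick back-of-the-envelope check of that last comment, assuming both instruments are diffraction-limited on JWST's 6.5 m primary mirror (the representative wavelengths are assumptions for illustration, not the exact filter centres used for these images):

```python
def diffraction_limit_arcsec(wavelength_m, aperture_m):
    # Rayleigh criterion: theta = 1.22 * lambda / D (radians),
    # converted to arcseconds (1 rad ~ 206265 arcsec).
    return 1.22 * wavelength_m / aperture_m * 206265

D = 6.5  # JWST primary mirror diameter in metres
for band, lam in [("NIRCam ~2 um", 2e-6), ("MIRI ~10 um", 10e-6)]:
    print(f"{band}: {diffraction_limit_arcsec(lam, D):.3f} arcsec")
# NIRCam ~2 um: 0.077 arcsec
# MIRI ~10 um: 0.387 arcsec
```

At ~10 μm the diffraction limit is roughly five times coarser than at ~2 μm, which is consistent with the MIRI image looking softer.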
12
$\begingroup$

The JWST operates in the infrared spectrum, which is invisible to the human eye. Infrared has longer wavelengths than visible light.

To make visible images, we map the infrared data into the visible spectrum.
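As a rough sketch, one common way to prepare a single band for display is an asinh stretch, which compresses the huge dynamic range of astronomical data (the softening value here is an arbitrary assumption, and this is not necessarily the exact processing used for these images):

```python
import numpy as np

def asinh_stretch(band, softening=0.1):
    """Compress a single-band image into the 0..1 display range
    using an inverse-hyperbolic-sine stretch."""
    band = band.astype(float)
    band = (band - band.min()) / (band.max() - band.min() + 1e-12)
    return np.arcsinh(band / softening) / np.arcsinh(1.0 / softening)
```

Each stretched band can then be assigned to a red, green, or blue channel to build the final composite.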

For the image above, the image on the left is from NIRCam (capturing wavelengths from 0.6 to 5 microns), and the one on the right is from MIRI (capturing wavelengths from 5 to 30 microns). Credit for the figure below to NASA:

[NASA figure: wavelength coverage of JWST's NIRCam and MIRI instruments]

So there is actually little overlap in the EM spectrum between the two images. Exactly why the colors are opposite in the two images has to do with the chemical composition, temperature, relative velocity, and density of the cloud; these factors determine the wavelengths of the photons emitted. From Monreal-Ibero and Walsh (2020):

NGC 3132 is known to have strong near-infrared (vibrationally excited) molecular hydrogen (Storey 1984) and CO (Sahai et al. 1990) emission, both peaked on the shell where [O I] is strong. In addition, mid-infrared rotational H2 lines have been detected (Mata et al. 2016) as well as CO (J = 3–2) emission (Guzman-Ramirez et al. 2018).

$\endgroup$
5
  • 10
    $\begingroup$ The correct term is "false color" rather than "pseudo color". Pseudo color is something different. $\endgroup$ Commented Jul 13, 2022 at 0:03
  • 1
    $\begingroup$ To elaborate on the above comment, pseudo color applies coloring to a single "band", where "band" can mean anything, such as the intensity of electromagnetic radiation from a sensor that does not distinguish between wavelengths. It can also be applied to altitude, or to pressure at a given altitude. Pseudocolor can be applied to any scalar function of two variables, such as z as a function of x and y position. Other visualization options for such scalar functions include grayscale images and contour plots. $\endgroup$ Commented Jul 14, 2022 at 12:32
  • 1
    $\begingroup$ False color, on the other hand, maps three different functions $z_1(x,y)$, $z_2(x,y)$, and $z_3(x,y)$ to the red, green, and blue grayscale images that form an RGB image, typically with the $z$ values scaled to be integers between 0 and 255. The functions $z_1$, $z_2$, and $z_3$ might (for example) be grayscale images from three different infrared wavelengths. We humans can't see infrared, but we can see red, green, and blue, so mapping those wavelengths to something the eye can see makes the imagery visible to mere humans (a sketch contrasting the two terms follows these comments). $\endgroup$ Commented Jul 14, 2022 at 12:35
  • 1
    $\begingroup$ A third option is "true color", although even that's a bit dubious. Professional photographers sometimes wait for days at a location for just the right lighting conditions so they can take the perfect photo. You may remember the huge viral argument on the internet, seven years or so ago, regarding the color of a dress. "True color" is hard to achieve, and it is of course impossible for infrared imagery, which humans cannot see at all. $\endgroup$ Commented Jul 14, 2022 at 12:53
  • 2
    $\begingroup$ @DavidHammen thanks for the clarification, edited. $\endgroup$
    – Connor Garcia
    Commented Jul 14, 2022 at 15:42
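A minimal sketch contrasting the two terms from the comments above, using random arrays as stand-ins for registered single-wavelength images:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
band1, band2, band3 = (rng.random((64, 64)) for _ in range(3))

# Pseudocolor: ONE scalar band pushed through a colormap.
plt.subplot(1, 2, 1)
plt.imshow(band1, cmap="inferno")
plt.title("pseudocolor (one band)")

# False color: THREE bands assigned to the R, G, B channels.
plt.subplot(1, 2, 2)
plt.imshow(np.dstack([band1, band2, band3]))
plt.title("false color (three bands)")
plt.show()
```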
