
Consider these two recent JWST images of Jupiter:

[First JWST image of Jupiter]

[Second JWST image of Jupiter]

Europa is shown as the black circle. In the second image, Europa's shadow can be seen near Jupiter's Great Red Spot (which appears white, since the image is taken at infrared wavelengths).

However, there is a large, irregular black "blob" easily noticeable in the first photo, just below the planet's ring. It can also be seen in the second photo, with the same size, position, shape and orientation, although it is less noticeable there.

What is that black blob? Is it a defect?

If it is indeed a defect, does it have anything to do with the telescope being hit by a micrometeoroid in late May 2022? Is it the hole punched in one of the mirror segments? Or is it unrelated to that?

Also, if you click on the second image to see it at full resolution, you can see that it is "peppered" with black spots, and there are even a few white "salt" spots near the bottom. Why?


3 Answers


The black blob comes from a region of dead pixels on one of the NIRCam detectors, A1 (as someone on Reddit correctly identified). STScI displays flat-field images for a number of the detectors; a zoomed-in portion of the A1 flat shows a region, marked in green, with precisely the same shape, but rotated clockwise by 90 degrees. Around the green, you can see a fainter dark red region whose outer boundary appears to correspond to the speckles surrounding the blob in the image:

Detail of flat field image from detector A1
Image credit: NASA/JWST/STScI

A similar defect can be seen in this zoomed-in region of the B2 detector flat field, although it doesn't show up in that Jupiter image:

Detail of flat field image from detector B2
Image credit: NASA/JWST/STScI

You'd typically account for calibration frames (flats, darks, biases, etc.) before doing high-quality image processing: darks and biases are subtracted, while flats are divided out. This removes many artefacts, but you can't get around dead pixels the same way. Instead, you can use a method called dithering: taking multiple images offset from each other by a small angular distance, so that every region of interest is covered by "good" pixels in at least one image.
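This is not the actual JWST calibration pipeline (which is far more sophisticated, e.g. it resamples onto a common sky grid at sub-pixel precision); as a rough NumPy sketch of the idea, with the function names, the integer-pixel dither offsets, and the NaN-masking convention all being illustrative assumptions:

```python
import numpy as np

def flat_correct(raw, flat, bad_mask):
    """Divide a raw frame by its normalized flat field and flag bad pixels.

    raw      -- 2-D science frame (dark/bias assumed already removed)
    flat     -- 2-D flat-field frame (e.g. one taken on the ground)
    bad_mask -- boolean array, True where pixels are known to be dead
    """
    norm_flat = flat / np.median(flat)   # normalize so good pixels are ~1.0
    corrected = raw / norm_flat
    corrected[bad_mask] = np.nan         # dead pixels carry no information
    return corrected

def combine_dithers(frames, offsets):
    """Shift dithered frames back to a common grid and median-combine them.

    A region covered by dead pixels (NaN) in one frame is filled in by the
    "good" pixels of the other, offset frames. Integer-pixel shifts only,
    via np.roll, purely for illustration.
    """
    shifted = [np.roll(f, (-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, offsets)]
    return np.nanmedian(np.stack(shifted), axis=0)

# Demo: a 2x2 "dead blob" is filled in by a second, offset exposure.
flat = np.ones((8, 8))
bad = np.zeros((8, 8), dtype=bool)
bad[3:5, 3:5] = True
raw = np.full((8, 8), 10.0)
f1 = flat_correct(raw, flat, bad)
f2 = flat_correct(raw, flat, bad)
combined = combine_dithers([f1, f2], [(0, 0), (0, 3)])
```

In the demo, the second exposure is dithered 3 pixels in x, so the blob lands on different sky positions in the two frames and `nanmedian` recovers a value everywhere.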

A few notes:

  • These flats were taken on the ground before launch, so nothing you see in them is due to exposure to space.
  • Since the bad pixels are on NIRCam, they won't show up in images from other instruments on JWST (although they, too, may have similar regions of bad pixels).
  • The micrometeoroid impact was on mirror segment C3 and isn't responsible for the blob.

I'm not positive what the speckles are. Some could be due to cosmic-ray events, while others could simply be individual bad pixels (many bad pixels show up individually, rather than in blobs like the ones we see in A1 and B2).
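Cosmic-ray hits land on a different detector pixel in each exposure, which is why an outlier-rejecting stack of dithered exposures removes them (as the "snowballs" documentation quoted in the comments notes, four or more dithers let the pipeline reject them). A hedged NumPy sketch of that idea, using a simple sigma clip against the per-pixel median rather than the pipeline's actual algorithm:

```python
import numpy as np

def reject_outliers(frames, nsigma=5.0):
    """Stack aligned exposures, masking pixels that deviate strongly.

    At any given sky position, a cosmic-ray hit appears in (at most) one
    frame, so it stands out against the per-pixel median of the stack and
    is excluded from the final mean.
    """
    stack = np.stack(frames)              # shape (n_frames, ny, nx)
    med = np.median(stack, axis=0)
    # Robust scatter estimate from the median absolute deviation
    mad = np.median(np.abs(stack - med), axis=0)
    sigma = 1.4826 * mad + 1e-9           # avoid division by zero
    outliers = np.abs(stack - med) > nsigma * sigma
    clean = np.where(outliers, np.nan, stack)
    return np.nanmean(clean, axis=0)

# Demo: one frame out of four carries a simulated cosmic-ray hit.
frames = [np.full((5, 5), 100.0) for _ in range(4)]
frames[1][2, 2] = 5000.0
result = reject_outliers(frames)
```

The hit at pixel (2, 2) is flagged as an outlier and the remaining three exposures supply the value there.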

  • Now, I wonder why, in a multibillion-dollar project that took decades to finish, they delivered such a crappy camera in the telescope, when they even knew that beforehand from their ground tests. However, this is a question for another day. – Commented Jul 18, 2022 at 19:10
  • @VictorStafusa, if you think that's bad, you should see the flat-field image for your cell-phone camera. – Mark, Jul 18, 2022 at 23:53
  • From jwst-docs.stsci.edu/jwst-near-infrared-camera/…, about "snowballs": "Extreme cosmic ray impacts introduce large artifacts in near-IR detectors named snowballs that are not currently corrected by the pipeline. Most snowballs appear round. Some are elongated. Some have long tails or streaks. None of these features are properly flagged as cosmic rays by the current pipeline. Four or more dithered exposures enable the pipeline to reject these features as outliers." – Commented Jul 19, 2022 at 7:23
  • The "speckles" don't correspond to the dark red region itself, but to the green pixels inside the red area. – asdfex, Jul 19, 2022 at 8:07
  • @VictorStafusa You choose the detector manufacturing process to optimize the scientific return. Sometimes the price of getting a bunch of really good pixels is a few ugly cosmetic defects. Those good pixels deliver the science, while the defects don't get in the way very much. – John Doty, Jul 19, 2022 at 15:26
---

From what I've read, it comes from the fine guidance sensor. Some objects are so luminous that, much as in certain images of our Sun, the bright source has to be blotted out with similar technology so that we can "see" beyond the luminous image.

As for the pixels, my educated hypothesis would be that it's simply artifacting from the sensor, but if I learn differently I will amend this answer.

---

The circle at the centre of Europa looks exactly like an overload/saturation effect known as "black sun", which is not actually due to defective pixels, just pixels that have been overexposed. Other people have answered about the other large-area defect.

