13
\$\begingroup\$

Typical consumer cameras capture wavelengths of roughly 390-700 nm, even though the silicon sensors themselves respond to about 400-1050 nm. But why is it so difficult and expensive to produce cameras for infrared, ultraviolet, hard X-rays, etc.? The only things that differ are the wavelength and the photon energy in eV.

\$\endgroup\$
5
  • \$\begingroup\$ They lose too many of them in the dark. (Actually, the optics are one reason. Glass or clear plastic makes good lenses in the visible spectrum, but, e.g., ordinary glass and plastic are opaque to IR, and in the far-IR region the best optic material is crystalline NaCl, but NaCl has the unfortunate habit of dissolving in humid air.) \$\endgroup\$
    – Hot Licks
    Commented Sep 8, 2014 at 19:32
  • 2
    \$\begingroup\$ BTW, ordinary digital cameras are sensitive a little way into the infrared. Try pointing your phone's camera at the transmitter end of a TV remote, and you'll see a distinctive pink or purple color. Sometimes a camera will see this same color in sunlight reflected off a shiny surface, but the human eye doesn't see it. \$\endgroup\$ Commented Sep 8, 2014 at 22:40
  • \$\begingroup\$ It looks like there are several converted cameras sold as 'ghost hunter' cameras in the $300-$400 range. When you take it all into consideration, they actually seem reasonably priced, considering the cameras need to be a type that allows the mods to be done in the first place (~$200). That's still well below the cost of any 'professional' camera (>$1000 for the body alone). \$\endgroup\$
    – krowe
    Commented Sep 8, 2014 at 23:26
  • \$\begingroup\$ Check out the NASA page for Swift's Burst Alert Telescope (BAT) and look how strange telescope designs get when you want to detect very high energy photons (15-150 keV, hard X- or gamma rays depending on who you ask). \$\endgroup\$
    – Nick T
    Commented Sep 8, 2014 at 23:38
  • \$\begingroup\$ @HotLicks Incorrect. Until 2007 Kodak made and sold high-speed infrared (HIR) film for 35 mm that was usable with standard lenses and filters (glass or plastic); it was sensitive from below 700 nm up to 900 nm. Ilford, Efke and Rollei also make/made IR film for photographic use in regular film cameras. \$\endgroup\$
    – mctylr
    Commented Sep 9, 2014 at 14:14

5 Answers

22
\$\begingroup\$

It comes down to market size. Where is the demand for such cameras, and does the number of sales justify the production set-up costs? You can get an infrared conversion for standard DSLR cameras (e.g. Do It Yourself Digital Infrared Camera Modification Tutorials), and you can convert the camera to a 'full spectrum' type which takes in some ultraviolet (see Full-spectrum photography). For shorter wavelengths you'll need different sensors. These, by their specialist nature and low-volume production, tend to be very expensive.

\$\endgroup\$
3
  • 2
    \$\begingroup\$ To add to this, consider the price of similar sensors with and without a Bayer filter. Sensors without a Bayer filter are much more expensive, despite the fact that adding a Bayer filter is an extra manufacturing step. Likewise, camera lenses without the standard coating that blocks UV are much more expensive. It's all about the market size. \$\endgroup\$ Commented Sep 8, 2014 at 19:18
  • \$\begingroup\$ Where is the demand? I guess about 50% of camera buyers would buy one with even a slight X-ray capability. I mean, there's no need to see through thick walls. I guess good imaging through ladieswear at no extra cost would make it a best seller. \$\endgroup\$
    – user136077
    Commented May 22, 2021 at 22:08
  • \$\begingroup\$ @user287001 you would also need an x-ray source for that, which would have more than just invasion-of-privacy issues \$\endgroup\$
    – Frog
    Commented May 23, 2021 at 0:47
9
\$\begingroup\$

First of all: standard CCD sensors are sensitive to wavelengths well beyond 700 nm. As far as I know, Si sensors are even more sensitive to near-IR light than to visible light.

Of course this changes for much longer wavelengths: one condition for light to be detectable is that the photons have enough energy to create an electron-hole pair. This energy threshold is the band gap of the particular semiconductor material (e.g. for Si: ~1.1 eV). Since photon energy is inversely proportional to wavelength (\$E = hc/\lambda\$), there is a maximum wavelength that can be detected with a given semiconductor material (e.g. for Si: ~1100 nm).
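To make that cutoff concrete, here is a minimal sketch (Python; the band-gap values below are approximate room-temperature textbook figures, not taken from this answer):

```python
# lambda_max = h*c / E_gap: a photon with less energy than the band gap
# cannot create an electron-hole pair, so longer wavelengths go undetected.

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

def cutoff_nm(band_gap_ev):
    """Longest detectable wavelength (nm) for a given band gap (eV)."""
    return H * C / (band_gap_ev * EV) * 1e9

# Approximate room-temperature band gaps, in eV (illustrative values).
for material, eg in [("Si", 1.12), ("InGaAs", 0.75), ("InSb", 0.17)]:
    print(f"{material:7s} Eg = {eg:.2f} eV -> cutoff ~ {cutoff_nm(eg):.0f} nm")
```

For Si this gives roughly 1100 nm, matching the figure above; for InGaAs it comes out around 1.7 µm, which is why that material shows up in other answers here for NIR work.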

For cameras the lens is also relevant: Most types of glass are less transparent to UV light. Lenses optimized for UV transparency are very expensive (although a cheap alternative could be plastic lenses).

\$\endgroup\$
8
\$\begingroup\$

Both existing answers are valid and can be taken in combination: simple Si sensors are good for visible and NIR, and are common and therefore cheap. Modifications to the imaging system are required in many cases, as the IR is normally blocked because it's undesirable in ordinary photography. See for example Canon's EOS 20Da.

Silicon sensors are fairly easily adapted to UV use by means of a phosphor coating (I wanted to try a homebrew version of this on a webcam I'd modded with a B+W CCD, but never got the chance). Even X-ray use is possible with a scintillator (which is normally fibre-optic-coupled).

To go beyond ~1 µm further into the IR requires other semiconductors, which are expensive. InGaAs is a popular choice, but it is ridiculously expensive, as you say; that's not surprising, as you need dedicated production facilities. InGaAs and other NIR cameras are also regarded as military technology for the purposes of US export regulations (which in effect are also imposed on many NATO countries); this adds compliance costs for the camera manufacturer.

Cameras which have any sensitivity at all to thermal radiation, or which are made from narrow-bandgap semiconductors, need significant cooling to remove thermal noise that could otherwise swamp the signal you're trying to measure. That often means a Dewar of liquid nitrogen (material cost + operating cost). Newer technologies (even uncooled ones) are coming onto the market, in particular for thermal imaging, but the resolution is much lower than for Si CCD or CMOS sensors.
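A rough sketch of why narrow-gap detectors need that cooling: thermal generation of carriers scales roughly as exp(-Eg/2kT) (the factor of 2 is the usual generation-recombination approximation; the materials and temperatures below are my own illustrative choices, not from the answer):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def relative_dark_rate(eg_ev, temp_k):
    """Relative thermal carrier generation, ~ exp(-Eg / (2*k*T)), ignoring material prefactors."""
    return math.exp(-eg_ev / (2 * K_B * temp_k))

# Compare wide-gap Si at room temperature with narrow-gap InSb, warm and cooled.
for label, eg, t in [("Si   @ 300 K", 1.12, 300.0),
                     ("InSb @ 300 K", 0.17, 300.0),
                     ("InSb @  77 K", 0.17, 77.0)]:
    print(f"{label}: relative dark-generation factor ~ {relative_dark_rate(eg, t):.1e}")
```

The point isn't the absolute numbers but the ratio: cooling a narrow-gap detector from room temperature to 77 K buys several orders of magnitude less self-generated signal.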

\$\endgroup\$
2
  • 2
    \$\begingroup\$ Your information is a little dated. Bolometer-type thermal imagers with VGA resolutions (640x480 and up) are becoming more and more available, and the prices are dropping. They can be cooled or uncooled, with the coolers being either Peltier devices or small motor-driven refrigerators. \$\endgroup\$
    – Dave Tweed
    Commented Sep 8, 2014 at 13:54
  • \$\begingroup\$ @DaveTweed updated, thanks. I hadn't seen any over about 160x120; as my experience here is mostly Si and InGaAs, it's not surprising I was a little behind the times. \$\endgroup\$
    – Chris H
    Commented Sep 8, 2014 at 15:02
5
\$\begingroup\$

For both visible sensors and bolometer-type thermal sensors, the reason they are cheap is that they can leverage the economies of scale of the silicon business.

As soon as you get out to wavelengths (i.e. energies) that need other technologies (InGaAs as mentioned, or InSb), you're talking 2" and 3" wafers at best, nothing like the pizza-sized silicon wafers used to make chips today. Also, the transistors still have to be made of silicon, so you need a connection from each photodetector on the photosensitive chip to the detection circuit for that pixel on a silicon chip. If you have a megapixel imaging array, you have a million connections to make.

But wait, it gets worse. If you are depending on the photoelectric effect, say for mid-wave IR at 3-5 µm, you have to cool the camera so that you are seeing something more than the heat generated by the camera itself! Imagine a visible-light camera with a brightly glowing lens and housing -- that's the world a thermal camera lives in. Cooling adds a lot of expense, and usually noise as well, since the most power-efficient coolers are of the refrigerator type. Peltiers can't take you down to liquid-nitrogen temperatures.
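To put a number on the "brightly glowing lens and housing": Wien's displacement law shows where objects at everyday temperatures radiate most strongly (a minimal sketch; the example temperatures are my own, not from the answer):

```python
WIEN_B = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength_um(temp_k):
    """Wavelength (micrometres) at which a blackbody at temp_k radiates most strongly."""
    return WIEN_B / temp_k * 1e6

for label, t in [("camera housing, ~300 K", 300.0),
                 ("cooled detector, ~77 K", 77.0),
                 ("the Sun, ~5800 K", 5800.0)]:
    print(f"{label}: peak emission ~ {peak_wavelength_um(t):.1f} um")
```

A 300 K camera body peaks near 10 µm and still emits plenty in the 3-5 µm band, which is exactly where the detector is trying to image.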

Oh, and BTW, glass is not transparent to wavelengths beyond about 2 µm, so you need a different lens material from what the last five centuries of optics have been working with.

At the other end of the spectrum, X-rays are a pain because they are hard to deflect; they like to go right through things. Big imaging arrays for medical X-rays work because there is no lens, but take a look at the mirrors on something like the Chandra space telescope: the "lens" is a series of glancing-angle mirrors arranged in cones.
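The reason only glancing angles work is that the refractive index of any material at X-ray energies is only barely below 1, so total external reflection happens only below a tiny critical angle. A rough sketch using the free-electron approximation (it ignores absorption and anomalous dispersion; the gold-coating parameters are standard handbook values I've assumed, not something stated in the answer):

```python
import math

R_E = 2.818e-15     # classical electron radius, m
N_A = 6.022e23      # Avogadro's number, 1/mol
HC_KEV_NM = 1.2398  # h*c, in keV*nm

def critical_angle_deg(energy_kev, density_g_cm3, z, a):
    """Grazing-incidence critical angle (degrees): theta_c = sqrt(2*delta),
    with delta = r_e * lambda^2 * n_e / (2*pi) from the free-electron model."""
    wavelength_m = HC_KEV_NM / energy_kev * 1e-9
    n_e = density_g_cm3 * 1e6 * N_A * z / a   # electrons per m^3
    delta = R_E * wavelength_m**2 * n_e / (2 * math.pi)
    return math.degrees(math.sqrt(2 * delta))

# Gold mirror coating: rho ~ 19.3 g/cm^3, Z = 79, A ~ 197.
for e_kev in (1.0, 5.0, 10.0):
    theta = critical_angle_deg(e_kev, 19.3, 79, 197)
    print(f"{e_kev:4.1f} keV: critical angle ~ {theta:.2f} degrees")
```

At hard-X-ray energies the result comes out well under a degree, which is why the mirrors end up as long, nearly parallel nested cones rather than anything resembling a lens.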

\$\endgroup\$
0
\$\begingroup\$

Visible-light cameras cover a wide band, while specialized cameras are often narrow-band, sometimes with passbands as narrow as 3 nm of wavelength deviation. Such narrow-band components are indeed harder to produce and demand appropriate band filtering, which further increases the overall cost of the technology.

\$\endgroup\$
